feature request: force install from cache. #2568
Hm. This used to work as |
Oh, you can also set the |
thanks! |
Is no-registry going to be fixed? cache-min still appears to ping the registry to ask what the times are, and that's undesirable. |
Ack, indeed, this was broken with the npm-registry-client refactor. Fixed on 81fa33a. |
Cool thanks :-) |
Also, --no-registry will be fixed on the next npm version (included in node 0.8.3 and 0.9.0) |
There is no mention of |
This is still broken. |
Any word on the status of this flag? |
by the way, this works with npmd if you use |
I've seen this fail using a command like this:
But this succeeds:
That's perfect for my use case but I don't know if it's how it's intended to work. This was with version 1.2.18. |
@grahamlyons you are using quite an old version of npm. yes, installing from a file path needs to point to a directory. |
@grahamlyons in npm v1.3.17, the following fails now: As does: |
This is in my notes from JSConf EU as something we are worse at than bower. We should make it more reliable to install while offline. But, probably as part of the big ol' cache refactor. |
@domenic I'll help if I can :) (I'm trying to refactor @isaacs Btw--

So, for our friends who might be googling this in the future:

Install npm dependencies from cache (i.e. offline, no connection)

CLI

```sh
npm install ejs sails-disk optimist grunt --cache-min 999999999
```

Programmatically

```js
// Load npm programmatically with a very high cache-min so cached
// package data is treated as fresh and the registry is not contacted.
var npm = require('npm');

var modules = ['ejs', 'grunt', 'sails-disk', 'optimist'];

npm.load({
  loglevel: 'silent',
  'cache-min': 999999999999
}, function (err) {
  if (err) throw err;

  console.time('npm install');
  npm.commands.install(modules, function (err, data) {
    if (err) throw err;
    console.timeEnd('npm install');
  });
});
```
|
So I ran into one more problem-- including https://npmjs.org/package/enpeem It's a very thin wrapper around using Of course, the jackpot would be if npm could be |
Would like this feature for |
An |
In my case, setting cache-min had no effect. A locally installed karma was causing |
I've tried to advocate the use of |
It would save a lot of time if |
Fixes: #2568
Fixes: #2649
Fixes: #3141
Fixes: #4042
Fixes: #4652
Fixes: #5357
Fixes: #5509
Fixes: #5622
Fixes: #5941

All fetching-related networking is now done through pacote, and the old cache has been entirely replaced by a cacache-based one.

Features:

* npm now supports a variety of hash algorithms for tarball storage. On registries that support it, npm is able to use sha512sum for verification.
* An `integrity` field has been added to `npm-shrinkwrap.json`.
* Package integrity will be fully verified on both cache insert and extraction -- if npm installs something, it's going to be exactly what you downloaded, byte-for-byte, or it will fail.
* If `npm-shrinkwrap.json` is used, npm will bypass checking package manifests and go straight to the tarball, fetching it by content address if locally cached.
* Checksum integrity failures will now retry downloading on error, instead of failing on a single check.
* A new npm command, `npm cache verify`, can now be used to verify and garbage collect your local cache.
* npm now supports arbitrarily large tarball downloads: tarballs will no longer be loaded entirely into memory before extraction.
* Packages whose names only differ in casing, and packages from different sources/registries/etc., will now correctly be cached separately from each other.
* Some performance improvements.
* Improved fetch retry logic will try harder to download your packages.

BREAKING CHANGE: many shrinkwrap and cache-related things have changed.

* Previously-created caches will no longer be used. They will be left in place, but data will need to be re-cached. There is no facility for rebuilding a cache based on an existing one.
* `npm cache ls` has been removed for now.
* `npm cache rm` now always removes the entire cache. There is no granular removal available for now.
* git dependencies can now use semver resolution using `#semver:^1.2.3`.
* `--cache-min` and `--cache-max` have been deprecated. Use `--offline`, `--prefer-offline`, and `--prefer-online` instead. `--cache-min=9999+` and `--cache-max=0` have been aliased to `--prefer-offline` and `--prefer-online`, respectively.
* npm will now obey HTTP caching headers sent from registries and other remote HTTP hosts, and will use standard HTTP caching rules for its local cache.
* `prepublishOnly` now runs *before* packing the tarball.
* npm no longer supports node@<4.
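For readers skimming the changelog above, here is a minimal sketch of how the cache-related pieces fit together on npm 5. The package names and git URL are purely illustrative examples, not anything from this thread:

```sh
# The deprecated cache-min/cache-max flags map onto the new network modes:
#   --cache-min=999999  ->  --prefer-offline
#   --cache-max=0       ->  --prefer-online
npm install lodash --prefer-offline

# Verify and garbage-collect the local content-addressable cache.
npm cache verify

# Git dependencies can now resolve by semver range (illustrative URL).
npm install 'git+https://github.com/npm/node-semver.git#semver:^5.0.0'
```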
npm5 supports this now through the `--offline` flag, which will error if the stuff you were trying to install isn't already in the cache. You can also use `--prefer-offline` to use the cache as much as possible, and hit the network only when a dependency cannot otherwise be fulfilled. |
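If you want this behavior by default rather than per command, the flags correspond to npm config keys, so (as a sketch, assuming npm 5+) they can also be set persistently:

```sh
# Per invocation:
npm install --offline          # fail if anything is missing from the cache
npm install --prefer-offline   # use the cache first, fall back to the network

# Or persistently, via npm config / .npmrc:
npm config set prefer-offline true
```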
that was a long and crazy ride, thanks for the update @zkat ❤️ |
Question: does this now work like Maven, where it compares the checksum of the version in the local cache against the checksum in the repository before wasting any time refreshing the cached copy? For example, if the version hasn't changed yet the checksums differ, it would download a fresh copy, but if both the versions and the checksums are the same it would avoid the download. If the versions were different, I would expect it to download a fresh copy. This is important when you are doing remote builds over a slow connection, because there's no point in downloading if the checksums match.
Thanks,
Dan
|
@danshome if you have a package-lock.json, yes. npm will not touch the network at all if you have a |
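As a concrete sketch of that workflow on npm 5: one online install produces `package-lock.json` and populates the cache, after which a reinstall can run entirely from the cache:

```sh
# One online install writes package-lock.json and fills the cache.
npm install

# Later, with no network: reinstall purely from the cache.
rm -rf node_modules
npm install --offline
```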
@zkat why is |
@tommedema because I think you're misunderstanding how

Again, if you've previously downloaded the package, and the previously downloaded version is also the latest version, you don't download anything at all.

What

Tarballs are only ever downloaded once if they're already cached. We don't even do 304 checks for those if we have integrity information for them. And yes, this is the case even if you don't have a

So to answer your question in bullet points:
|
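One way to observe this behavior yourself (a sketch; the exact log lines vary across npm versions, and `lodash` is just an example package) is to run an install with HTTP-level logging and a warm cache:

```sh
# With a warm cache, --prefer-offline should produce few or no
# registry requests in the HTTP log output.
npm install lodash --prefer-offline --loglevel=http
```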
And for the sake of it, I want to clarify the point of |
@zkat that makes perfect sense, thanks for the elaborate explanation. Much appreciated |
Option --no-registry seems to have been deprecated or even unsupported for ages, while --offline fixed the problem in the install task.

The issue can be reproduced using:

```sh
devtool add "npm://registry.npmjs.org;name=epoll;version=latest"
bitbake epoll
```

```
| DEBUG: Executing shell function do_install
(...)
| npm ERR! argv ".../node" ".../npm" "install" (...) "--production" "--no-registry"
| npm ERR! node v6.11.0
| npm ERR! npm v3.10.10
| npm ERR! registry URL is required
```

And also from the log file ".../epoll/1.0.0-r0/npmpkg/npm-debug.log":

```
silly mapToRegistry using default registry
41 silly mapToRegistry registry null
42 verbose stack AssertionError: registry URL is required
42 verbose stack at Conf.getCredentialsByURI (.../get-credentials-by-uri.js:8:3)
```

More relevant insights: npm/npm#2568

Signed-off-by: Philippe Coval <philippe.coval@osg.samsung.com>
I love the explanations and the described behavior. I am of course here because it is not what I observe.
And it does the same big, slow download when I repeat the command, with or without `--prefer-offline`. Is there a setting that I missed to enable this awesome caching behavior that I have desperately wanted for years? |
In the case of Puppeteer, you'd have to pass PUPPETEER_SKIP_CHROMIUM_DOWNLOAD as an environment variable (either setting it globally or for that particular project).
This is noted in their install.js file here: https://github.com/GoogleChrome/puppeteer/blob/master/install.js#L19
What would be nice is if Puppeteer's installation script had a way of observing / respecting this when it was run.
|
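For example (a sketch: any non-empty value works for the skip variable, per the Puppeteer install.js linked above):

```sh
# Skip Puppeteer's Chromium download for a single install...
PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true npm install puppeteer --prefer-offline

# ...or export it once for the whole shell session.
export PUPPETEER_SKIP_CHROMIUM_DOWNLOAD=true
```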
Just highlighting @ackalker, that's an issue of both Puppeteer not respecting/checking for NPM/Yarn offline checks and you potentially not checking the docs for Puppeteer ;) https://github.com/GoogleChrome/puppeteer/blob/master/docs/api.md#environment-variables |
:) Thank you. I did not check the docs well enough. I really wanted the npm behavior for all things installed with npm ;) (where Puppeteer would check for a new version of Chrome and pull it only if it's newer than the cached version). It doesn't look like the environment variable offers that kind of flexibility; it's all or nothing. Terribly sorry to bother you with my own failure to read (on a closed issue, no less). |
@acklenx puppeteer itself can hook into this behavior by checking for |
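(Presumably the hook referred to here is npm's practice of exposing its config to lifecycle scripts as `npm_config_*` environment variables. A hypothetical install-script guard could look like the sketch below; this is not Puppeteer's actual code, and the exact values npm sets may vary by version.)

```sh
#!/bin/sh
# Hypothetical install-script guard: skip a large external download when
# npm is running in offline mode or the package-specific opt-out is set.
if [ "$npm_config_offline" = "true" ] || [ -n "$PUPPETEER_SKIP_CHROMIUM_DOWNLOAD" ]; then
  echo "offline install detected, skipping browser download"
  exit 0
fi
# ...otherwise fall through to the normal download step.
```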
Is there a way to configure npm to use some packages from the cache and some packages from the registry? Looking at this for local development instead of |
would behave as if there is no network connection.
This would be very useful when the connection is slow,
and you know the modules you are installing well.
It would also create the possibility for a background process that tails npm
and keeps your cache up to date, making installs super fast!
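That last idea can be approximated by hand today. A rough sketch, assuming npm's `cache add` subcommand and a hand-maintained list of example packages, run periodically (e.g. from cron):

```sh
# Re-fetch the latest tarballs for a known list of packages so they are
# already in the local cache when an offline install is needed.
for pkg in ejs grunt sails-disk optimist; do
  npm cache add "$pkg"
done
```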