
Work item 216 (Closed)

Add an Offline option or smarter cache

description

When NuGet.org was offline today I was in a bad spot. I ended up using the local cache, but I doubt that's easy to figure out for the average Joe. Why not include a fallback option, an "-offline" switch, or both?

file attachments

comments

Like the idea. What about letting the user choose where the cache is placed, so I could put it in Dropbox and have it available across dev machines? Don't know if it's possible, but I needed a package I knew was on my home machine, but not on my new work
laptop, so I was pretty stuck and ended up downloading the DLLs manually. Did not like that ONE BIT! :)

Why is it necessary to rely on nuget.org at all for anything besides updates or new packages? NuGet uses the cache now, presumably to avoid re-downloading a package it already has. I assume it's still hitting nuget.org to figure out that when I say I need
SomePackage 1.3.5 it needs SomePackage.1.3.5.nupkg, so why not have a built-in behaviour that looks at the local cache first? This would reduce load on nuget.org and at the same time give offline support.

In a way I'm suggesting that the local cache folder should always be used as the primary package source (without requiring explicit configuration). This means that even if your source is an internal server, network share, or other private source you still get
offline support.

I develop on disconnected/'never connected' networks frequently (luckily I get a reprieve, but I know many who do not.) Having a way to work offline is critical for inclusion of development technologies on those projects. There will never be a time that
I am connected in those development environments. It may not be optimal in today's 'implicitly expected connected' world, but I soldier up to the plate and deal. Sadly a lot of good tech gets passed over. Imagine life without NuGet... Now please consider raising
the impact level a pinch (not just because I asked nicely!)

I work for the government. (Boo! Hiss!) Because of this, our development network is NOT connected (in any way) to the outside world. The assumption that everyone is 'always online' is a fallacy; I'd even argue the point in the civilian world. Just because
something can be online does not mean it always will be. I recognize that this is a problem throughout the industry, but it is a significantly more serious issue for developers evangelizing the Microsoft stack in a closed environment. Please consider
making this a priority. Thank you!

This definitely deserves some attention. I wouldn't say it needs to be a fallback or an option so much as the cache needs to be the first place checked if a version number is specified. On a slightly related topic, workitem 1614 needs some love too:
http://nuget.codeplex.com/workitem/1614

At my current client site, accessing nuget.org can be rather slow at times and there are even occasional network outages. Being able to use NuGet in an offline mode would be really useful in that context.

I agree with @gregmac. I also think NuGet deserves a server solution that could handle situations like "I have a local NuGet server to share enterprise-wide packages", "the server retains all
the packages we use, so even if a package disappears from nuget.org, we have it internally", and "the server works as a proxy, and if a new package is downloaded, the server retains it".

The same reason Visual Studio gives me a choice of local vs. online Help (or a combination of both with the ability to choose which one to try first) is the reason why Visual Studio should provide me a choice of local (which I assume means the cache)
vs. online NuGet via an EXPLICIT, VISIBLE way to choose (or a combination of both with the ability to choose which one to try first). Keep things consistent.

I agree with @gregmac. The only time you should rely on the main feed is when installing or updating packages. And this should be an
explicit action. If you have a CI build server and you are using Package Restore, you
should have a local package source. Your build process should not depend on an external repository. As we saw, this caused a lot of problems.

As another user who works on an Internet-disconnected network, it is troubling to see Microsoft release some key updates (EF) only through NuGet. We create a local repository to get around this, but would like to see that as an integral function of the
program rather than requiring reference to rather obscure blog posts.

@simonmsm A cache within a local dir is a bad idea, because it won't help with build servers that do a clean and remove everything before building. There is also no point in caching the same version of the same package 10 times if you happen to have 10
products/versions checked out that all happen to use it.

I am in favor of an 'auto-failover' that engages the cached version, polls the NuGet server at interval x, notifies the user when the live NuGet is back online, and checks/verifies whether the requisite packages have an update.
Personally, my package build structure includes an archive folder that could just as well support what are essentially 'archive versions of NuGet packages'. A user-configurable location would be a good thing. Thanks for listening.

I too work on networks that are never connected to the outside world. Whilst NuGet looks interesting, we have not tried it as we believe it insists on an Internet connection. Adding offline support would allow this technology to be used by more developers.

@grhm, You can use NuGet without connecting to the Internet, you just need to set up your own private NuGet Server (see:
http://nuget.org/packages/NuGet.Server) and configure NuGet to only use this server as the source.
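
As a sketch, restricting NuGet to an internal server could look something like the following nuget.config fragment (the source name and URL are placeholders, not real endpoints):

```xml
<!-- Hypothetical nuget.config: replace the URL with your internal feed -->
<configuration>
  <packageSources>
    <!-- drop all inherited sources, including nuget.org -->
    <clear />
    <add key="InternalFeed" value="http://internal-nuget.example.local/nuget" />
  </packageSources>
</configuration>
```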

@derekgreer What is that server needed for? I'm simply using a UNC share for our internal packages, and that does everything I can imagine wanting a NuGet repository to do without an Internet connection.
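
For illustration, a UNC-share source needs nothing more than a package source entry like this (the share path is an example, not a real server):

```xml
<!-- Hypothetical nuget.config fragment pointing at a file share -->
<packageSources>
  <add key="TeamShare" value="\\fileserver\nuget-packages" />
</packageSources>
```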

@derekgreer wonderful. Could you point to the documentation? A casual peek at the NuGet doc page did not appear to speak to this scenario so am hoping you have details. Specifically, how do I download the packages from Microsoft and get them onto 'my'
server? I can't stand up a server and connect to the Internet at all, the network is closed. I have to actually move packages via DVD or some other medium. My entire point is that this should not be an issue. The assumption that
everyone is connected is quite the stretch.

Funny, I was just on a plane and was stuck today. When I landed I started to write Scott H a note to ask how to handle this, and then did a search and found his post. Guess my instincts were right on 1) who would know how to deal with it, and 2) it is a scenario
that should be better addressed.

When my local nuget gets it, it automatically stores it in all the passive caches in the chain (the first two in this example).

The hierarchy should be configurable at the project level, with optional defaults at the solution-level and machine-level.

Each cache can also optionally know about its parent source via a .nuget_source file: when we look in \\FileServer\DepartmentalNugetCache and don't find anything, we check for the existence of \\FileServer\DepartmentalNugetCache.nuget_source and find more
branches to go searching, which may or may not be the same as our locally configured hierarchy.

I just used package restore as an example. It should obviously work for adding new packages, too.
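
The chained lookup described above can be sketched in Python. This is only an illustration of the proposal: the `.nuget_source` convention, the function names, and the back-fill behavior are all taken from the comment, not from the real NuGet client.

```python
import os
import shutil

# Sibling file that lists a cache's parent sources, one path per line
# (a convention proposed in the comment above, not part of NuGet).
SOURCE_SUFFIX = ".nuget_source"

def find_package(package_file, caches, visited=None):
    """Search each cache directory for package_file; on a miss, follow any
    parent sources listed in the cache's <cache>.nuget_source file."""
    if visited is None:
        visited = set()
    for cache in caches:
        if cache in visited:
            continue  # guard against loops in the configured chain
        visited.add(cache)
        candidate = os.path.join(cache, package_file)
        if os.path.isfile(candidate):
            return candidate
        # Cache miss: check for a parent-source file naming more branches.
        parent_file = cache.rstrip("\\/") + SOURCE_SUFFIX
        if os.path.isfile(parent_file):
            with open(parent_file) as f:
                parents = [line.strip() for line in f if line.strip()]
            hit = find_package(package_file, parents, visited)
            if hit:
                return hit
    return None

def fetch(package_file, caches):
    """Resolve package_file through the chain; on a hit, back-fill every
    passive cache in the chain that is missing it."""
    hit = find_package(package_file, caches)
    if hit is None:
        return None
    for cache in caches:
        local = os.path.join(cache, package_file)
        if not os.path.isfile(local):
            os.makedirs(cache, exist_ok=True)
            shutil.copy(hit, local)
    return hit
```

With caches ordered project, machine, department, a package found at the department level would be copied into the project and machine caches on the way back, matching the "stores it in all the passive caches in the chain" behavior described earlier.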

I vote for both an -offline option (fail-over to whatever is in %LOCALAPPDATA%\NuGet\Cache), and persistent caching proxy support for a NuGet server (I believe there's already a case for this). I think those are separate features and both useful for different
reasons.
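
Pending a real -offline switch, one workaround in this spirit is to register the cache folder itself as a package source (a sketch; whether a given NuGet version expands the environment variable in config values should be verified):

```xml
<!-- Hypothetical nuget.config fragment: treat the local cache as a source -->
<packageSources>
  <add key="OfflineCache" value="%LOCALAPPDATA%\NuGet\Cache" />
</packageSources>
```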

Why let users intervene at all? I like this idea, but wish NuGet acted more like Ruby gems and bundler. When you request a gem, it first checks the local gem store for a copy. Only if the requested gem is not found there is it downloaded from the online
gem store.

Example: I should only download MyPackage-0.0.1 exactly one time. It is copied to my local cache and my project. I start a new project, and that also needs MyPackage-0.0.1. Nothing should be downloaded.

We already have a machine cache that is used during package restore (probably the most CI-breaking scenario) and is in fact treated as a first-class package repository. What wouldn't work is installing new packages, etc.

@sparrez - We already support custom locations for the package cache. It's not documented well enough, but you can specify a custom location by specifying an environment variable named "NuGetCachePath"
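
For instance, from a Windows command prompt (the destination path is a placeholder; the variable name comes from the comment above):

```bat
:: Redirect NuGet's package cache to a custom folder for the current user
setx NuGetCachePath "D:\NuGetCache"
```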

@gregmac - For new installs, we try our best to honor unlisting of packages or the worst case scenario of someone altering the nupkg on the server. For package restore, we do use the local cache.

We could certainly make it simpler by allowing the nuget.exe install command to install all dependencies too (it does not resolve dependencies today). Beyond that simply tinkering with the package source gets nearly all the asks in this thread.

This makes the local cache the primary source, followed by the official nuget.org site. I tested by blocking nuget.org without and then with this fix, but I haven't had it in place during an actual nuget.org failure yet. I didn't check, but I assume this also
means it doesn't hit nuget.org at all unless there's a package that hasn't been downloaded yet, which should speed up builds and especially help those of us using CI and building dozens of times a day -- not to mention reduce the traffic to nuget.org.

I have to say, I'm still surprised this isn't the default way NuGet works.

I think that NuGet packages should be available offline. There could be multiple levels of caching -- at the machine level, project level, department/unit level, organization level, and ultimately nuget.org. When a package is not found in a cache, the next
higher-level cache should be tried, and whenever a package is found, it should be cached at the lower levels along the way. Of course, one should be able to control the size of the cache at each level.
I have seen this concept implemented in Windows debuggers -- they call it daisy chaining of symbols.
Another approach could be like a WSUS server, wherein Windows Updates are downloaded from Microsoft and made available to all machines in the organization, so that each machine does not have to hit the Internet to get updates.

Limited support has been added to the next release of NuGet: the machine-level cache is now checked when other sources fail. Please note that the machine cache only holds the latest 200 packages, so it will help some offline install scenarios.

To give it a try, please download the VSIXes below and let us know your feedback. Thanks!