A tour of Windows Vista Beta 2

Originally posted by PhilipStorry: Using all memory for cache is a very, very dumb idea.

As another example... Mac OS X has a subsystem called the "Unified Buffer Cache". This is a cache shared (and supported) by the Virtual Memory system and the filesystem stack.

...at 20,000 feet...

As file data (data files or executables) is read in, it is placed in free physical RAM pages, if any exist. If none exist, the file data is placed into a page that is currently holding prior cached data, if any exists (using least-recently-used page selection). If none exists, then it may swap out an active page that wasn't used recently.

Over time this allows unused system RAM to cache file data so that any future read attempts can come from the cache and not from slow disk IO. If you watch a Mac OS X system, you will often see the number of "inactive" pages grow and the number of "free" pages drop (down to a limit on the order of tens of MiB); this is often a result of the UBC.

Now if an application comes along and needs more pages of memory for its own purposes, the VM will attempt to give it pages from the free pool if any are available (ones not currently caching UBC data or otherwise in use by other applications/processes). If not, then the VM system will grab pages from the UBC (this requires no disk IO; the data is just evicted from the UBC). Finally, if no free or inactive page can be found, it will swap out active pages as needed.
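The allocation order described above (free pages first, then LRU eviction from the cache, then swapping an active page) can be sketched in a few lines. This is purely illustrative; the names and structures below are invented and bear no relation to the actual XNU kernel code:

```python
from collections import OrderedDict

class PagePool:
    """Toy model of the free -> cached -> active allocation priority."""

    def __init__(self, total_pages):
        self.free = set(range(total_pages))  # pages holding nothing
        self.inactive = OrderedDict()        # page -> cached file data, LRU first
        self.active = OrderedDict()          # page -> owning process, LRU first

    def allocate(self):
        """Return (page, how), preferring the cheapest source of memory."""
        if self.free:
            return self.free.pop(), "free"               # cheapest: unused page
        if self.inactive:
            page, _ = self.inactive.popitem(last=False)  # evict LRU cache page;
            return page, "evicted-cache"                 # no disk IO needed
        page, _ = self.active.popitem(last=False)        # last resort: swap out
        return page, "swapped"                           # (this one costs disk IO)
```

The key point is that the middle step costs nothing but bookkeeping, which is why "all RAM in use" and "RAM instantly available to applications" are not contradictory.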

This allows physical RAM to be fully utilized for file caching while still allowing the system to supply physical RAM to applications as they need it... unused physical RAM is a waste.

I love seeing the UBC cache entire 2+ GiB files on my PMG5 with 8 GiB of RAM and then see future access to those files come in at physical RAM speeds.

Originally posted by gwguy: Gotta disagree about your assessment of Mail. I've been underwhelmed with most of the offerings in Vista, but Mail (while appearing very similar to OE) is truly a completely new mail client. Hell, just look at the way it stores the mail, for example.

BTW can anyone explain to me why Vista needs 700MB of RAM? Even turning all the pretty off I couldn't get it lower than 500...

Who cares how much RAM it is currently using? The question you should be asking is whether it's able to free up that RAM in an efficient manner when other applications request it. Besides, it should be using as much RAM as you can give it. If it's not, then it's wasting resources.

WAH????? no the os shouldn't be hogging 700MB of RAM just sitting there idle... that is CR@P....

it is proof positive that of the many things vista may or may not bring to the table...bloat is going to be right up at the top o' the list.

a) It's still Beta, so it probably has lots of debugging still in it.

b) If the machine is sitting there idle, who cares how much memory it is using as long as it gets out of the way when an application needs it? Your memory is using electricity anyway; you might as well be using it for something.

c) You obviously don't know much about modern OSs... they will use *all* available/unused memory for caching and the like, giving it up to applications as they need it. If I have a machine with 4G of memory, I expect near every single byte to be used for something all the time (caching, my applications, whatever) or the OS is NOT being smart. I'm paying for the electricity to store nothing? Put some cache in there or something so my user experience is faster, at least!

Ok I signed up just to talk about SuperFetch. I want to see if I can't explain why some people are not really grasping what this does.

It consumes almost as much memory as it can manage by pre-caching. This means keeping track of what you normally do and your favorite apps, and pre-loading them into memory, making them part of the file cache. Now, when you use one of those applications, launch time can be lightning fast, because you are not waiting on the disk.

Now if you need memory, you can just invalidate some of that cache. This is essentially a FREE operation. You can't compare the costs of giving up cached pages to the costs of fetching them off disk. It's many orders of magnitude more expensive to do one than the other. As someone else pointed out, it's not about how much RAM is used, it's about how easy it is to free it up.

That 700MB isn't all going to get written to the pagefile when you need more RAM (a very slow operation, and typically why people get nervous when they see high RAM usage). It's simply dropped, and goes away. Tada, freed memory.
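A toy sketch of that point (all names here are invented for illustration): reclaiming a *clean* cache page is pure bookkeeping, and a read-only prefetch cache like the one described holds nothing but clean pages, so giving memory back never touches the pagefile.

```python
class FileCache:
    """Toy read cache. Only a modified (dirty) page would cost any disk IO
    to release; a pure prefetch cache never has dirty pages."""

    def __init__(self):
        self.pages = {}  # file offset -> (data, dirty flag)

    def reclaim(self, n):
        """Hand n pages back to the memory manager; return writebacks needed."""
        writebacks = 0
        for offset in list(self.pages)[:n]:
            _, dirty = self.pages.pop(offset)
            if dirty:
                writebacks += 1  # only a *modified* page costs a disk write
        return writebacks
```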

The differences with using ReadyBoost are pretty dramatic when timed. Without ReadyBoost, it took my system 49.93 seconds to launch Half-Life 2: Lost Coast (from the Steam app to the Half-Life 2 menu), and with ReadyBoost (after thrashing the system), just 34.9 seconds. Both launches were conducted with the system under heavy load.

Then this was most likely a bad test. Why bother even publishing it if you can't control the load? And before you say, "oh, but the load was about equal", don't. You'd have to know what was happening at a very low level to claim the load of the OS affected startup times in the same way in both cases.

Well, well, well. It looks like MS is frantically copying OS X functionality (of which I'm not a particular fan). Yet again, Microsoft is missing the point. For example -- their copy of Exposé looked almost useless: you couldn't even see what was in the windows you were choosing between. Yes, it's a good idea to duplicate Exposé if you can't innovate, but try not to sacrifice the functionality when you copy it.

It looks like they did a reasonable job copying the Dock, though. It's interesting to see how impressed by this functionality people are.

And the priv escalation is broken in the same way it is broken in OS X: it will simply teach users to type their password when prompted.

I wonder if the GUI is still vulnerable to shatter attacks.

I did it under load on purpose... with 2GB of RAM, most things load quickly anyway.

It was controlled, as I simply loaded my machine up with the same running apps both times.

WAH????? no the os shouldn't be hogging 700MB of RAM just sitting there idle... that is CR@P....

it is proof positive that of the many things vista may or may not bring to the table...bloat is going to be right up at the top o' the list.

I have 1GB on my linux web server. At this moment, only 17MB is free. Roughly 500MB is being used by file cache and 200MB is being used in buffers. All of which can be shrunk on an application needed basis. I'm betting that Vista is doing something similar. And I bet OS X does the exact same thing. I'd say redleader nailed it on the head.

Now, I am curious why it actually needs 15GB of space for a fresh install. It won't keep me from upgrading, but I'd like to know what that space is being used for.

Edit: Must learn to read thread prior to responding to message on first page... looks like this has been covered.. :P

Some of my favorite comments here are from people who are trashing the look and usability of the UI based on the screenshots. I realize not everyone is going to like it, but basing anything on screenshots is just ridiculous. It's like evaluating gameplay based on screenshots. There's a certain "wow" factor, but it tells you NOTHING about how the game really works.

Also, as those who have been on The Intarnets for a long time are sure to remember, people ALWAYS complain about new UI elements when a new release rolls along... and yet everyone gets used to it, and then starts to love it, and the next time a UI update comes along, out come the flamethrowers.

Originally posted by Kalila: I'd hate to say this, but... PNGs please...

I prefer to use PNG over JPEG, but PNGs were kind of impractical for this article, at least in terms of file size for some of the bigger screenshots. When I'd save them as PNG-8s, the image quality dropped as the gradients were dithered. PNG-24 made for very nice-looking images, but also extremely huge in some cases.

No, I think you're absolutely right. On one hand, I rather like the idea of making toolbars "fade" into a translucent state when they haven't been clicked or resized in more than X number of minutes after an app is launched. (MS did this with Office 2004 for Mac, for example.)

On the other hand, making too many translucent windows leads to a cluttered desktop appearance. I remember trying that out in Linux for a while when "Enlightenment" was the latest craze - and it was highly distracting to try reading things like IRC text windows while seeing portions of other applications beneath them!

Originally posted by robotic_tourist: I may sound superficial to some, but I think the smoked-glass effect used everywhere could be detrimental to actual usage. I wouldn't mind translucency round the edges of windows, but it's bad when it starts mixing in and between controls, and especially underneath some functional areas but not others (I think the WMP11 screenshot showed the worst offences).

Variable translucency for different areas of the window would be good though, as long as there are no breaks between functional elements to draw your eyes to the gap between them instead of to either one or the other. Using the Windows Explorer screenshots as examples, I think it's wrong to have a translucent area between the main content area and the location bar.

The demo I saw worked like this: As a manufacturer, take a laptop and close the lid. Stick a small, *external* display on the backside of the lid. Attach it to a teeny, very-low-power device running the .NET Micro Framework that can access a bit of shared flash.

As a user, put your laptop to sleep and stick it in your bag.

Now every hour or so your laptop will wake up, check for email or anything else you instruct it to check for, and write the data to flash.

Whenever you want, pull your laptop out of the bag and look at the lid. The small display will show you how many emails you have (with a preview, maybe), weather, stocks, etc., all without having to start your laptop up.

I think it's a great idea. The component cost to add this to a laptop would be about $20. If you go with a version of the Micro Framework device that can be untethered and get wireless updates, that price probably doubles.

Anyway, that's what the Slideshow technology is.

BTW, I've been clamoring for that "add fonts" dialog to change for years and years. Now that I see they haven't changed it in Vista, I kind of want it to stay! It's a nice tie back to 3.1. (I think it even uses the oldest-of-the-old listbox styles where to select more than one item--without the mouse--you have to enter "multi-select" mode with Shift+F6 or F8 or something...why do I remember crap like this.)

Originally posted by pope master: You didn't show a screenshot of using search in the start menu! That's a crime!

It's so awesome to hit the Windows key, type in outlook, and hit enter to get there. Incredibly powerful and awesome.

In XP, you can hit Start+R, type outlook, hit enter, and open Outlook. Or do you mean bringing forward an already-running application?

In order to do that you have to know the executable name and it has to be in your path (or in some random Run registry key). With the search it will index all of the shortcuts in the start menu so you don't have to know the exact name or anything, just enough so that search can find it.

Another, really cool thing that my roommate noticed is that if you search for iTunes it brings up (in order) About iTunes and iTunes. He clicked on iTunes. Next time he did a search it was smart enough to put iTunes at the top above About iTunes so that he could simply hit enter. That's a really nice feature IMO for us keyboard people.
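Vista's actual ranking logic isn't public, but the adaptive behavior described above can be sketched as frequency-boosted ordering with an alphabetical tiebreak (names and signature here are made up for illustration):

```python
def rank_results(matches, launch_counts):
    """Order search matches by how often the user has actually launched
    each one (descending), breaking ties alphabetically.
    launch_counts is a name -> launch count dict."""
    return sorted(matches, key=lambda name: (-launch_counts.get(name, 0), name))
```

With no history, "About iTunes" sorts first alphabetically; after a single recorded launch of "iTunes", it jumps to the top, matching the behavior described.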

Originally posted by Lemurs: Some of my favorite comments here are from people who are trashing the look and usability of the UI based on the screenshots.

/me walks over to our test lab and plays with the vista beta (again)

Yup, some of the uses of transparency in Vista still suck (it sucked when Mac OS X did it as well), etc.

Ahh yes, of course. I was referring to you specifically, and not anyone else? Many people here have seen it, many have not, and yet they comment based on the screenshots. I have my own issues with the UI, but at least I work with it on a day-to-day basis.

In the real world, people who have serious issues with the transparencies will just turn up the intensity slider and get something a little more subtle and usable, while retaining Aero features.

I hope ReadyBoost & SuperFetch will have an option to use RAM instead of flash devices. I find the flash idea a bit weird. Say I want to stick in 4GB or more and just use 1GB for caching and drive acceleration; it would just reserve that gig for this purpose. Wouldn't that be a hell of a lot faster than using flash devices? And RAM is cheaper than flash devices as well. Also, I hope there will be a TweakUI version for Vista. I don't particularly like the sidebar and some of the eye candy.

If it did that then it would incorrectly deny certain file copy operations. There is literally only one way to accurately gauge whether a volume can successfully be the target of a copy operation, and that is to attempt the operation and see if it fails.

Originally posted by ferny: Does Vista check if there is sufficient space on a target drive before copying? Would be nice not to get the insufficient space message after 98% of the data has been copied over to the target drive.

I sure hope so. I also hope that it's smart enough that if I delete/move 2000 files and (due to permissions) it can't operate on one of them it still finishes the other 1999 files no matter where in the list that single inaccessible file is.

This is what happens when the producer of product knows that the quality of the product has almost no impact on whether or not the producer continues to make billions in profit per year on that product --

Originally posted by ferny: Does Vista check if there is sufficient space on a target drive before copying? Would be nice not to get the insufficient space message after 98% of the data has been copied over to the target drive.

How do you check for sufficient space? How does it address the issues of quotas, compression, or the varying amount of space that might be available as others use the disk (none of which are necessarily queryable by your system)?

And, frankly, how many people run into this issue? Most have huge swaths of HD space free.

a) Coming up with something that actually worked is non-obvious.
b) The number of people that would benefit from this is small.

Originally posted by ferny: Does Vista check if there is sufficient space on a target drive before copying? Would be nice not to get the insufficient space message after 98% of the data has been copied over to the target drive.

I sure hope so. I also hope that it's smart enough that if I delete/move 2000 files and (due to permissions) it can't operate on one of them it still finishes the other 1999 files no matter where in the list that single inaccessible file is.

I believe that it is. I've certainly run into the dialog that says something like "ignore, cancel, ..., repeat this decision for all other files". (I don't remember the exact wording, but you get the gist.)

There was no mention of the rumored screen-resolution independence as being in Beta 2, so I assume it will not be in the final, shipping version of Vista. Anyone have any insight on this?

I'm not talking about scalable or vector graphics or icons. I'm talking about having a document showing in Word at 100% and having 72 point type actually be 72 points tall (approximately one inch tall) on the screen. I'm talking about showing an 11 x 17 inch document actually be 11 x 17 inches in size if shown at 100%. This would all be independent of whether my screen has 90 pixels per inch or 120 pixels per inch.
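The request above boils down to one piece of arithmetic: scale by the display's true DPI instead of assuming the traditional 96 pixels per inch.

```python
def points_to_pixels(points, dpi):
    """Convert a physical size in points (72 pt = 1 inch) to pixels
    on a display with the given true dots-per-inch."""
    return points / 72.0 * dpi

# At the traditional 96 DPI, 72pt type is 96 pixels tall; on a 120 DPI
# display it must be drawn 120 pixels tall to remain one physical inch.
```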

Originally posted by I Palindrome I: With the release to developers of Windows Vista Beta 2, we thought it was a good time to take a look to see how Vista is coming along. Our overview of Vista Beta 2 will tell you what you need to know about Microsoft's next OS release.

Well, after reading this review, I think Microsoft is getting things right in some ways. Too bad WinFS got cut. Caching network drives would give MS a leg up on Spotlight. Although I have to say that using Spotlight built into every finder window on a network share in folders with 5,000 TIF images is almost instantaneous over a gigabit network connection.

You can index network locations with Windows Desktop Search. You did not need WinFS functionality to be able to do that.

quote:

The descriptions of how things work in this review make me think that Vista will make people more productive.

But what struck me as funny is how Microsoft designs all their applications (Mail - what's with that obvious rip-off of Apple's mail client's name? I'm surprised they didn't call it MailApp.) to look like Explorer windows.

That makes sense to me. "Outlook Express" implies it's related to Outlook - i.e., Exchange capabilities, calendaring, etc. I think the Outlook team would prefer to be able to use "Outlook Express" for some new product they're working on.

quote:

I mean, look at the WMP 11 interface. Some people describe it as "iTunes-like." Please! Steve Jobs would quit and move into the retirement home if he ever allowed them to make such an ugly interface!

You do realize that there have been many people who say that WMP looks better than iTunes. Again, beauty is in the eye of the beholder here.

Originally posted by Zak: I hope ReadyBoost & SuperFetch will have an option to use RAM instead of flash devices. I find the flash idea a bit weird. Say I want to stick in 4GB or more and just use 1GB for caching and drive acceleration; it would just reserve that gig for this purpose. Wouldn't that be a hell of a lot faster than using flash devices? And RAM is cheaper than flash devices as well. Also, I hope there will be a TweakUI version for Vista. I don't particularly like the sidebar and some of the eye candy.

On the other hand, making too many translucent windows leads to a cluttered desktop appearance. I remember trying that out in Linux for a while when "Enlightenment" was the latest craze - and it was highly distracting to try reading things like IRC text windows while seeing portions of other applications beneath them!

Little detail: what you describe is transparency, which I also tried when it was all the rage, and it indeed sucks. What is behind the window gets way too distracting. What Vista does is translucency, which is not the same thing; it's transparency with some sort of blur effect on it (an unflattering comparison would be the difference between regular windows and toilet/shower windows). Translucency does not distract you; in fact, it just sort of fades away in a blur in your peripheral vision. Which is nice. That's at least my practical experience of using Vista since December on my main work machine.

You do realize that there have been many people who say that WMP looks better than iTunes. Again, beauty is in the eye of the beholder here.

I agree. I've said this in the BF before, but Vista is shaping up to be a sweet upgrade to Windows - and I say this as a Mac user.

Putting aside the under-the-hood changes (which are numerous), the GUI is finally looking cohesive. The semi-transparent borders around each window look good, as does the curved nature of them, etc., as well as the dumping of the garish colour scheme of XP (which helps those of us who need the interface to not be right in our faces).

Originally posted by PhilipStorry: There is no way on earth that the 1.5Gb of RAM in my laptop could cache all of those usage scenarios. There is also the minor fact that loading 1.5Gb of crud into RAM just so that it can be thrown out to make room for data is a really, really moronic thing to do because it's more likely to be causing delays than helping anything.

Why would it cause delays? A write cache certainly could, since it would have to be flushed, but you're assuming the cache isn't working; if it has cached writes for you, clearly it is working. And anyway, the delay will happen regardless with a write cache, since you'd have to wait for the write to complete anyway if you didn't cache. But in this case the OS doesn't have to erase anything. It can just clear a PTE and obliterate huge swaths of RAM in a single clock cycle.

Originally posted by PhilipStorry:

Sometimes, a bigger cache is a PITA - for all kinds of reasons, that we don't have time to go into here.

Hell merely booting Linux on this box shoots its memory usage up to around 1.5 GB from the boot procedure.

Strange. After I boot my Linux system and check the memory consumption, it's around 150-200 megs, IIRC. And that's including the buffers and cache. Of course it goes up once I start actually doing something, but it most certainly is NOT using 1+GB of RAM after it has finished booting.

You're right in that purely during boot, the memory usage is nowhere near that high. But all of the services start way before X does, at least on my machine. Services like MySQL, Apache, Mythbackend, beagle, and a couple others, and a few of those (mysql and myth) touch very large files. So it's not really getting there from boot, but the services that start at boot (and I'd suspect that the main memory drain comes from mythbackend hammering mysql right off the bat).

The point was merely that an OS is supposed to use every damn byte of RAM it has. It takes the VM no time whatsoever to toss out a page of disk cache to make room for a program. There's absolutely no reason why a kernel should not be using that memory to make the system more responsive.

Originally posted by HL7: You're right -- but that's not how a disk cache works. Windows doesn't try to anticipate what you'll do next and begin to cache data into memory (not yet, anyway -- it sounds as if SuperFetch may change that).

And it was SuperFetch we were discussing - which is effectively a prefetch cache for libraries.

That's why I gave the usage patterns I did. For common shared DLLs - standard Windows libraries, or possibly eventually .NET assemblies - it'll be OK. But the more apps you use, the less use a prefetch cache will be.

quote:

But you've provided nothing but personal opinion as rebuttal.

Relevant opinion, though. Whereas your reply is less relevant, because you're talking about read/write caches, not prefetch caches, which is what we were talking about. Sorry if that wasn't clear...

Incidentally, freeing a cache block may not be negligible. You need to know what's in a cache - you need to maintain an index of it. You can feel free to "just mark the memory as free", but if you do then your caching strategy will get confused pretty quickly. If you don't maintain some kind of index, your cache quickly becomes useless because you don't know what's in it. So it's not one memory operation, it's several to free up cache memory.
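A minimal sketch of that point: even a trivial LRU cache has to keep a key-to-data index, so freeing an entry means updating the index and the recency ordering, not just flipping a "free" bit. (Illustrative only; no real cache implementation is this simple.)

```python
from collections import OrderedDict

class IndexedCache:
    """Minimal LRU cache: eviction is several bookkeeping steps, not one."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.index = OrderedDict()  # key -> data, least recently used first

    def get(self, key):
        if key in self.index:
            self.index.move_to_end(key)  # touch: maintain the recency order
            return self.index[key]
        return None

    def put(self, key, data):
        if key not in self.index and len(self.index) >= self.capacity:
            self.free_one()              # make room first
        self.index[key] = data
        self.index.move_to_end(key)

    def free_one(self):
        """Release the least recently used entry; returns the freed key."""
        if not self.index:
            return None
        key, _ = self.index.popitem(last=False)  # updates the index as well
        return key
```

Whether that extra bookkeeping is "negligible" is exactly the disagreement in this thread; it is a handful of memory operations, but still far cheaper than a disk read.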

Your point about cache usage when writing large files is well taken, but I picked my usage patterns fairly carefully. If Windows is using large amounts of memory for a prefetch cache, and I then begin consuming that memory rapidly (by, for instance, loading Photoshop), the cache resize operations will mount up. Photoshop ends up consuming large amounts of memory, but does it in small nibbles as it allocates memory for each plugin when starting. Each time you load a new plugin, you'd get a new cache-freeing operation from the OS as it makes room for it.

Of course, if all you ever do is use Photoshop, then each cache operation is still faster, because copying the data from the cache and freeing up the memory afterwards allows you to preclude a read operation for loading the cache. So in some scenarios, you win. Except for the minor fact that you had to load all those modules into the cache, so all you achieved was (potentially) reading them into memory on boot rather than on loading Photoshop. It's not faster, it just seems faster - because the work was done sometime else.

This is one of the slightly odd tendencies in modern computing, for me. You're not speeding the operation up, you're just doing it at a time when the user is more patient (during boot). It's an interesting trade-off, but I'm finding current implementations annoying.

Windows XP already implements a prefetcher (http://en.wikipedia.org/wiki/Prefetcher), which is part of my concern. The XP prefetcher doesn't use as much memory as SuperFetch seems to be billed as using, it instead builds a special on-disk cache that enables faster finding of DLLs for loading.

I find that my XP boxen start up and get me to a logon screen much faster, but are no more usable than the Windows 2000 machines. Why? Because half the time, not all my services are up when the XP logon screen appears. It's relying upon the performance of the prefetcher to get everything up and running by the time I've logged on, which just doesn't happen. My Windows 2000 boxen invariably get to a logon screen much later, but logon itself is faster because there's far less going on in the background as I log on.

With loading applications, Windows 2000 is typically identical for anything not prefetch-cached. Prefetch caching gains me half a second or so - because I have fast hard disks and lots of RAM. So is the prefetch cache worthwhile? I'm not sure it is. It's using disk space and CPU cycles for a very negligible benefit...

Caching always sounds like it'll be a great solution to your performance problems. But as I found out when I tried to write them into my own applications over a decade ago, caches are much harder to get right than they seem. It really isn't a case of just throwing some RAM at the problem and hoping it goes away...

Originally posted by shawnce: As another example... Mac OS X has a subsystem called the "Unified Buffer Cache". This is a cache shared (and supported) by the Virtual Memory system and the filesystem stack.