PowerShell is everywhere, it seems. Not just in Windows Server, SharePoint, SQL Server, Exchange, Lync and the Azure cloud, but it’s in third-party software, too. Take VMware PowerCLI – that’s an extension of PowerShell.
With many in the Windows world chewing on this fat PowerShell server software sandwich it’s easy to take …

COMMENTS

And then you discovered 4DOS. Which was damn amazing. And the 4DOS tools worked on any DOS you put them on, too.

But, for years, my computer's scripting - every line from booting to loading drivers to starting games - was a combination of batch files, PC Pro / PC Magazine command-line utilities, and simple freeware.

STRINGS, CHOICE, AMENU, it's all coming flooding back.

Those were the days. When the computer did what you told it to and no more. And if you wanted, you could get 638Kb of base RAM out of 640Kb with enough drivers to play game X or run app X and just stick it in a menu. And a reboot took seconds and pushed you into a nice menu that loaded up the exact configuration you needed for a program, and you never even saw the jiggery-pokery to make it all work but you could at any point.

Can't even squeeze a webpage into 640Kb now. Still have no control over what starts up or in what order when you start Windows, and half the stuff you can't turn off without breaking completely unrelated features (did you know that if you stop the Windows Search service, you can't then add a new keyboard language?).

Never used a batch file compiler because - well, you never needed to. A 386 was more than capable of churning through a batch file in no time at all.

I used to do a lot of command line stuff on an NT DEC Alpha. For various reasons I ran an x86 emulator to shut down 64 Progress databases and back the buggers up. All done through the auspices of a little batch file.

NT4 on the DEC Alpha was far superior to all the other OSes I had, ranging from NT4 x86 to Citrix WinFrame and VAX.

I liked DR DOS too, but you can't really compare it to 4DOS, they were different things. And yes, I have run 4DOS on DRDOS (4DOS v3 on DRDOS v6, I believe, around 1991).

The only real problems I had with DRDOS were that (1) it wouldn't run Windows reliably (no great shock), (2) there was some funny bug that caused the internal "xdir" command to crash the PC occasionally for a reason I could never determine, and (3) it had problems running in a DOS box under OS/2, and *really* didn't like HPFS partitions. But as a standalone DOS replacement, it was great.

Re: Windows under DR DOS?

> I didn't have problems with windows under DR DOS 6, but then the last versions of DR DOS were overshadowed by DOS (er) 5 (?) that actually had some advanced features.

DR-DOS 3.4x supported large disk partitions when MS-DOS was stuck with 32Mbyte per partition (some OEMS (Wyse, Compaq,..) also had large partition support).

DR-DOS 5 offered EMS and HiMem and many utilities and was contemporaneous with MS-DOS 4.01. 20 months later MS caught up with MS-DOS 5. Then DR-DOS 6 added task switching and better memory management, which took the best part of a year to almost catch up with MS-DOS 6. In the meantime MS locked its OEMs into illegal 'per box' pricing so that users had to pay for MS-DOS even if they bought DR-DOS.

4NT for the win

It still is.

It's called Take Command, now, and it's an all-singing, all-dancing command processor as well as a terminal on steroids (think of xterm in terms of functions).

The command processor can run separately; it's called TCC (Take Command Console), and there's a freeware dumbed-down version (still orders of magnitude above the Command Shell) called TCC/LE. You can get it at JP Software, and it's strongly recommended.

I've played with PowerShell, but I still find I can knock out a TCC/4NT/Take Command shell script in a tenth of the time, and it does a hell of a lot more, and easier, than the PowerShell script.

Oh yes.... CMD.EXE /C

For many years, I did almost everything I needed with DOS (2.0 to NTDos) from building and maintaining uniform directory and permission structures for file systems to piloting remote desktop installations and operations with other CLI tools like psexec. I loved and hated it, really.

DOS had many flaws, shortcomings and weaknesses (still does). It is not even comparable to the power of mature *nix command shells, like BASH. However, it was very often good enough for the job, particularly when used in conjunction with other command line programs.

Today's IT youngsters (most of whom are morbidly ignorant of the CLI) don't know what joys and frustration they have missed. I'm not sure that is altogether a good thing.

As a colleague of my generation once said, "they don't remember how easy it was to completely f*k up a system with a few keystrokes" and the discipline that encouraged.

Re: Oh yes.... CMD.EXE /C

I remember setting up many machines with a basic toolbox of programs like 4DOS (and later 4NT), grep, PKZIP etc before I moved on to just using cygwin.

I think everybody in IT from the 80s and 90s who used DOS/Windows would have a collection of tools to extend and make MSDOS/CMD actually useful.

Thing I never understood about MS was the apparent 'not invented here' approach to releasing better tools. They could surely have bought the rights to bundle tools like 4DOS into Windows and quickly improved the rudimentary command line.

Some bundled Windows utilities like Notepad don't seem to have changed much since Windows 3.11 days, despite Microsoft having better free editors available in-house.

I was an SDIR and Norton Utilities man myself, but yes you could do a heck of a lot back in those days. We were using QEMM and Quarterdeck for our memory manipulations. And of course every time MS released a DOS update, they broke.

Piping is the next powerful ability of PowerShell. Piping uses the pipe symbol | to split commands and feed the output of the former to the latter. So get-childitem | where name -notlike Windows would show you the directory listing, but excluding anything that matches the name Windows. You can't do that with a single command prompt line.

Yes you can. The MSDOS command shell supported piping. Try this:

dir c:\*.* /s | more

That's not what the example asks about, but it does demonstrate that piping commands was available in MSDOS and had been since 3.x - maybe earlier for all I know. I don't think there was a built-in command you could pipe to that excluded by name, but it wouldn't have been difficult to write one.
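Such a filter really would have been only a few lines. Here's a sketch in Python rather than the C or assembler of the day (the `exclude_name` helper and the sample listing are invented for illustration):

```python
def exclude_name(lines, name):
    """Drop directory-listing lines containing `name`, case-insensitively
    (DOS listings were uppercased, so a naive match would need this)."""
    return [line for line in lines if name.upper() not in line.upper()]

# Stand-in for piped input, i.e. the kind of lines `dir c:\ /b` would emit.
listing = ["AUTOEXEC.BAT", "CONFIG.SYS", "WINDOWS", "COMMAND.COM"]
for line in exclude_name(listing, "Windows"):
    print(line)
```

A real DOS filter would just read standard input line by line and print what survives - which is all MSDOS piping ever asked of a program.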

Except you could only pipe into certain commands and IIRC you could only pipe once - you couldn't daisy chain them. MS never really "got" the purpose of stdin, stdout & stderr. They still don't as far as I can see.

Well..yes. Piping only worked with programs that had been written to use stdin/stdout. It's unfortunate that for performance reasons a lot of command line programs chose to perform direct I/O rather than going that route, but I'm not sure you can call that a limitation of MSDOS. MSDOS piping works with any application that sticks to the MSDOS API.

PowerShell commands output structured data, not text. In the absence of a consumer, the data is converted to text, but if you pipe it, then the consumer receives objects. In the example in the article, the "where" command filters its input objects by examining their "name" attribute.

If you've ever had to write a lot of shell script on Linux, I'm sure you'll appreciate how useful this could be... especially as many really useful Linux tools produce such machine-unfriendly output.
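As a loose analogy (in Python, not how PowerShell is implemented - the records and field names here are invented for the example), the difference between piping text and piping objects looks like this:

```python
# Text pipeline: the consumer has to re-parse formatted output and hope
# the columns line up.
text_output = "Name         Length\nreport.txt     1024\nWindows           0"
reparsed = [line.split() for line in text_output.splitlines()[1:]]

# Object pipeline: the consumer receives structured records and can filter
# on named fields directly, the way PowerShell's `where` does.
objects = [{"name": "report.txt", "length": 1024},
           {"name": "Windows", "length": 0}]
filtered = [o for o in objects if o["name"] != "Windows"]
print(filtered)
```

The second form is what makes `get-childitem | where name -notlike Windows` possible without any text parsing in the middle.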

> PowerShell commands output structured data, not text. In the absence of a consumer, the data is converted to text, but if you pipe it, then the consumer receives objects. In the example in the article, the "where" command filters its input objects by examining their "name" attribute

Ah, I was wondering when this would start appearing. I could smell XML inter-proc pipes a while back. I remember an XML 'ls' somewhere, with the output terminal taking cues. Next up, p2p negotiation, maybe ending up with JIT compilation... mm.. evil..

Ah now, that I agree with. Sorta. Except that MSDOS was never claimed to be a multi-tasking operating system so it's a bit harsh to criticise it for the workaround of using a temporary file. Clearly Microsoft were aware of the importance of piping and went to some lengths in order to fake it.

As for not supporting objects - well yes that's true but the article didn't use objects for that example. It quite specifically mentioned directory listings. As another commentard with more time on his hands (or a better memory) has pointed out that there was such a filtering program available so the specific example given in the article is entirely possible under MSDOS.

And a note to the downvoters - I'm not attacking PS here. I know it's better and I love it - have interfaced to it from C# on several occasions. The only reason I've posted these comments is to point out a factual inaccuracy in the article.

Microsoft didn't really "get" the idea

> The only reason I've posted these comments is to point out a factual inaccuracy in the article.

And there are others... I get the impression the author doesn't use the Windows command line much except for PowerShell, if that. Also, there were other MS scripting possibilities not mentioned in the article (e.g. cscript/wscript, VBScript, JScript). I've had the... joy? of working with one incarnation of MS-DOS AKA CMD or another for 30 years now. While I think PowerShell is interesting in the way it does things and am pleased with the return to using the command line as the default in MS OS administration, I find the change from CMD to PS as jarring as moving from anything else to Windows 8. I've written scripts to be run on a variety of *NIXes and am having a harder time shifting to PS than learning any of these from scratch. Maybe I have just gone from getting to being old.

PS has a few neat tricks like being able to specify output types that are native to MS Office formats, but I have been able to do that more generically using CSV and RTF for years. Except for things that were designed and created with PS as the default scripting language, I haven't run into anything that I couldn't do previously with CMD.

Essentially, MS has done to admins what they have been doing to all their other users: changing everything, telling us it is for our own good, and forcing us to relearn things that we have been able to do just fine for years. Not much of a productivity boost as far as I can see, but it is the Microsoft way.

Re: Microsoft didn't really "get" the idea

> Except that MSDOS was never claimed to be a multi-tasking operating system

But it *nearly* was.

The original design had all task-context data in a swappable chunk - the SDA. By changing the SDA pointer, you changed the context. There was some grief with changing that whilst certain non-thread-safe DOS operations were ongoing - hence the InDOS flag - but it had clearly been built with multi-tasking in mind.

Sadly, this doesn't ever seem to have been completed (hence the single-tasking nature of what shipped), and each new version seemed to have more and more static data that wasn't in the SDA, leading to lots of work-arounds and side-effects :-(

Pipes

After MS-DOS 1, Microsoft promised proper pipes and multitasking, then delivered MS-DOS 2 with the 'make the first program finish before letting the second one see the results' bodge that lasted for the rest of MS-DOS.

Re: Pipes

> Just like (while I'm here), DOS 3.x did support "partitions larger than 32 MB", through resident driver chaining, and from DOS 2.x supported large disks through installable block devices drivers.

Not from Microsoft it didn't. There were 3rd party add-ons. Some OEMs modified the system, in different ways, to support larger partitions, for example I used 'Wyse-DOS' 3.31 with this. IBM was annoyed that other OEMs had features that were not in PC-DOS (or standard MS-DOS) so they wrote code to create PC-DOS 4.0 and gave it back to MS for MS-DOS 4.0x

http://www.os2museum.com/wp/dos/dos-4-0/

"""Perhaps the most significant change in DOS 4.0 was the introduction of 32-bit logical sector numbers and the consequent breaking of the 32MB partition size barrier. That change wasn’t strictly speaking new, having been first introduced in Compaq’s DOS 3.31 in late 1987. However, beginning with DOS 4.0, every OEM version (starting with IBM’s) supported large partitions."""

Re: Pipes

"IBM was annoyed that other OEMs had features that were not in PC-DOS (or standard MS-DOS) so they wrote code to create PC-DOS 4.0 and gave it back to MS for MS-DOS 4.0x "

The legend is that MS said "DOS is done", fully baked, no more work needed, etc.

IBM added the extra stuff as proof of concept and MS immediately took it(*) to sell as DOS 4.0 - which was unfortunate, as it was bug-riddled. A lot of early adopters lost the entire content of their systems.

(*) The licensing conditions for DOS included a clause that any modifications were the property of Microsoft. This wasn't unusual - Rockwell included the same clauses in its modem chip licensing.

Thankfully at least one machine I owned (Sanyo MBC550) wouldn't run anything newer than DOS 2.11, so I was spared that carnage until well after the event.

Easier than ever nowadays

If I recall correctly, Unix specifications at the time did not require that pipes be implemented in any particular way, and the Microsoft way would have been suitable, although less than ideal. What really counted, though, was that the operating system provided for such things, and pretty much the entire set of standard utilities used stdin and stdout and allowed the shell to connect them fairly arbitrarily using pipes.

> Unix specifications at the time did not require that pipes be implemented in any particular way, and the Microsoft way would have been suitable, although less than ideal.

Named pipes are a feature of the Unix (and Unix like) operating systems. They provide arbitrary data connections between programs. It happens that various shells can use pipes to connect stdout of one program to stdin of another. MS-DOS doesn't have pipes but the shell can provide an emulation in some cases.
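A minimal round trip through a named pipe (a POSIX-only Python sketch; the path and message are made up) shows the "open by name" property that distinguishes it from the anonymous pipe a shell's | creates:

```python
import os
import tempfile
import threading

# A named pipe is a filesystem entry, so unrelated processes can open it
# by path; here two threads stand in for two processes.
fifo = os.path.join(tempfile.mkdtemp(), "demo_fifo")
os.mkfifo(fifo)

def writer():
    with open(fifo, "w") as f:        # blocks until a reader opens the fifo
        f.write("hello through the fifo\n")

t = threading.Thread(target=writer)
t.start()
with open(fifo) as f:                 # blocks until the writer appears
    message = f.read()                # EOF when the writer closes its end
t.join()
print(message.strip())
```

Nothing like this existed in MS-DOS; the `a | b` emulation ran `a` to completion into a temporary file and then fed that file to `b`.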

Actually, that one would fail, as all directory listings were uppercased, so you wouldn't get any files listed.

One very handy piped command I used to use a lot:

CHKDSK /V | FIND "textstring"

CHKDSK /V all by itself would give you a scrolling list of every file on the disk. Piping that output to the FIND command would result in a list of only the files where textstring matched. A 'file find' program built in to DOS, and it was free.

How is "dir c:\*.* /s | more" the same as "So get-childitem | where name -notlike Windows" - yours will stop at each page waiting for a keypress, mine lists everything excluding the directory name 'Windows' ?

Re: Obscure knowledge got me a job ....

Ah, con, aux, prn - they still lurk in today's cmd line.

copy con "file.txt" (CTRL-Z)

Not so long ago a test program I was using, designed to test file system security permissions, would occasionally randomly break, for no apparent reason. The youngish chap who wrote it was randomly generating file names. After digging through a lot of tracing, every now and then, the random file name generator was attempting to create files called "con", "aux" and "prn", creating a big spanner where none was expected.
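The rule the generator tripped over: CON, PRN, AUX, NUL and the COM1-COM9/LPT1-LPT9 families are device names inherited from DOS, and Windows reserves them regardless of extension. A sketch of the check it needed (Python; `is_reserved` is my own name for it):

```python
# Device names inherited from DOS; Windows reserves them with or without
# an extension, in any directory.
RESERVED = {"CON", "PRN", "AUX", "NUL",
            *(f"COM{i}" for i in range(1, 10)),
            *(f"LPT{i}" for i in range(1, 10))}

def is_reserved(filename):
    """True if `filename` would collide with a reserved device name."""
    stem = filename.split(".")[0]   # "con.txt" is just as unusable as "con"
    return stem.upper() in RESERVED

# A random-name generator should simply reroll anything that trips this.
print(is_reserved("con"), is_reserved("data.txt"))
```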

Re: Obscure knowledge got me a job ....

Ehhh, I still use copy con when I'm creating small test files. Old habits die hard.

And actually, on several of my 2012 servers, I use batch files for backup jobs (along with an rsync client). Easier, more comprehensible than Powershell, IMHO. And more straightforward. Put too many powershell one-liners in a script, and a year later I'm like "what the fuck did I do here?" Only time that happens in a batch file is if I try to get really fancy with a FOR command.

Powershell has some uses, although after 4 years of using it, I really think its main strengths are dealing with "new" Windows or Office365 features that are Powershell enabled. For stuff like that (especially dealing with O365, and to a lesser extent, AD), some of it is much, much easier to do in Powershell than in a GUI, webpage, or a batch file. But many times it's easier to do something using a simple batch file and an executable utility rather than try to figure out the arcanum to invoke it in Powershell (or worse, have to drop to the underlying .NET stuff).

And the key to keeping your sanity is to remember Powershell is a SCRIPTING language, not a PROGRAMMING language, even though MS tried really, really hard to make it look like programming.

Re: Obscure knowledge got me a job ....

To each his own, I guess.

I never cared for VB script. For places where I could have used it, I'd usually go into VB6 instead. Always seemed easier to just copy an EXE around to various machines than deal with the scripting engines.

Re: Obscure knowledge got me a job ....

"stuff like rename that zip file with the prefix 20150428 is a right PITA with batch files."

Yes, but the newer versions of the SET command can do that kind of stuff fairly easily (for some values of "easily"), it's just dog-ugly syntax and a real "WTF?" moment a year or two later without comments explaining it.

Re: Obscure knowledge got me a job ....

I actually went out and bought a copy of IBM DOS 3.3 specifically because the manual which came with it came with full explanations and examples for every single command.

The MS-DOS manuals were poor things by comparison.

That IBM manual stayed around until the late 1990s, because it was a useful reference tome. It went walkies when shifting house and I suspect one of my "helpers" decided it was more use to him than to me.

Re: paths

Even most clueless unix sysadmins generally set "." as the _last_ PATH entry.

With DOS it was the first one, which made trojan horsing much easier

I had to deal with a number of infestations caused by that and the old BIOS flaw of looking at the floppy before trying to boot off HDD. Every single computer I worked on that could have the boot order changed, did so (often to the consternation of "experts" who would advise users not to boot with floppies in, and then have their demonstrations fail)

Re: paths

With DOS it was the first one, which made trojan horsing much easier

True and arguably it's even worse than that. That's because command.com always looks in the current directory and only resorts to PATH after checking the cwd. So even if your PATH variable is empty you'll still execute programs in the cwd. I'm pretty sure there is no way to stop command.com looking in the cwd first but I vaguely recall that with 4DOS at least if PATH did contain '.' then it overrode the default behaviour and thus at least allowed you to push it further down the list.
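The lookup order described above can be sketched like this (a Python stand-in for command.com's behaviour, simplified - no extension search and no internal-command table; the fake filesystem is invented for the demo):

```python
import os

def resolve(command, path_entries, exists=os.path.exists):
    """Mimic command.com's search: the current directory is always tried
    first, then each PATH entry in order -- so even an empty PATH still
    executes whatever is sitting in the cwd."""
    candidates = [os.path.join(".", command)]
    candidates += [os.path.join(d, command) for d in path_entries]
    for candidate in candidates:
        if exists(candidate):
            return candidate
    return None

# With a fake filesystem: the copy in the cwd shadows the one on PATH --
# exactly the trojan-horse problem the comment describes.
files = {os.path.join(".", "DIR.COM"), os.path.join("DOS", "DIR.COM")}
print(resolve("DIR.COM", ["DOS"], exists=files.__contains__))
```

Note that passing an empty `path_entries` list changes nothing for the cwd case, which is the point being made: you can't empty your way out of it.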

Re: paths

Yes, PCDOS 1.x was primarily a floppy disk system, and it always ran programs off the current floppy disk, and did not require a path to do so. When it morphed into something completely different, this started to be a problem.

no, it did not require a path, and dot was never an element of the path. It did become common to put things like root or dot dot into the path, and the order of the path was commonly set for optimum speed, because, until very late in the piece, MSDOS did not cache the file listing, and even on Hard Disks, a path search could be noticeably slow.

Re: paths

Piping and conditional logic

Thats where command lines excel. GUIs are great for single tasks that can be visualised - eg drag and drop or button clicking - but for disperate non visual or abstract tasks that need to be linked together and require some glue logic and maybe looping , well , some GUIs have been designed that can do that (Scratch programming language for example) but its easier just to use an old fashioned command line whether its powershell on windows or bash on linux.

Re: Piping and conditional logic

"GUIs are great for single tasks that can be visualised - eg drag and drop"

If you've ever seen what goes on behind the scenes with "drag and drop" you'd run screaming.

Multiselect (Unixen, Macs and Windows alike) results in a series of INDIVIDUAL commands. I've seen many systems brought to their knees by that kind of shitty behaviour (particularly if the target directory is on a network fileserver)

"Commands and paths had to be typed in full in the MS-DOS days, there was no fancy time-saving command-line auto completion. This feature popped up much later in the picture, when the command prompt had almost become a forgotten artifact of the pre-GUI era."

DOSKEY was available from MSDOS 5.0 and later.

Powershell is powerful but, I use it maybe once a month, and then mostly for Chocolatey.

But I have a command window open on my machine nearly all day. At the moment there are 3 of them.

Dir *.txt is a hell of a lot faster to find a file than trying to find it in a window.

And there were many other keyboard and command line enhancers before then. My personal favourite was Chris Dunford's CED (command editor) back in 1987, three years before DOS 5 came out. I liked it enough that I bought the professional version, PCED. Sadly, it had some conflict with the Smartkey keyboard enhancer, as I recall, but by then, we were already playing with the (late, but not lamented) Command Plus shell, and later 4DOS, rendering DOSKEY moot.

Sadly, I now have a quirky application which for unknown reasons doesn't like to run in TCC/Take Command, and so I *must* launch it from a DOS shell, forcing me to learn/relearn DOSKEY, thirty years later...

Discovery

I was completely unaware of Powershell until I found it on the Win 10 Tech Preview. It is a definite step in the right direction, I suppose, but for me it won't replace the cygwin linux command line / utilities package I always install on a new system before I do anything else.

Re: Discovery

I stopped using powershell the moment I realised that a simple AD command to do something (I think it was related to promotion of a certain role, but can't remember off-hand) had gone from an 8-character name to something so long and unguessable that - even with autocomplete - there were ten similarly named, stupidly long, easy to confuse commands and that in any tutorial they had to be written out correctly and not jump off the sides of the screen when you typed them because otherwise it was too easy to hit the alternates.

That's not what you want when you're playing with AD in a Powershell box.

Re: Discovery

I always had the admin at 3am in the middle of an IT crisis in mind. I thought about the desperation that person would feel if they were trying to understand what had occurred and opened up a PERL script and needed to understand it. When that person opens up a PowerShell script, they will be able to read it and understand what happened.

That is why things tend to be more verbose - because verbosity is your friend when the chips are down. As you correctly point out, verbosity is not your friend for interactive use. That is why we provide aliases, positional parameters, wildcards, etc.

At the end of the day, we build tools to make you successful so if you are successful with the tools you are using - then it's all good.

Re: Discovery

"I always thought about the admin at 3am in the middle of an IT crisis in mind. I thought about the desperation that person would feel if they were trying to understand what had occurred and opened up a PERL script and needed to understand it. When that person opens up a PowerShell script, they will be able to read it and understand what happened."

People can write unreadable code in pretty much any language out there, and at 3AM the chances are grokking anything is going to be harder than usual... So instead of forcing people to learn new tools, syntax and conventions at 3AM, how about just presenting them with something familiar & well proven - like Python packaged with a bunch of libs to facilitate doing tasks on Windows boxes?

Re: Cygwin

"Anyone else remember that Windows NT originally came with a POSIX-subsystem?"

Yes, I do. I mainly remember because it wasn't actually shipped with the POSIX subsystem in working order (as of NT 3.51); you had to install it off an extra CD. The advertising was very misleading. In my experience that feature was more successful at convincing mentally defective PHBs that NT could run code currently running on UNIX boxes than it was at actually doing its job...

GUIs can be great ...

Because a good GUI can help a great deal towards presenting a quick logical overview of what on earth it is you are doing. You can grey out controls, or link them so that you know selecting an option requires additional parameters. You can also ensure mutually exclusive commands can never be issued. And you can provide tooltips to assist with more obscure or lesser-used options. Best of all you tend to work in generics, rather than specifics - you want the outcome to be "Delete temporary files on completion" - or is it -B? -D? --delete-temp-files-on-exit? --cleanup?

re: Windows XP was the first PC operating system to drop the MS-DOS

Re: re: Windows XP was the first PC operating system to drop the MS-DOS

I think he meant the first consumer-facing system. They ran in parallel with 95/98/ME and were intended for serious applications (proper 32-bit programs, multi-user, etc).

Sadly in the push to make consumer & professional lines converge and be fast enough for gaming, compatible with older badly written software (some of it MS' of course!), etc, a lot of dumb decisions were made w.r.t. security, etc.

Re: re: Windows XP was the first PC operating system to drop the MS-DOS

Also, I think he messed up about Windows ME ??? - one of the great complaints about it was that it wasn't possible to just "boot into a pure MS-DOS prompt by pressing the right start-up bypass keys", because they had removed that feature ???

Re: re: Windows XP was the first PC operating system to drop the MS-DOS

The same paragraph explained that:

Windows XP was the first PC operating system to drop the MS-DOS Prompt and change it to Command Prompt, due to a change to the NT kernel. The Windows NT family has used the newer Command Prompt since it started with Windows NT 3.1, so it was nothing new on that side of the fence.

Re: The same paragraph explained that:

Funny thing is ...

... those of us who were already using UNIX[tm] in the early-mid 1980s found MS-DOS's command.com to be a brain-dead command interpreter. To this day, Microsoft hasn't really figured the concept out. IMO, of course.

Back in the 90's I used to teach courses on how to get the most amount of memory out of 640Kb. I also wrote DOS programs (anyone remember EasyEdit - the best text editor at the time (so Byte said)) which used overlays to move stuff in and out of extended memory to leave the most free real memory.

The "640K should be enough" attributed to Bill Gates is a myth. On the original IBM PC, MS/PCDOS could use 760K(ish) of so-called "low-mem", before it ran into IBM's built-in hardware stoppage. Which was an IBM hardware issue, not a Microsoft coding issue. Eventually, we figured out how to use nearly 950K of low-mem.

The real "should be enough" quote was from Steve Jobs, when demoing the original Apple Macintosh at the Home Brew Computer Club, a couple weeks before the official unveiling. He said, and I quote, "256K should be more than enough for home users" ... and he had a point. We had flight simulators running in 64K of RAM back then.

EasyEdit? I've been using vi from time immemorial ...

Sometimes I look at the modern world and despair over the sheer waste ...

Re: @ Alan Sharkey

> On the original IBM PC,

On the _original_ IBM PC (5150 Model A) it would only support 256Kb max, no matter how many cards you could afford. Base memory was 16Kb for ROM BASIC and the cassette port. Model B (I have one here) supported max 640Kb.

> MS/PCDOS could use 760K(ish) of so-called "low-mem", before it ran into IBM's built-in hardware stoppage.

IBM reserved the areas above 640Kb for hardware adaptor memory. The CGA card occupied addresses at 640Kb. If only an MDA or Hercules card was used then another 64Kb could be used to give 704Kb. Anything beyond that required memory management hardware such as an EMS or EEMS card that could switch address spaces around.

However, later MS-DOS (5 or later), DR-DOS, QEMM or others on a 286 or later could emulate EMS and could shift the OS into high memory or beyond 1Mb.

> Eventually, we figured out how to use nearly 950K of low-mem.

Not on a 8088 based PC or PC XT you didn't. There were machines that could support almost the full addressable 1Mb of a 8086/8088. SCP Zebra series for example, or other S100 bus based systems. The Sharp MZ-5600 that I have here also could utilise 512Kb for OS and programs the other 512Kb address space was reserved.

I do have other 8088/8086 machines that can use the full 1Mb but they run Concurrent-CP/M-86 on several serial terminals.

> Which was an IBM hardware issue, not a Microsoft coding issue. Eventually, we figured out how to use nearly 950K of low-mem.

Re: @ Alan Sharkey

From /In the Beginning was the Command Line/ by Neal Stephenson

[...] Note the obsessive use of abbreviations and avoidance of capital letters; this is a system invented by people to whom repetitive stress disorder is what black lung is to miners. Long names get worn down to three-letter nubbins, like stones smoothed by a river.

This is not the place to try to explain why each of the above directories exists, and what is contained in it. At first it all seems obscure; worse, it seems deliberately obscure. When I started using Linux I was accustomed to being able to create directories wherever I wanted and to give them whatever names struck my fancy. Under Unix you are free to do that, of course (you are free to do anything) but as you gain experience with the system you come to understand that the directories listed above were created for the best of reasons and that your life will be much easier if you follow along (within /home, by the way, you have pretty much unlimited freedom).

After this kind of thing has happened several hundred or thousand times, the hacker understands why Unix is the way it is, and agrees that it wouldn't be the same any other way. It is this sort of acculturation that gives Unix hackers their confidence in the system, and the attitude of calm, unshakable, annoying superiority captured in the Dilbert cartoon. Windows 95 and MacOS are products, contrived by engineers in the service of specific companies. Unix, by contrast, is not so much a product as it is a painstakingly compiled oral history of the hacker subculture. It is our Gilgamesh epic.

What made old epics like Gilgamesh so powerful and so long-lived was that they were living bodies of narrative that many people knew by heart, and told over and over again--making their own personal embellishments whenever it struck their fancy. The bad embellishments were shouted down, the good ones picked up by others, polished, improved, and, over time, incorporated into the story. Likewise, Unix is known, loved, and understood by so many hackers that it can be re-created from scratch whenever someone needs it. This is very difficult to understand for people who are accustomed to thinking of OSes as things that absolutely have to be bought.

Re: Which is to say...

"More accurate to write: "slowly reseparating". Recall that Cutler's team started with text-mode before Bilge ordered the GUI bolted on regardless."

As good as some of Cutler's work has been and as smart as he is, I feel people are a bit too quick to put him on a pedestal when it comes to WNT.

1) I would fully expect WNT development to have started out in "text-mode", simply because developing all those graphics drivers, GUIs and supporting libraries would have taken a very long time. I would *expect* Cutler et al to have debugged and interacted with those early kernels via "text-mode" over an RS-232 port, or perhaps via a VGA card (text only, natch).

2) When Cutler was hired and developing NT, GUIs were the thing people wanted to buy, so he should have known up front that a GUI would be the main way of interacting with the new OS; he would have to have been deaf, dumb and blind not to see which way the wind was blowing at Redmond. To give Cutler his due, I am fairly certain he would have had a big problem with a lot of aspects of the bits outside the kernel in WNT, and would agree that WNT would have looked totally different if Cutler had had full control over its development... Pretty sure he would have strangled Win32 in its cot for starters. ;)

The reason OSes & drivers were often developed in "text-mode" is that driving an RS-232 interface or VGA card doesn't require much in the way of code, and there is very little to go wrong with it. For those reasons a lot of UNIXen, their admins and users have carried on using "text-mode". That said, I fully expect pretty much any Linux distro to boot into a GUI and work by default these days. ;)

Re: Which is to say...

Microsoft is still learning to reinvent Unix -- slowly separating text-mode core OS from graphical layer; learning the importance of a rich command line; learning to write graphical commands that emit said CLI, easing automation. But it's not doing it terribly well.

You could say the same thing in reverse for Linux and the desktop. I can't recall the number of flavors of Linux GUIs/apps I've tried over the years only to toss them out because they were too much hassle to make work. In the end, as a consumer of a desktop OS, I want to use it for productivity.

Ubuntu is the latest trend and it is getting better, but I would never throw it at my users.

And managing users in *nix is a joke. LDAP is king, and MS has the single best implementation of that technology to date.

Back on topic, Unix systems have had, hands down, the best command line power for decades. At this point, I'd say PowerShell is getting MS to where it needs to be to be a serious tool for command line junkies. But it sure wasn't there to begin with!

Not dead yet

Curiously, there seems to be evidence that the command prompt in Windows still has a tiny spark of life in it. (Or maybe I'm late discovering features in the obscure and hard-to-find documentation.) "set /?" now delivers three screens of help, and includes features like string replacement and delayed variable expansion. You can write surprisingly capable scripts now. Unfortunately, there seems to be some rule that any new feature has to be invoked by obscure metacharacters. I suspect this is a legacy of the original feeble MS-DOS parser.
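For comparison, bash has an analogous string-replacement feature in its parameter expansion; a minimal sketch, with the variable name and values invented purely for illustration (cmd.exe's equivalent syntax is %VAR:old=new%, while ${var/old/new} is bash-specific):

```shell
# bash counterpart of the cmd.exe string replacement mentioned above.
# Variable and values are made up for illustration.
p="C:/Program Files/App"
echo "${p/Program Files/PROGRA~1}"   # prints: C:/PROGRA~1/App
```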

I'm reasonably sure that the first versions of MS-DOS did offer command-line editing. It used function keys F1 to F9(?) and it's still available in Windows 7, although some of the functions now produce a popup prompt which obviously wasn't there in MS-DOS.

Re: Not dead yet

I'm reasonably sure that the first versions of MS-DOS did offer command-line editing.

I don't remember a version that didn't - although I might simply have forgotten some...

The early stuff used F1-F3; F1 would repeat the next character in the history buffer. F2 and a character would repeat up until the next occurrence of that character (and ISTR you could prefix a number to repeat up until the nth occurrence), and F3 would repeat the whole line.

Written badly

"Windows XP was the first PC operating system to drop the MS-DOS Prompt and change it to Command Prompt, due to a change to the NT kernel. The Windows NT family has used the newer Command Prompt since it started with Windows NT 3.1, so it was nothing new on that side of the fence."

All versions of NT from 3.1 to 5.1 (XP) offered a choice between the 32-bit native console (which looked like a DOS prompt) and running the MS-DOS shell via NTVDM (a real DOS prompt).

NT also could run OS/2 scripts and console executables as well as native NT scripts.

Powershell

Behaves more like a base scripting language (an amalgam of the worst of bash, PHP and Perl).

Then you need application extensions to provide extra functionality before the base scripting language is of any use.

In Unix you have scripting languages, and anything you install or add to the system is immediately available to anything else you can call from the shell, or to anything that can start a shell, regardless of the language.

Say whatever you want about PowerShell's object-oriented usage: you will not do much with Exchange objects by feeding them to VMware unless VMware adds support for them, nor would you be able to use those objects with tomorrow's latest fart without MS's intervention.
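The Unix side of that contrast can be sketched in one line; the commands below are standard tools chained purely through text streams, neither one knowing anything about the other's implementation:

```shell
# Any installed program that reads stdin and writes stdout composes
# with any other, regardless of what language either is written in.
printf 'banana\napple\ncherry\n' | sort | head -n 1   # prints: apple
```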

As I said to a Windows colleague once: "Relax; you've just discovered scripting. I was as excited as you when I discovered what you could do with .bat files back in the late '80s."

'MS-DOS was lacking other features, too, that many would now consider unforgivable. After typing out an incredibly long command and realising there was an extra letter at the very beginning, all you'd end up with was an unusable chunk of text.' - a bit like when you've composed a txt or email then realise you have a spelling error and are sadly using an iPhone instead of a phone with editing capabilities.

Re: UNICODE/UTF8

That's pretty bad. I understand that Powershell is considered by its users as superior to bash, but at least that's a problem that bash does not have.

I can totally imagine the reasons for which MS would have developed its own rather than going with bash, between the fact bash was considered the competition, that it would have been losing face to adopt it, that they were intelligent enough to create something better, influential enough to get their solution accepted, and so on…

Feels a lot like something Google would do nowadays. MS seems to have grown humble in comparison.

Re: UNICODE/UTF8

Bash?

The MS-DOS shell, like most things Microsoft, is just a poor copy of a standard program - in this case 'sh' or, in Gnuese, 'bash'. Even with tweaks and enhancements it doesn't do most of the things you can do with a real shell, including starting scripts as programs with the '#!' construct.

If you do a lot of embedded work but you're stuck with a Windows platform (a common scenario) then you find that the tools for the most part are 'ix' based, using Cygwin as a shim to give you something like proper OS functionality. This has the side effect of not only giving you a bash to work with if you need it but also being able to use standard commands directly from the Windows command prompt.
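A minimal sketch of the '#!' construct mentioned above: a script that names its own interpreter on the first line can then be started like any other program. The path /tmp/greet.sh is purely illustrative.

```shell
# Create a script whose first line declares its interpreter...
cat > /tmp/greet.sh <<'EOF'
#!/bin/sh
echo "hello from a real shell"
EOF
chmod +x /tmp/greet.sh

# ...then run it directly; the kernel hands it to /bin/sh for us.
/tmp/greet.sh   # prints: hello from a real shell
```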

Upgrading Windows programming with NIX concepts

For more than a decade, most Microsoft Windows users, totally ignorant of the power and productivity of the command line interface (CLI) and of Bash and other shell scripting tools in *NIX operating systems (OSes), condemned the non-Windows OSes as "stone age" and unsophisticated, not knowing that a significant amount of administrative work and system configuration was virtually impossible to perform efficiently and quickly via any Windows GUI, especially on Windows servers, in networking operations and in many application programming projects in desktop PC environments.

Most of the Microsoft Windows users who were my severest critics 8-10 years back are now mentally blank about their earlier views while they take to (painfully) learning and understanding the inevitability of Windows PowerShell in modern technology.

Re: Upgrading Windows programming with NIX concepts

> condemned the non_Windows OS as "stone age" and unsophisticated,

Microsoft worked hard to make their CLI very poor so that they could point out how useless it was in order to convince users to switch to GUI. Even when MS wrote a semi-decent CLI enhancer for Windows 95/98 they didn't install it automatically, didn't mention it in the manual and hid it away.

They even seem to have removed command line options from programs (such as net) so that users were forced to use the GUI rather than have a batch file do stuff automatically.

Falcon 3.0, the ultimate test of your config.sys/autoexec.bat skills?

From what I remember of my dim & distant past, Falcon 3.0 required around 605K of the 640K available to run in AND needed access to extended memory. So you needed to load the extended memory driver (himem.sys), a mouse driver (it needed one) and a sound driver enabler into 35K of memory, as well as the core OS...

This was made slightly more, erm, interesting, by the fact that the order in which they were loaded would affect how much memory they took up. Creating bootdisks for Falcon 3.0 was an art!

My personal demon was MechWarrior 2, which needed to be run off a parallel-port-connected Zip disk because there wasn't room for it on the C drive. Himem, CD-ROM drivers, sound drivers, joystick drivers, LAN drivers and Zip drive drivers... Took DAYS to finally get that bastard to start.
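For flavour, a boot disk of the kind described above looked roughly like this sketch (MS-DOS 5/6 era syntax; the paths, driver names and load order here are illustrative, not a known-good Falcon 3.0 or MechWarrior 2 recipe):

```
REM CONFIG.SYS -- illustrative boot-disk sketch, not a tested recipe
DEVICE=A:\HIMEM.SYS
DEVICE=A:\EMM386.EXE NOEMS
DOS=HIGH,UMB
DEVICEHIGH=A:\CDROM.SYS /D:MSCD001
FILES=30
BUFFERS=20

REM AUTOEXEC.BAT -- LH loads TSRs into upper memory to spare the 640K
LH A:\MOUSE.COM
LH A:\MSCDEX.EXE /D:MSCD001
```

Reordering the DEVICE and LH lines really could change how much base memory survived, which is why building these disks was the art the commenter describes.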

config.sys/autoexec.bat fail!

I remember spending Christmas morning struggling with config.sys and autoexec.bat to get 'Magic Carpet' to work on our 386 computer. It had been a present for 10 yr old son; after about 3hrs I just got it to work, by which time son was totally disillusioned.

Re: config.sys/autoexec.bat fail!

"I remember spending Christmas morning struggling with config.sys and autoexec.bat to get 'Magic Carpet' to work on our 386 computer. It had been a present for 10 yr old son; after about 3hrs I just got it to work, by which time son was totally disillusioned."

Funnily enough I'm still going through that nightmare with our < 10 year old kids at the moment. I installed Win 8.1, dutifully slotted in a Disney DVD and waited for it to play... OK, so there's no DVD playback, kicked off a VLC download and figured I'd put on some music from the DLNA server while we waited... Ah of course Win 8.1 doesn't support FLAC presumably because Microsoft can't afford to pay someone nothing to bundle it... As with most prior Windows installs it turned into a really boring afternoon packed with disappointment.

Needed a mouse, sound, and a 4-color light pen driver to be loaded before it ate about 600K of the 640. This was on a "true" IBM 286 with the 287 math co-pro.

This was the t-o-t-l (top of the line) system: 16 colors, 4 voices, a 2-button mouse, and an 84(?)-key keyboard. A proprietary 8-bit comms card and software to drive it, a dual UART RS-232 card for a dedicated weather station (on a short-haul modem) and their "custom" mouse/brick (with a "special" pad), a Centronics 36-pin parallel port to a 9-pin (IBM) printer, and a DB-9 to the "largest" (14-inch) color display.

Ahh, kids these days... I don't know.. That crap music they play, touch screen this 'n that, social media....


Alias

Don't fret, children. You don't really have to type get-childitem to get the contents of a directory. Those memorable commands still work as aliases. You can also type ls. That's what I usually use. I'm thankful they added that in as a default alias. Switching between Linux and Windows systems I used to always manage to type the wrong one... :)

As for the cmd line being dead: it isn't, really. You can still launch cmd from the run box or a PowerShell window and get all your old commands back.

You can also run commands the old fashioned way in PS by using invoke-expression or invoke-command. Then you can do all sorts of nifty management things with it like error control, writing events to log files or the windows event log, etc.

To all of you die hard batch writers, give PS a chance. You may just expand your scripting chops and stay relevant to the IT world while you are at it...

fondly

"It's easy to look back now and wonder how people put up with such a manual, non-user-friendly system, but personally I still look back on it fondly. Many hours were spent learning every command available, all the switches and what they do."

Yes, indeed. I remember the customer's employee who was working his way through the commands manual one night, and zapped most of the data.

I have done a little scripting with PowerShell, but have not yet warmed to it. I'm not sure why.


ZCPR

As someone else mentioned, MS seems finally to have approached the state of 1975 Unix (or maybe 1983, with Korn shell). I have to add ZCPR for Z80 based 8 bit systems. If I recall correctly, it had a passable imitation of the Unix shell, common utilities, and I think pipes, subject to the limitations of an 8 bit CPU, 64K memory, and lack of multitasking. I saw nothing better until I started playing with Minix 1 as part of a grad school class, and then found a used copy of IBM Xenix 1.0 at an amateur radio flea market. The last saved me a lot of time when learning to handle C pointers and references without the need to reboot at every mistake.

Choices

I'll grant Powershell has a certain amount of advantages if you're running 2008R2/Windows 7 or later with the certainty it's built in, or when needing to perform administration tasks closely coupled with some Microsoft products. However, for other instances, why would I bother using something extremely platform specific when there's the opportunity to use Python or a wealth of Unix derived utilities to solve the problem, and not tie my skills to a particular OS?

I've done some scripting, read the documentation and some of it is extremely well designed and rather neat. For most purposes, though, I'd still rather use Cygwin, Python or C++.

Re: piping

The piping in DOS was also a nasty kludge; it did not support true pipes. It would write the ENTIRE stdout of the first command into a temporary file, and only when this was completely written out would it open the temp file and feed it into the standard input of the next command. I.e.,

dir | more

would write the entire result of "dir" into a temp file, then open the temp file and run it into "more".
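A true Unix pipe, by contrast, runs both sides concurrently; a minimal demonstration:

```shell
# Both ends of the pipe run at once: 'head' reads three lines and
# exits, even though 'yes' on its own would never terminate. The DOS
# temp-file scheme described above could not express this at all.
yes | head -n 3   # prints three lines of "y"
```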

Re: piping

But that's what I'm saying. I've had cases of the pipe not working, probably because the second program tried to load after the first, couldn't, and DOS returned an error to that effect. Like trying to stuff a huge text file (~1MB I think) through more.

idiosyncratic

A compromise between a useful shell and learning something which only has value on Windows, a platform of declining relevance to the world of servers, is bash. Git for Windows comes with bash, and it's nice to have one common shell as I move across OS X, Linux and Windows. PowerShell looks very interesting, but I have not been able to justify learning it yet. For more advanced admin, Python on Windows works well, and once again there is no new learning curve. I wonder if the new Microsoft would have done something as idiosyncratic as PowerShell.

Not so impressed with Powershell

Maybe this has since been resolved, but I remember being underwhelmed by my first major outing with Powershell.

The task was to replace some aging VB scripts that communicated with Exchange Server 2003 via CDO. The exchange server was being replaced with Exchange 2010 which of course no longer supported CDO in favour of Exchange Web Services (EWS).

So we selected Powershell to use the EWS API. We ended up with a custom C# snippet in the Powershell script that implemented an accept-all certificate handler to work around the connection errors we were experiencing between the script and Exchange (as advised by MSDN and MS blogs).

Oh and of course make sure for the love of god that you selected the right number of bits (32 versus 64) when you executed the PS script. And that you selected *exactly* the right version of the EWS API DLL to download and deploy, otherwise PS just threw an exception.

All this just to read certain subject lines from emails in an Inbox. Conclusion: Even in 2014 the tools felt like a poor beta.

(Oh yes, code signing PS scripts with the cmdlet. Sometimes it just wouldn't. But take the script + code signing cert to a similar workstation and then it worked. Weird!)

Just looking for a little Active Directory administration. How do I do it? Oh, here are some examples: In powershell I do it using exactly the same ADO objects I've been using for the last 10-15 years.

Yeah, just watch out for the icacls command - I wrote an all-in-one create-user script last year, and after an hour or so of aggravation I gave up trying to escape the colons and parentheses and just stuck it in a batch file that I called from the create-user script. There are "native" ACL commands in PowerShell that are powerful, but they are even uglier than a batch file and take too much coding - or at least it seemed like a lot of code just to replicate the functionality of a single line call to