Linux/Unix/Cygwin/Bash.exe

Just kidding about the path. I'm wondering if it's appropriate to ask a question about the bash shell in here, even though it's used in Cygwin (a windoze-based Unix environment). It is a reversing-related question, since I'm reversing PaiMei, using SoftICE, to help me see why it won't work on my system. To help me with the dense Python code, I am building a debug version of Python. Since many of the Python makefiles, in wxWindows and wxPython, are Unix-based, I loaded Cygwin and had to dig into the Unix OS to understand what was going on.

I had the path working really well in bash, but it went AWOL. If I use 'printenv PATH', it only shows the Cygwin path, and not the Windows path. Before, it showed the Windows system path, then the Windows user path, then the Cygwin path. This is important to me because I need the Visual Studio path at the front. Otherwise, Cygwin tries to use the gcc compiler, since it's first in the path.

I'm using this scripted path statement in .bashrc in my home directory:

The 'cygdrive' references, for anyone unfamiliar with Cygwin, are Cygwin's way of mounting drives in the Windows system. I don't get the warning message at the bottom and I don't get the Windows path ($PATH).

I had to put the Visual Studio path at the front because bash no longer puts the Windows path at the front. Can anyone see why? The 'do' statement covers that, and it should append $PATH (the Windows path).
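For reference, a typical way to put Visual Studio in front of the Cygwin path from .bashrc looks something like this. This is only a sketch, not the poster's lost script, and the Visual Studio install path is an assumption:

```shell
# Sketch of a ~/.bashrc fragment that puts Visual Studio ahead of
# Cygwin's /usr/bin so cl.exe is found before gcc. The VSPATH value
# is an assumption -- point it at your actual install.
VSPATH="/cygdrive/c/Program Files/Microsoft Visual Studio/VC98/Bin"

# Prepend only if not already present, so re-sourcing .bashrc
# doesn't keep growing PATH.
case ":$PATH:" in
  *":$VSPATH:"*) ;;              # already there, do nothing
  *) PATH="$VSPATH:$PATH" ;;     # put it at the front
esac
export PATH
```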

Another problem peculiar to Cygwin has to do with the /usr/bin directory. I Googled this problem and saw references to it but no solutions. If I go into /usr or /usr/bin, I can see files in there using ls -l. In Windows, I can see the directories, but no files show up. In fact, in a Cygwin shell working out of a Windows directory, bash can't find files in /usr/bin. I'm wondering if that happens in Linux/Unix as well.

It was mentioned that /usr/bin is linked with /bin in the Cygwin root and that the solution is to remount /usr/bin, but I don't have enough info to have confidence in that solution.

Here are my bash config files. They should be easy to understand.
To install, just copy the contents of the config directory into your home directory. Or, better, make symlinks to the files and directories there, so you only need to keep the config dir and not a dozen things in ~.

To modify your path, just edit the file .bash/path. If you want, you can also add a path by making a symlink to it in the .cdpath/ directory.

Btw, if you create a file named HOSTNAME.(...).sh (HOSTNAME being the name of your machine, (...) being whatever you want: aliases, functions, etc.), it will be loaded automatically by .bashrc; this lets you keep a different config for each machine.

Thanks Relica. Your alias files are impressive. It will take me a few days to digest all this, but it gives me ideas on how to use basic Unix commands.

I'm not sure if the answer to my disappearing /usr/bin files is in the info you supplied. If I look at /usr/bin in Cygwin with a Windows file manager, I see nothing. If I look at the same directory under Cygwin, using ls -l, I see many files. If I look at /bin, in the Cygwin root, with a Windows file manager, I see all the files.

Thanks Relica. Your alias files are impressive. It will take me a few days to digest all this, but it gives me ideas on how to use basic Unix commands.

You're welcome. Don't be afraid to use and abuse the man and info commands to learn more. You might also find the bash guides available here interesting: http://tldp.org/guides.html .

I'm not sure if the answer to my disappearing /usr/bin files is in the info you supplied. If I look at /usr/bin in Cygwin with a Windows file manager, I see nothing. If I look at the same directory under Cygwin, using ls -l, I see many files. If I look at /bin, in the Cygwin root, with a Windows file manager, I see all the files.

This is because it's cygwin.dll that maps them there from the Cygwin installation directory. Every Cygwin app sees them because, when it accesses the file system, it goes through cygwin.dll. Just use Explorer to look at your installation directory, and you'll find everything there.

A good place to start looking for info about Cygwin is of course their website; check:
- http://cygwin.com/faq/faq-nochunks.html
- http://cygwin.com/cygwin-ug-net/cygwin-ug-net.html

I hope this clarifies things a little.
Relica

I promise that I have read the FAQ and tried to use the Search to answer my question.

The 'cygdrive' references, for anyone unfamiliar with Cygwin, are Cygwin's way of mounting drives in the Windows system. I don't get the warning message at the bottom and I don't get the Windows path ($PATH).

You might want to have a look here about the cygdrives: http://cygwin.com/faq/faq-nochunks.html#faq.using.accessing-drives

Another problem peculiar to Cygwin has to do with the /usr/bin directory. I Googled this problem and saw references to it but no solutions. If I go into /usr or /usr/bin, I can see files in there using ls -l. In Windows, I can see the directories, but no files show up. In fact, in a Cygwin shell working out of a Windows directory, bash can't find files in /usr/bin. I'm wondering if that happens in Linux/Unix as well.

Indeed, both /bin and /usr/bin link to the bin directory where you installed Cygwin. No, it's not usual on Linux to have /usr/bin linked to /bin, but it may happen. Why not try Linux? http://www.ubuntu.com/

I hope this helps,
Relica


This is because it's cygwin.dll that maps them there from the Cygwin installation directory. Every Cygwin app sees them because, when it accesses the file system, it goes through cygwin.dll. Just use Explorer to look at your installation directory, and you'll find everything there.

A good place to start looking for info about Cygwin is of course their website; check:

thanks again for the URLs and info. What you say about Cygwin's DLL makes sense, and that would explain why a Windows file manager can't see files in /usr/bin. However, bash can't see files in that directory either if it's operating out of a cygdrive-mounted Windows directory.

For example, I was working out of /cygdrive/i/<mydir>. Bash was looking for the gcc compiler in /usr/bin and claiming it could not find it.

With respect to Cygwin info, I have spent hours reading through manuals and docs.

Indeed, both /bin and /usr/bin link to the bin directory where you installed Cygwin. No, it's not usual on Linux to have /usr/bin linked to /bin, but it may happen. Why not try Linux?

that makes no sense to me, which leads into my resistance to learning Linux. BTW... you'll be sorry you asked that question. I have tried several times, most recently with Red Hat Fedora 7. Let me try to explain without insulting any Linux users. By the same token, I'm open to criticism or advice, even if it's from Delta or JMI telling me to quit posting bombastic replies.

I'd love nothing more than to see Linux take over from Windows. I don't think there's a chance of that happening till the Linux crowd gets together and simplifies the OS and the documentation. It's way too complex for the average user to learn, and only someone who is driven by a hatred of Microsoft, or some other strong passion, could afford the time and energy to overcome the difficulties.

I realize Linux has gone a long way in that direction by introducing X Windows and making the installation more user friendly. There's still a long way to go, in my humble estimation. There's a saying that too many cooks spoil the broth. That's the problem I see in Linux. When I try to read Linux documentation, I get irritated by the assumptions and the jargon. Most of what I read presumes an understanding of what came before. When I studied engineering, one of my first lessons was that a drawing or document should be self-explanatory. It should stand on its own with no support required from other documents or drawings.

What the heck am I doing asking questions in a Linux-based forum?? I'm still trying!! I finally found a project that forced me to follow through, and that has helped. But it hasn't reduced my frustration much with learning Linux.

Please don't take this as an insult to your interest in Linux; I'm only expressing my frustration. I'm an old gaffer, in my 60's. I grew up in the era when Unix was king of the hill, and I don't miss that era one bit. You can't begin to imagine how archaic it was, with teletypes and punch cards for entering data into a compiler. Even in the early '80's, when I worked as a computer tech, it was archaic. Hard drives were 18" platters and held 5 megs of data. RAM was 4k (not meg), and that was on a minicomputer.

When I look at the Unix/Linux command line system, it sends shivers up my spine. It was designed for teletypes and punch card readers. I have tried to immerse myself in it but I keep asking myself why I'm doing it. A good parallel is the retro music industry. I also grew up with primitive electronic musical equipment, like analog synths which were monophonic (Moog) and the Fairlight, which was way ahead of its time. Today, the Fairlight is a fossil, yet many young people are drawn to that technology, for whatever reason.

The same applies to vacuum tubes. I used to repair tube amps as a technician. Many people today claim tube amps are superior to solid state amps, but from the perspective of a tech who understands both, I think that's a load of hooey. Today's solid state amps are far superior to any tube amp ever built, and their clarity is so superior that people mistake it for bad sound. What is heard in a tube amp is distortion, which adds harmonics to the original sound, or filtering due to the whopping output transformers, making it sound warmer. You can do exactly the same thing with a solid state amp by adding effects, or using FETs, which don't cut off as sharply as BJTs.

I can't help getting that same retro feeling about the Linux/Unix OS's. That's purely personal, I realize, and is based on my past experiences. I am attracted to Linux because it's free and not put out by Microsoft. Also, I appreciate the packages available with Linux, like the gcc compiler, and the ability to compile kernels, etc. Those are all bonuses. I even liked the KDE desktop, although I'm not quite sure why. I just like it. It was Cygwin that rekindled my interest in Unix, mainly because I could jump back and forth between it and Windows.

You can keep the command line interface, however. I realize DOS is bottom end stuff, but it's all I ever need. If I want to view a directory in DOS, I type dir. If I want it page by page, I type dir /p or dir|more. I have tried ls | more in both Linux and Cygwin and I find it's more miss than hit. What's with that?

I just typed ls | more in /usr/bin under cygwin and got:

/bin/more: line 1: $'\r': command not found

followed by another 30 lines of similar errors.

I understood what you said above about /usr/bin being linked to /bin, but why couldn't bash figure that out? Both /usr/bin and /bin are in the PATH, but bash messes it up. I 'know' there will be a reason, but why is it so darned complex? If an error like that occurred once in a while, I could endure it. With Linux, it seems par for the course.
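For reference, one common cause of these $'\r' errors is a script saved with Windows CRLF line endings, which leaves a carriage return stuck to each command name. It can be reproduced and fixed like this (file names are throwaway examples, not Cygwin's actual /bin/more):

```shell
# A script saved with DOS CRLF endings leaves a carriage return on
# each command name, which bash then fails to find:
printf 'true\r\n' > /tmp/crlf-demo.sh
bash /tmp/crlf-demo.sh 2>&1 || true   # $'true\r': command not found

# Stripping the \r characters makes the same script run cleanly:
tr -d '\r' < /tmp/crlf-demo.sh > /tmp/lf-demo.sh
bash /tmp/lf-demo.sh && echo "runs cleanly"
```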

I would think it would simply flag an error, as in DOS. Instead, it is trying to do something else because it obviously doesn't understand the command. That's where I think DOS, albeit simplistic, is ahead of Unix for the average Joe. It deals only with its command interpreter, and if it doesn't understand, it says so. Unix is so complicated it chokes if the input is not precise. For a university academic, that's fine. For Joe Blow, coming to Linux from DOS, it's far too complex, and unnecessarily so.

Why would the designers of DOS, Linux or Unix have a directory listing that runs the files by on the screen so fast that no one can read them? I'm sure there was a reason in 'the day', but why today? With the 'ls' command however, there doesn't seem to be an easy way to run it page by page without using a pipe. When you use a pipe like 'more', it leads to problems at times. DOS uses the command interpreter and it has certain basic commands in it. No matter what directory you're in, if you type 'dir' it will execute it, or flag an error. That doesn't seem to be true with the bash shell and it's probably because Unix doesn't use a directory structure like DOS. Everything is a file, which makes no sense to me intellectually.

I'm reluctant to give Bill Gates credit for anything. To me, he and Microsoft became successful despite themselves. They were in the right place at the right time. When they designed DOS, which I think was a ripoff of CP/M, they had all the advantages of knowing the pitfalls of the Unix system. In the early days of DOS, it was somewhat lacking, but for the average Joe, it was adequate. In those days, the claims made that Unix was vastly superior to DOS were true, but today, it's a moot point.

Command lines have gone the way of the dodo bird. Only a masochist works with the command line. OK...I know they are necessary at times. I've been living on the command line recently, trying to build a Python debug version and trying to run makefiles from the Cygwin command line. I had completely forgotten about running the C compiler from the command line, however, because I hadn't done so for 20 or more years.

Please don't talk to me about emacs or vi. Like I said, I was around when they were invented. I was at university studying engineering in the mid-70's and studied Fortran as part of computer science. There were no personal computers, and anything done was on mainframes. When we wrote Fortran programs, we went through a convoluted process of data entry. First, we had to enter our programs line by line into an interpreter, which spat out punch cards. Then we took our cards to the Holy Grail, which was never seen, and humbly inserted our cards into a card reader. Then we bowed, as in the Seinfeld Soup Nazi skit, and waited. As often as not, we'd get back a printout claiming a syntax error and we'd have to start the process anew.

Emacs and vi remind me too much of that archaic system. They were obviously written under the constraints of teletypes, which were going through their death throes in the early '80's. There's no reason I can see to write files of any kind under a command line app. Anything you need can be done in windowed environments. There's no need to fiddle with the complex nature of emacs or vi, unless you are constrained to command line Linux or Unix.

Please don't take this as an argument favouring DOS over Unix. Even I can see that Unix is way ahead of DOS as an OS, but I don't think it's way ahead of Windows NT. I'd be perfectly happy to carry on using XP if it were not for the stupidity and arrogance of Microsoft. I get so angry at them, over nonsense like Wizards (which are really Dummies), that I could spit. Microsoft always seems to find a way to handcuff the user, presumably because they think we're too stupid to figure things out. So, they take away the functionality of helping ourselves. Then there's the bs of designing an OS (Vista) to appease the Nazis in DRM.

The main problem I have in making the leap from Windows to Linux is the hassle of converting to a Linux system. Under Windows, I can find almost any app or driver I need to perform a function. I know Linux has come a long way, and I see the name Ubuntu a lot. I'm not sure that I can find the drivers I need under Linux to run printers, sound cards, etc. And how about running softice, or IDA?

I'd like to hear from anyone, like yourself, who has expertise in this.

Hehe, your ramblings and persistent reversing adventures are always welcome contributions WaxfordSqueers, don't worry. I just hope that relica and other knowledgeable and helpful Linux professionals won't take any offense from such a post, but rather see it as valuable feedback/insight from a typical non-hardcore, would-be Linux user. I would indeed like to see Linux become bigger too, and for that to happen some changes, based on feedback like this, are most likely needed. Ubuntu is indeed a step in the right direction.

And relica, we really appreciate to have more Linux professionals hanging around here, so I hope you will find it enjoyable to stick around. And please feel free to bring some friends too, to discuss even more Linux related stuff.


I just hope that relica and other knowledgeable and helpful Linux professionals won't take any offense from such a post, but rather see it as valuable feedback/insight from a typical non-hardcore, would-be Linux user.

that's exactly how I meant it. There was absolutely no disrespect intended, rather a frustration which is probably due more to my past and perhaps to a misunderstanding of the Linux/Unix OS's. I could have kept my answer much shorter, but I was hoping someone else might benefit from my frustration (or maybe I was revealing my stupidity).

I really appreciate Relica taking the time to share his knowledge and his bash shell. That helped a lot. It was also very helpful that Relica has expertise in Cygwin. I highly recommend that package to anyone using Windows who would like to compare both Windows and Unix live.

It's very handy to view a Unix directory (sorry... file) from a Windows file manager. I quickly learned the difference between a DOS-based text file and a Unix text file (DOS uses 0D 0A as a line terminator, and Unix just 0A), and how UltraEdit converts between the formats easily.

As an example, Visual Studio chokes on a project file written for a Unix system when it converts an earlier variety to the current type. Running all the dsw and dsp project files through UltraEdit, and saving them as DOS types, makes them readable by VS.
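The same dsw/dsp conversion can be scripted without UltraEdit; a sketch with placeholder file names, adding the 0D before each 0A:

```shell
# Rewrite a Unix-terminated project file with DOS CRLF endings so
# Visual Studio can read it. File names are placeholders.
printf '# Microsoft Developer Studio Project File\n' > /tmp/demo.dsp

# awk prints each line back with an explicit \r\n terminator
awk '{ printf "%s\r\n", $0 }' /tmp/demo.dsp > /tmp/demo-dos.dsp

# od -c shows the added \r before each \n
od -c /tmp/demo-dos.dsp
```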

Thanks for the tip, but I have already used that one. I was looking for '.make' files and couldn't see them in the directory. After scratching my head a bit, and scaring small birds nesting in there, I came across the -a option. It reveals the '.' files as far as I can see.

The problem I'm having is peculiar to Cygwin, I think. The directory /usr/bin is linked to root/bin (maybe I'm using root in the wrong manner) through the Cygwin DLL. When a Windows file manager tries to read it, it doesn't use cygwin.dll and can't see files in /usr/bin.

One thing I found cool about Unix was the 'alias' command. You can redefine a command like ls -al as la, making it a lot easier to type. Read through relica's bash file...it's full of little tricks like that. If the aliases are defined in a startup script, they are always available.
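A couple of concrete examples of the kind of aliases described above (common conventions, not necessarily relica's exact definitions):

```shell
# Short names for common listings; put these in ~/.bashrc and they
# are defined in every new shell.
alias la='ls -al'   # long listing, including dotfiles
alias ll='ls -l'    # long listing
alias ..='cd ..'    # up one directory

# 'alias NAME' with no value prints the current definition
alias la
```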

... and only someone who is driven by a hatred of Microsoft, or some other strong passion, could afford the time and energy to overcome the difficulties.

I've been using Linux since 1994-95 (before Chicago). You learn on the way. There's no need for formal study of Linux. You start. You bitch about it. You find your own way of doing things with it. You make the rules. I'd be surprised if it was not the same thing with Windows... Maybe it's not the OS that changes, but the way your brain interprets it.

I realize Linux has gone a long way in that direction by introducing X Windows and making the installation more user friendly. There's still a long way to go, in my humble estimation.

Yes, indeed. X was there from the very beginning... window managers changed the game (KDE, GNOME, ...). But I must admit that Linux isn't ready for the desktop.

There's a saying that too many cooks spoil the broth. That's the problem I see in Linux. When I try to read Linux documentation, I get irritated by the assumptions and the jargon. Most of what I read presumes an understanding of what came before.

You cannot expect a man page about "ls" to explain the TCP or UDP protocols to you.

When I studied engineering, one of my first lessons was that a drawing or document should be self-explanatory. It should stand on its own with no support required from other documents or drawings.

I'm quite sure you didn't explain how to make concrete. I don't know how to make concrete, and if I was reading your paper I'd probably never know. I'd need to do some research on my own.

What the heck am I doing asking questions in a Linux-based forum?? I'm still trying!! I finally found a project that forced me to follow through, and that has helped. But it hasn't reduced my frustration much with learning Linux.

The very first time I tried Linux, it was a Slackware distro, one of the first. I found myself, alone, in front of a prompt, and every command I typed resulted in something like this: "bash: dir: command not found". Everybody thought I was good with computers, but that night I realised that I knew nothing. And I started to learn.

I'm an old gaffer, in my 60's. I grew up in the era when Unix was king of the hill, and I don't miss that era one bit. You can't begin to imagine how archaic it was, with teletypes and punch cards for entering data into a compiler.

I can imagine. I made some programs with my father on really old IBM machines back in the '80s, written on punch cards; I was even able to read them, and even to understand the program behind them.

When I look at the Unix/Linux command line system, it sends shivers up my spine. It was designed for teletypes and punch card readers. I have tried to immerse myself in it but I keep asking myself why I'm doing it.

Perhaps because over 70 percent of the market is running various Unixes and you'd like to get $70,000 a year as a system administrator...?

A good parallel is the retro music industry. I also grew up with primitive electronic musical equipment, like analog synths which were monophonic (Moog) and the Fairlight, which was way ahead of its time. Today, the Fairlight is a fossil, yet many young people are drawn to that technology, for whatever reason.

Fairlight is still alive! http://www.fairlight.to/ ;-)

I can't help getting that same retro feeling about the Linux/Unix OS's. That's purely personal, I realize, and is based on my past experiences. I am attracted to Linux because it's free and not put out by Microsoft. Also, I appreciate the packages available with Linux, like the gcc compiler, and the ability to compile kernels, etc. Those are all bonuses. I even liked the KDE desktop, although I'm not quite sure why. I just like it. It was Cygwin that rekindled my interest in Unix, mainly because I could jump back and forth between it and Windows.

It's not about free versus non-free. We are reversers; we don't care about copy protections. The last time I reversed an application, it was an FTP client for the European Parliament. The fact is, if you use Unix (Solaris, etc.), you've got the sources for everything. You don't like something? Don't complain; write it another way!

You're free to do whatever pleases you.

You can keep the command line interface, however. I realize DOS is bottom end stuff, but it's all I ever need. If I want to view a directory in DOS, I type dir. If I want it page by page, I type dir /p or dir|more. I have tried ls | more in both Linux and Cygwin and I find it's more miss than hit. What's with that?

Let me guess: you never tried to set the DIRCMD environment variable? And Pirates With Attitude means nothing to you?

DIRCMD=/P/W/A/O:ends

If you need to manage files try:

mc

That's a Norton Commander clone, from just the era you were talking about.

I understood what you said above about /usr/bin being linked to /bin, but why couldn't bash figure that out? Both /usr/bin and /bin are in the PATH, but bash messes it up. I 'know' there will be a reason, but why is it so darned complex? If an error like that occurred once in a while, I could endure it. With Linux, it seems par for the course.

It's not the job of bash to figure that out. You have to tell it where it should look. It's the same for Windows. Try to run a command which is not in your PATH environment variable, like "nero": it doesn't work. Yet if you add something like "c:\program files\ahead\nero" to your path, it will run nicely.
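The point about PATH can be demonstrated directly in bash (the directory and command names below are throwaway examples):

```shell
# A command is only found once its directory is on PATH.
mkdir -p /tmp/pathdemo
printf '#!/bin/sh\necho found\n' > /tmp/pathdemo/mytool
chmod +x /tmp/pathdemo/mytool

command -v mytool || echo "not in PATH yet"   # lookup fails

PATH="/tmp/pathdemo:$PATH"                    # add the directory
command -v mytool                             # now resolves
mytool                                        # prints: found
```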

I would think it would simply flag an error, as in DOS. Instead, it is trying to do something else because it obviously doesn't understand the command. That's where I think DOS, albeit simplistic, is ahead of Unix for the average Joe. It deals only with its command interpreter, and if it doesn't understand, it says so. Unix is so complicated it chokes if the input is not precise. For a university academic, that's fine. For Joe Blow, coming to Linux from DOS, it's far too complex, and unnecessarily so.

Don't talk about DOS if you don't know NDOS or 4DOS, please. And bash isn't trying to do something behind the scenes. zsh would try it, but not bash; it's the Stonehenge of the shells...

Why would the designers of DOS, Linux or Unix have a directory listing that runs the files by on the screen so fast that no one can read them? I'm sure there was a reason in 'the day', but why today? With the 'ls' command however, there doesn't seem to be an easy way to run it page by page without using a pipe. When you use a pipe like 'more', it leads to problems at times. DOS uses the command interpreter and it has certain basic commands in it. No matter what directory you're in, if you type 'dir' it will execute it, or flag an error. That doesn't seem to be true with the bash shell and it's probably because Unix doesn't use a directory structure like DOS. Everything is a file, which makes no sense to me intellectually.

Now I'm wondering if you're just a troll wasting my time. I wanted to help you, but now I think you don't even read the manual. It's a PEBKAC. Just for the sake of your understanding, it's not bash's job to show errors; it's the program you're running that should give back an error code.

I'm reluctant to give Bill Gates credit for anything. To me, he and Microsoft became successful despite themselves. They were in the right place at the right time. When they designed DOS, which I think was a ripoff of CP/M, they had all the advantages of knowing the pitfalls of the Unix system. In the early days of DOS, it was somewhat lacking, but for the average Joe, it was adequate. In those days, the claims made that Unix was vastly superior to DOS were true, but today, it's a moot point.

Please, you do not know DOS, do not talk about CP/M. I used CP/M.

Command lines have gone the way of the dodo bird. Only a masochist works with the command line. OK...I know they are necessary at times. I've been living on the command line recently, trying to build a Python debug version and trying to run makefiles from the Cygwin command line. I had completely forgotten about running the C compiler from the command line, however, because I hadn't done so for 20 or more years.

It looks like I'm a masochist then. Not necessarily: while you work all afternoon to get a document finished, I just do it in less than 80 characters... I love people who do the whole job; then I can take the credit for it.

And I like SM. With a girl I'm confident with.

Learn ASMembly on x86.

Please don't talk to me about emacs or vi.

UltraEdit! http://www.ultraedit.com/ If I have nothing else, vim is good enough. But both vi and emacs suck; they're just a pile of shit. Before UltraEdit I was using an even better text editor (back in the '90s) called Aurora. http://www-personal.umich.edu/~knassen/aurora.html

The main problem I have in making the leap from Windows to Linux is the hassle of converting to a Linux system. Under Windows, I can find almost any app or driver I need to perform a function. I know Linux has come a long way, and I see the name Ubuntu a lot.

I'm not sure that I can find the drivers I need under Linux to run printers, sound

Try it again. Now. Forget what you know and learn new things.

And how about running softice, or IDA?

IDA runs fine, and even OllyDbg runs fine via Wine. But what is the point of reversing applications when everything is free?

I'm not sure I understand you.

If you have a problem with bash, just ask, you're welcome. If you have an ethical problem with yourself, just go see a psychiatrist.


Your reply started out well and I was impressed that you took the time to break down my reply and give intelligent responses. I appreciated that. Then you got stupid. You questioned if I was trolling. All you had to do was look at my 350+ posts to see if I was a troll. But you saved your built-in angst for the end, with this statement:

***I'm not sure I understand you.

If you have a problem with bash, just ask, you're welcome. If you have an ethical problem with yourself, just go see a psychiatrist.***

This forum is run by decent, intelligent people and the users don't operate that way. We don't flame each other. We rib each other, mildly insult each other at times, but it's all in good fun. You were obviously bothered by my long post regarding my frustration with Linux, but instead of asking me to clarify what I meant, you jumped to disparaging conclusions.

I made it clear in a reply to Delta that I was in no way trying to insult you or any other Linux user. Where you got the troll from, or the suggestion that my reply made me mentally unbalanced, is a product of your mind, not mine.

I'll give you a civilized response, but I'm sure you won't be around to read it after your negative comments at the end.

Originally Posted by relica

I've been using Linux since 1994-95 (before Chicago). You learn on the way. There's no need for formal study of Linux. You start. You bitch about it. You find your own way of doing things with it. You make the rules. I'd be surprised if it was not the same thing with Windows... Maybe it's not the OS that changes, but the way your brain interprets it.

the question I posed was "why am I doing that?". Unix is an old operating system dating back to 1969, and it was written for the constraints of the equipment of that era. That was my point. I've lived through that era and I don't see a reason for going back to it. I was hoping someone might provide a good answer. I'm not so stupid or so ungrateful that I'd rip someone who is helping me. You asked a question and I gave you a reply. There was no intent to knock Linux/Unix, as in a flame. I simply told you about my experiences and feelings about it, hoping you or someone else might convince me otherwise.

Yes, indeed. X was there from the very beginning... window managers changed the game (KDE, GNOME, ...). But I must admit that Linux isn't ready for the desktop.

that is the kind of answer I was looking for, because it has been my experience too.

You cannot expect a man page about "ls" to explain the TCP or UDP protocols to you.

of course not, but it might explain why the ls command won't dump data one page at a time. What were the original designers thinking? Did they expect you to read the files as they rushed by on the screen? I don't think so. They knew about piping it through 'more', but there's no reference to that in any Linux docs I've read.
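For the record, the standard idioms for paging a listing, since ls itself never pages (the directory below is just an example):

```shell
# Pipe through a pager; 'less' (if installed) also scrolls backwards.
ls -l /usr | more

# Or take one screenful at a time without a pager:
ls -l /usr | head -n 25
```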

every command I typed resulted in something like this: "bash: dir: command not found". Everybody thought I was good with computers, but that night I realised that I knew nothing. And I started to learn.

why did you persist? What was your motive? Was it because you thought Linux was a superior system, or did you just want to be different? That's what I'm trying to understand... why do I want to learn Linux, as opposed to just using Windows? You already said it's not ready for the desktop, which has been my experience too. I had already loaded Fedora 7, that being about the 5th time I'd loaded a Linux distro. It just sat there. I had no use for it other than poking around to see how it worked. Everything I needed could be done in Windows or DOS.

Perhaps because over 70 percent of the market is running various Unixes and you'd like to get $70,000 a year as a system administrator...?

that hasn't been my experience on the west coast of Canada. I work in the electrical/electronics field, and that takes me into many office environments. I have not seen a Unix system yet. Everything is run off an NT server.

I can see universities running Unix, or laboratories. Then again, universities are notoriously conservative. They still teach that electrical current flows from positive to negative in electrical engineering, even though a graduating engineer must design for a practical world in which it was agreed long ago that current flows negative to positive.

Fairlight is still alive! http://www.fairlight.to/ ;-)

this is a YouTube demo of the Fairlight by Herbie Hancock and Quincy Jones:

http://www.youtube.com/watch?v=n6QsusDS_8A&feature=related

hitting the run button should play it, otherwise you'll need to search for it on the page.

This complex system was very high tech in its day, and it still sounds decent. Only someone like myself, with an electronics/music background, realizes what a fossil it now is. I wouldn't take the entire package if someone gave it to me.

It's not about free versus non-free. We are reversers; we don't care about copy protections. The last time I reversed an application, it was an FTP client for the European Parliament. The fact is, if you use Unix (Solaris, etc.), you've got the sources for everything. You don't like something? Don't complain; write it another way!

you have obviously misunderstood the intent of my long reply. I was not writing to complain about Linux/Unix; I was expressing my frustration at learning it, and wondering if the effort was necessary.

Europe has a different way of doing things, and a different economic environment than North America. Over here, corporations like Microsoft get away with questionable tactics both in full view and behind the scenes. Many people feel powerless to do anything about corporate power that is so worshipped by many of our politicians, and the only way to hit back is to develop an open-source (free) equivalent. Having something be free in North America can be a big deal.

Let me guess: you never tried to set the DIRCMD environment variable? And Pirates With Attitude means nothing to you?

if I am following you, you're talking about a DOS environment variable. I have used DOS since its inception and I hardly find a need for it anymore. I've forgotten more about DOS than I remember. I have never needed to define my path other than to enter it in my autoexec.bat or config.sys file. XP has an entirely different way of doing things.

It's not the job of bash to figure that out. You have to tell it where it should look. It's the same for Windows.

the PATH is defined in .bashrc and it appears when I type 'printenv PATH'. Both /usr/bin and /bin show up in the printenv PATH results.
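When printenv shows the right PATH but lookups still fail, bash can also show exactly what it resolves; a few standard diagnostics (using ls as a stand-in for the missing command):

```shell
printenv PATH    # the search path itself
type ls          # what the name 'ls' resolves to; in bash,
                 # 'type -a ls' lists every match along PATH
command -v ls    # the path of the first match
hash -r          # flush the shell's cache of command locations
```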

zsh would try it, but not bash; it's the Stonehenge of the shells...

thank you...I think that's what I've been trying to say.

Now I'm wondering if you're just a troll wasting my time. I wanted to help you, but now I think you don't even read the manual.

the problem is much deeper than that. We have a language difference and we are communicating in an artificial medium. Things are being misunderstood... there is a need for patience and understanding.

Please, you do not know DOS, do not talk about CP/M. I used CP/M.

Why, why, why??? It was used on the Z80, which I believe came from the Pleistocene era.