To be more specific, I used to program in Matlab (plotting with arrays). It was really simple with commands like imagesc and plot.

Working with gnuplot currently. I have seen a tutorial which extracts arrays from .dat files and creates plots. Currently I'm trying to extract the arrays from the output of my executable. It seems a bit hard, but I'm working on it :)

10-23-2012

Adak

Quote:

Originally Posted by Nooby 1 Kenooby

Thank you for the responses.

To be more specific, I used to program in Matlab (plotting with arrays). It was really simple with commands like imagesc and plot.

Working with gnuplot currently. I have seen a tutorial which extracts arrays from .dat files and creates plots. Currently I'm trying to extract the arrays from the output of my executable. It seems a bit hard, but I'm working on it :)

Well, of course! ;)

This was made by the guys who helped make Linux, and they NEVER have done a clear, simple thing, in their friggin' lives.

Looks like a good tool, though!

10-24-2012

Nominal Animal

Quote:

Originally Posted by Adak

This was made by the guys who helped make Linux, and they NEVER have done a clear, simple thing, in their friggin' lives.

Are you deliberately trying to be offensive, or are you really just that ignorant?

The friggin' Unix philosophy is to do one thing, and do it well. KISS principle, you know. Gnuplot does that, and extremely well.

Stuff like Matlab, and basically everything you use in Windows, is exactly the opposite: big, complex applications trying to be the end-all, be-all. Big, lumbering application suites that just get more complex and less useful as time goes by.

I really feel like I just stepped into bizarro land. Just because you're used to some weird-ass Windows contortions does not mean that's the clear, simple way to do it. You're a slave; you just refuse to see the reality of it.

Here's how simple Gnuplot actually is. Have your program output one sample (set of values) per output line. You can add comment lines if you start them with #. If you output a function (f(x)), print the argument (x) in the first column, and the function value (f(x)) in the second column. If you have multiple functions, just add more columns. For example,
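Assuming a data file named data.dat whose three columns are t, x(t) = t*cos(t), and y(t) = t*sin(t) (the file name and the spiral values are my own illustration), a minimal plot command looks like this:

```gnuplot
# Plot column 2 (x) against column 1 (t):
plot "data.dat" u 1:2 w l t "x(t)"

# Or plot column 3 against column 2 to draw the 2D spiral itself:
plot "data.dat" u 2:3 w l t "2D spiral"
```

u is short for using, w l for with lines, and t for title; each plot command draws a complete figure from the named columns.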

If you have multiple functions in one file, you can omit the file name in subsequent clauses (just use two double quotes, ""). u 2:3 tells Gnuplot to use columns 2 and 3.
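For instance, with the same assumed data.dat, two curves from one file can be drawn in a single command:

```gnuplot
# The empty file name "" reuses the previous file in the same plot command.
plot "data.dat" u 1:2 w l t "x(t)", "" u 1:3 w l t "y(t)"
```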

If you want to compare your results to an algebraic expression, just go ahead: replace the file name and the use-columns with a function (using x), and it works.
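A sketch of that, with x*cos(x) standing in for whatever expression you want to compare against (my placeholder, matching the assumed column-2 data):

```gnuplot
# Data as points, the algebraic expression as a line, in one figure:
plot "data.dat" u 1:2 w p t "data", x*cos(x) w l t "x cos(x)"
```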

For parametric curves, use the parametric mode. Here, this draws the same 2D spiral as above in parametric mode using 1000 parametric sample points in [0,50], then plots the data from the file on top of it:
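A sketch of that parametric command (again with my assumed data.dat holding t, x, y columns):

```gnuplot
set parametric
set samples 1000          # 1000 parametric sample points
set trange [0:50]         # parameter t runs over [0,50]
# First the analytic spiral, then the file data on top of it:
plot t*cos(t), t*sin(t) w l t "spiral", "data.dat" u 2:3 w p t "data"
```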

Gnuplot can also do bar graphs, 2D surfaces, and 3D plots, and can add titles, modify axes and their labels, and so on, just as easily. Check out the links on the Gnuplot help page for tutorials.

10-24-2012

Adak

After spending a lot of time reading through Linux stuff, I can't resist taking a swipe at GNU/Linux's loquaciousness. GNU didn't write Unix; Bell Labs did (Thompson et al.). They wrote part of Linux, however.

I'm glad you took offense at it and offered some examples, however. OK if I goad you into offering another example next month sometime? :cool:

10-24-2012

Nominal Animal

Quote:

Originally Posted by Adak

GNU didn't write Unix, Bell Labs did. (Thompson, et.al.).

Right. In the early days of computing, everything was open source (in practice, if not in principle). Then some started closing stuff up. That was what gave Stallman the reason to start GNU and the free software movement; he wasn't really trying to start something new, just trying to stop things from getting much worse. (That's my understanding of the relevant history; reality and your view may differ.)

That was also the reason why Linux took off: the license made sure all enhancements and fixes to the contributors' work could be incorporated back in the future. I suspect Linus found the license to fit his intent best, not because of ideological needs or similarity, but simply because the GPL scratched the same itch for him that caused Stallman to start the movement in the first place.

(Let's not get into a GPL versus BSD flamewar here. I do believe that Linux is way more popular than any of the BSD variants because of the license, not because of some kind of technical detail. Whether you find that good or bad is your problem.)

The basic principles behind most GNU tools and utilities are still the same as they were in that original Unix scene, and that is what I was referring to. In that sense the BSD and Linux camps are very similar. That should not surprise anyone, though, because, for example, the KISS principle is really just common sense to anyone who creates or works on tools, be they programs or actual objects; it is something that has been found to work, time and time again.

Quote:

Originally Posted by Adak

I'm glad you took offense at it, and offered some examples, however. OK, if I goad you into offering another example next month, sometime? :cool:

Bah, I'm way too easy to manipulate/anger/fool. :redface: As long as it is for a good purpose, I won't mind.

However, if anyone is interested in some specific case, I'd be happy to show a complete detailed example. Just let me know the details of your data and plot requirements.

10-24-2012

christop

Quote:

Originally Posted by Adak

After spending a lot of time reading through Linux stuff, I can't resist taking a swipe at GNU/Linux's loquaciousness. GNU didn't write Unix; Bell Labs did (Thompson et al.). They wrote part of Linux, however.

It's odd that you would call Linux loquacious. Usually people complain that it's too laconic. I would call Nominal Animal loquacious (in a good way), not Linux. But that's one of the features of *nix: programs don't say anything unless something is wrong. It's like that old saying: "If you can't say something bad, don't say anything at all."

Contrast this with systems that like to tell you when something is successful. "Look at me! I just copied a file! 1 file(s) copied!" or "I just moved a file! 1 file(s) moved!" It's much harder to use such commands in scripts (or "batch files", as they're called on said systems).

It's doubly bad when a system is inconsistent in its use of stdout and stderr. Warning and error messages (as well as success messages on those brain-dead systems) belong on stderr. Normal program output (that which you can expect to send to a file or another program) belongs on stdout. Guess where those "1 file(s) copied." messages are printed? stdout. Sane systems let you send stderr to /dev/null to keep unnecessary warning and error messages out of the output, while scripts handle the exit status themselves.
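That separation is what makes quiet scripting work on the POSIX side; a small sketch (the file names here are made up):

```shell
# Try a copy; discard any diagnostic text on stderr and
# branch on the exit status instead of parsing output.
if cp /no/such/file /tmp/dest 2>/dev/null; then
    echo "copied"
else
    echo "copy failed"
fi
```

Because cp stays silent on stdout and reports failure through its exit status, the script needs no text parsing at all.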

Then there are some supposedly simple commands on those systems which turn out not to be so simple. Take the ECHO command, for example. This is a simple command that prints whatever you tell it to, right? Wrong. It leads a double life as a command that does something completely different from that one simple task. ECHO ON and ECHO OFF (or any variation in capitalization) turn command echoing on or off, which is very much a different task from printing text to stdout.

It gets worse. This interpretation applies even when trying to print the contents of a variable. Say the variable "FOO" contains the text "On". The command "ECHO %FOO%" will turn echoing on, rather than printing "On". This simple command is not so simple now, is it? You have to use some bizarre syntax like "ECHO.ON" or "ECHO.%FOO%" (note the period), which I've never seen documented anywhere (I had to discover this strange "feature" myself).

Believe it or not, it gets even worse. Now you're using the bizarre syntax "ECHO.whatever text" or "ECHO.%variable%" to avoid running ECHO's "evil twin" command (ECHO ON/OFF). Say the variable "FOO" contains the text "EXE", and a user happens to have a program called "ECHO.EXE". Now running the innocent-looking command "ECHO.%FOO%" runs the program "ECHO.EXE" instead of printing out "EXE".

There's no way to safely use ECHO to print user-supplied (or program-supplied) text. It's kind of like the gets() function in C (except that gets() was deprecated in C99 and removed from the language in C11, while ECHO has not been so much as deprecated, even though it should be).

Sadly, ECHO is not an isolated example of a simple command turned into an almost unusable abomination because it does too much, or does things so haphazardly that it's even dangerous in some contexts. I've never seen "echo" or "cat" or "touch" or "tail" or "head" or "ls" or "mv" or "cp" break on any benign-looking input the way CMD's "ECHO" command does, because they do one thing and do it well.
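(For what it's worth, even Unix echo has portability warts with leading dashes and backslashes; the robust POSIX idiom for printing arbitrary text is printf:)

```shell
# printf prints the value verbatim, even when it looks like
# an option that some echo implementations would swallow.
FOO="-n"
printf '%s\n' "$FOO"    # prints -n
BAR="ON"
printf '%s\n' "$BAR"    # prints ON, toggles nothing
```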

Quote:

Originally Posted by Nominal Animal

(Let's not get into a GPL versus BSD flamewar here. I do believe that Linux is way more popular than any of the BSD variants because of the license, not because of some kind of technical detail. Whether you find that good or bad is your problem.)

My understanding was that Linux grew while BSD was involved in a copyright case with AT&T in the early 1990's. The court eventually found BSD to be infringing AT&T's copyright in only 6 files (out of hundreds of files), and AT&T was even found to be infringing on some of BSD's copyrights, but it gave Linux some time to spread because people didn't want to get involved with BSD until the legal situation cleared up.

10-24-2012

Nominal Animal

Quote:

Originally Posted by christop

My understanding was that Linux grew while BSD was involved in a copyright case with AT&T in the early 1990's. The court eventually found BSD to be infringing AT&T's copyright in only 6 files (out of hundreds of files), and AT&T was even found to be infringing on some of BSD's copyrights, but it gave Linux some time to spread because people didn't want to get involved with BSD until the legal situation cleared up.

The lawsuit was settled in January 1994. The BSD share of the Top 500 machines steadily lost ground to other Unixes after that. Both Linux and the BSDs gained about the same just prior to 2000. Then, while the BSDs began a slow, steady decline, Linux simply shot up.

I tried to find a similar graph of the historical OS distribution for web hosts -- I think I've seen one somewhere already, but my Google/DuckDuckGo-fu is too weak to find a reputable one now, sorry. The trends there were a bit different, since the Top 500 machines are big investments, and thus tend to be a bit more conservative in their choices.

So, while I do not dispute what you said, I don't think the lawsuit tarnished the BSDs; everybody knew after the lawsuit that the *BSDs were in the clear, I think. And I don't think Linux got "a head start" in the meantime either, based on the graphs (and my own recollection). Just my opinion, though -- and I'm often wrong.
_ _ _ _

No requests for example plots, though? My Gnuplot-fu is itching for action.

10-24-2012

christop

Nobody can say with complete certainty either way what the lawsuit's effect on BSD was after 1994, but the lawsuit almost certainly had a big impact on Linux's development:

Quote:

The lawsuit slowed development of the free-software descendants of BSD for nearly two years while their legal status was in question, and as a result systems based on the Linux kernel, which did not have such legal ambiguity, gained greater support. Although not released until 1992, development of 386BSD predated that of Linux. Linus Torvalds has said that if 386BSD or the GNU kernel had been available at the time, he probably would not have created Linux.

Besides, that Top 500 graph includes only supercomputers, for which Linux wasn't suitable until the late 90's anyway. A graph that shows usage on regular workstations and servers would show a better picture. (I do find it amusing that only 2 supercomputers out of the top 500 are running Windows. :))

It doesn't really matter to me one way or the other. I use various distros of Linux and have used (and enjoyed using) FreeBSD at home. I've also used SunOS/Solaris and HP-UX at a previous job. They're all very similar and generally cross-compatible at the regular user level (POSIX helps tremendously). Heck, I've taken some old C programs from the mid-80's from 2.11BSD (the PDP-11 version of BSD) and compiled and run them on a modern Linux system (x86) without any major changes (if any) to the source. One of those programs, a text-based Monopoly game, supported saved games by saving/restoring the process's own memory space to a file! I was surprised that it actually worked (or appeared to work) on a vastly different kernel, since it's a major hack and not really portable (a better way would be to save the game state to a text file).

The big differences between Unixes are only at the administration level and quality-of-implementation issues (bugs, hardware support, etc).

10-24-2012

Nominal Animal

Quote:

Originally Posted by christop

Besides, that Top 500 graph includes only supercomputers, for which Linux wasn't suitable until the late 90's anyway. A graph that shows usage on regular workstations and servers would show a better picture. (I do find it amusing that only 2 supercomputers out of the top 500 are running Windows. :))

Yeah, I agree, just couldn't find one.

Quote:

Originally Posted by christop

It doesn't really matter to me one way or the other. I use various distros of Linux and have used (and enjoyed using) FreeBSD at home. I've also used SunOS/Solaris and HP-UX at a previous job. They're all very similar and generally cross-compatible at the regular user level (POSIX helps tremendously).

Same here. I even still have an account on one SunOS 5.10 machine. I really just feared this discussion would devolve into a license discussion.

Quote:

Originally Posted by christop

Heck, I've taken some old C programs from the mid-80's from 2.11BSD (PDP-11 version of BSD) and compiled and ran them on a modern Linux system (x86) without any major changes (if any) to the source.

Oh yes. Aside from specific Linux-isms like file leases or fanotify that I sometimes use in my system-level tools, I do expect and try to keep my code portable between POSIX systems. That means it'll work on the BSDs and probably Mac OS X, in addition to Linux and any legacy Unix systems. Freedom to choose, and having the variety, is a good thing.

Quote:

Originally Posted by christop

One of those programs, a text-based Monopoly game, supported saved games by saving/restoring the process's own memory space to a file!

A game I liked back in the day, Starflight I, did the same. I actually wrote an EGA/VGA emulator for its Tandy Jr graphics (in Turbo Pascal and assembler) before someone hacked proper VGA support for it. (I did consider the hacking approach myself, but I don't like it; emulation was much more "my style".) The game is written in Forth, and the developers released the sources later on. I don't enjoy Forth, though.

Quote:

Originally Posted by christop

The big differences between Unixes are only at the administration level and quality-of-implementation issues (bugs, hardware support, etc).

Yes, quite. The same approaches and paradigms are used. Details differ; the general "attitude" or "approach" or "feel" does not. I suspect it is because the old choices were not ideological but practical, and are practical still. (In the current era, most people look for quick rewards, not long-term effects. Perhaps we've already lost something important there? Although I'm not sure we've ever had a proper long-term view; maybe it is just that only the robust stuff has survived thus far in the turbulent but short history of computing?)

I could wax poetic about the benefits of true modularity -- another typical Unix and Linux approach -- and compare it against the prevailing framework mentality, but we're getting too far off the original point already.
_ _ _ _ _

Still no gnuplot example requests? Darn. I was hoping to get to show something nice.