while (true) {

I'm going to make myself look foolish here, and I admit I never took a full slate of CS courses, but I feel like programming jargon has taken a turn in the last decade that makes it significantly less comprehensible.

So perhaps this is due to the point being discussed above--more programmers than before are the products of CS programs, so they learned the jargon (to some degree), so they use it more often.

But there's a difference between usage and understanding--many people understand Shuggy but they don't drop the same terminology into their own posts. He's one of the main functional-programming aficionados in the forum so it comes up more often.

Kind of interesting that this is coming up at the exact same time as we are having a debate about using i as an index, which is as mathematical a habit as you can possibly get.

In fact, I'd say that math culture, which contrary to popular perception is actually rather relaxed about notational rigor, is in direct tension with (rather anal) engineering culture in the programming community.

If you're using a modern shell and not tab completing all of your arguments, then you pretty much deserve what you get.

Eh, I suspect some C&P would be involved. So maybe it's trusting documentation/coworkers that's really the culprit. Besides, tab completion doesn't solve every problem - it likes rm -fR /path * just as much as it likes rm -fR /path/*!

In fact, I'd say that math culture, which contrary to popular perception is actually rather relaxed about notational rigor, is in direct tension with (rather anal) engineering culture in the programming community.

I want to scream when people supply functional definitions of the form:

Code:

foo a b c d e = Some one liner curried form of some other function re-ordering or manipulating the arguments

I am not joking. That really happened (though the function wasn't called foo). I wanted to cry.

No docs either; I swear mathematicians self-obfuscate. I'm just glad they haven't started inserting non-ASCII characters...

Structs are supported, but you have to be careful when using them in collections and provide your own equality comparer. The no-jitting rule of MonoTouch seems to trigger an issue when instantiating Comparer<T>.Default where T : struct. The workaround is to consistently provide your own comparer on structs.

There are other surprises that pop up at runtime on devices due to the lack of JIT, but to Xamarin's credit, there is a trend of those issues being caught upstream with each release.

If you're using a modern shell and not tab completing all of your arguments, then you pretty much deserve what you get.

Eh, I suspect some C&P would be involved.

If you can't copy and paste the escapes, you definitely deserve what you get. bash prefers escaping to quoting, so you have to actively go out of your way to make this happen. I don't know about other shells.

If I had USD 1k to spare per target platform (iOS, Android, and Mac), their Visual Studio plugin would be worth it.

We have it for work (not a personal license). It's not.

ronelson wrote:

Let's not forget users who call programs directly and leave out the quotes. And I'm not just talking about the lusers, we've all done it, too. Sometimes it can have interesting results, if the program has a little too much trust in the user.

The last time I did this was around 1996 because tcsh on that machine didn't tab complete (and I was 8). I haven't done it since getting a bash or zsh shell, because all paths are completed with tab completion. At this point it's habit.

The no-jitting rule of MonoTouch seems to trigger an issue when instantiating Comparer<T>.Default where T : struct. The workaround is to consistently provide your own comparer on structs.

No probs; since they used to not have the special casing in there (I forgot they fixed it with on-the-fly code gen), the default one boxed like crazy[1] if you ever used it, so we always avoided it anyway. It may have only been for enums; I can't recall, it's been so long since I dug into it.

I'm certainly used to having significant amounts of code gen happening for performance, though. That might not carry over into muck-about projects.

I'd really rather do it largely in F# than learn Objective-C. Learning Objective-C has close to zero utility for my current job; F# does have some (as a means of keeping me from permanently rusting a bit round the edges with it). If I code in my spare time I'd prefer it to add some value to my day job; it pays the bills, after all.

If you ruthlessly avoid currying[1], auto-tupling of out parameters, and some of the more esoteric closures, it's not too bad. If you want to define records and the like, you can use structs as long as they aren't recursive types (discriminated unions you can't make structs regardless).

The issue is that you can't see some of the ones that are "obvious" in C#.

For hooking things up at the start and then not changing them it's pretty awesome, but yes, I would be leery of putting it into the main loop directly without some decent profiling tools.

Of course, this is all based on the MS implementation on the MS CLR. I believe the Mono implementation is straight from the open-source F# one, but the Mono runtime (and/or the AOT compiler) might affect this considerably.

1. I dislike those over "tuple-based" ones (actually regular .NET methods) as you can't overload them, which results in a binary-compat nightmare

We're using Xamarin.Mac for a Win/Mac app and I have to agree. We're supposed to be done with our project in mid-June, and will most likely be making our final build with an alpha release that includes a custom hotfix(!). That said, not having to deal with Obj-C for the UI code is wonderful. Xamarin is only like 60 people; it kind of feels like their products are too big/ambitious for their current size. Xamarin.Mac will most likely be fairly awesome a year from now.

I agree with you david_a, but I had such an awesome time with AppCode for the past week...

It's like using R# for Obj-C. Once I realized categories + blocks could let me write code that was pretty close to what I'm used to writing in .NET (lambda-heavy stuff), my productivity skyrocketed to levels way beyond what Xamarin let me do.

Shuggy: F# on Xamarin.iOS will never happen until Apple removes their 'no-jitting' sandbox.

Are you actually using any .NET code? Our use-case is fairly similar to Xamarin's marketing examples - most of the code is platform agnostic .NET, with only the UI layer and very bottom USB stuff needing special code. Some of the code we are reusing is close to 8 years old at this point. Our issues are mostly around polish. I think only the happy-path scenarios were well-tested when they launched 2.0, and it doesn't help that Xamarin.Mac is clearly a distant third in terms of focus behind iOS/Android.

Pretty much every tool out there. I have to think hard for one that doesn't. At the moment I'm actively using Netbeans, Bash, Maven, GHC, Hadoop, and a black pen and notepad and all of them can handle spaces in paths.

Pretty much every tool out there. I have to think hard for one that doesn't. At the moment I'm actively using Netbeans, Bash, Maven, GHC, Hadoop, and a black pen and notepad and all of them can handle spaces in paths.

Some tools break when used in a chain. It's probably less of an issue with widely used tools and more with niche or homegrown solutions: one strips out \'s and passes the param to the next item in the chain, which now treats it as multiple params; breakage. I wouldn't know how often that happens anymore, because I was trained decades ago never to use spaces in paths, and it's stuck.

It's not "reality" in modern environments with okay--not even good, but okay--development practices.

Are we reading the same thread? The one where the current Oracle install barfs on directories with spaces in them? Note that I'm not accusing Oracle of having okay development habits, but those who use Oracle.

I guess it depends on how important you consider shell scripting skills for your developers, since that's one of the two major causes.

Were it legal I'd interview developers by dangling them by the ankle over an open volcano, with their only means of escape being the use of a standard GNU system to solve text processing problems and an entirely contrived problem that requires using xargs.

It's not "reality" in modern environments with okay--not even good, but okay--development practices.

I guess it depends on how important you consider shell scripting skills for your developers, since that's one of the two major causes.

I would imagine the other, which is C library calls like popen() and system(), is on a steady decline.

Not saying it's a good habit, but I use C/C++ and system() quite a bit because it seems more portable across linux/bsd and shell variants when I'm in 15 minute or 1/2 hour reusable tool mode instead of "shell script to do what I want to now, once" mode.

Improper use of system() is the issue, not its use in the first place (although it's not necessarily the best choice). When you write the call, you can make sure to handle white space inside arguments.

Are you actually using any .NET code? Our use-case is fairly similar to Xamarin's marketing examples - most of the code is platform agnostic .NET, with only the UI layer and very bottom USB stuff needing special code. Some of the code we are reusing is close to 8 years old at this point. Our issues are mostly around polish. I think only the happy-path scenarios were well-tested when they launched 2.0, and it doesn't help that Xamarin.Mac is clearly a distant third in terms of focus behind iOS/Android.

I have a nasty habit of doing dirty things with generics and dynamic types, and writing a ton of lambdas in my C#, to the point where I should actually start writing F#. Writing C# in the .NET 2.0-era style isn't giving me anything in productivity terms, and I've re-prioritized my appetite for platform-agnostic code. Shame, because I really liked what the MvvmCross guys did in terms of x-platform development.

Not saying it's a good habit, but I use C/C++ and system() quite a bit because it seems more portable across linux/bsd and shell variants when I'm in 15 minute or 1/2 hour reusable tool mode instead of "shell script to do what I want to now, once" mode.

It's more portable, but it's not safe.

Alamout wrote:

Improper use of system() is the issue, not its use in the first place (although it's not necessarily the best choice). When you write the call, you can make sure to handle white space inside arguments.

Proper use of system involves:

Never calling it from a multithreaded program

Never calling it from a setgid/setuid program

Never calling it from a program with more than stdin/stdout/stderr open, unless the extra FDs are set close-on-exec (non-portable & modern), or you're 100% certain that the called program can cope with their existence and will not abuse them

Coping with your SIGQUIT, SIGINT, and SIGCHLD handlers being overwritten for the duration of the call

Never passing a string that contains anything subject to shell interpretation, except spaces to separate arguments. POSIX mandates that the command be executed by /bin/sh, so if you can write a string that's perfectly safe for execution by plain sh(1), you can do that. I won't comment on the difficulty of doing so, across all of UNIX, nor on whether that rule is actually upheld by all implementations (I don't believe it is, but I can't find documentation either way).

The rules are slightly different on Windows too.

TL;DR: on UNIX, the only safe use of system() is something like system("wget http://hardcoded.url") in a non-threaded, non-privileged, non-signal-handling program with only stdout, stderr, and stdin open. Other platforms may relax those restrictions or place additional ones. Even then, your program will behave oddly if Ctrl-C is pressed during the download.

[edit]And those rules only apply to modern UNIX. There's additional mandatory funsies if you want to work on legacy UNIX.

(There are a few good reasons to use it, but not many relative to the number of Oracle users, if you get my drift.)

I think you lead a bit of a sheltered life at work. There are many apps that only support Oracle on the high-end. For instance, VCSA, which is important if you have any sort of large VMware installation (larger than 5 hosts/50 VMs). Sure, you can try running MS SQL, but if anything goes wrong you're on your own in every way, the vendor isn't going to care and your boss certainly isn't going to understand why you didn't stick with vendor-supported systems. I understand that's not important to you, because you do everything in public virtualization offerings, but not everyone is that privileged. So have your doubts, but Oracle is a simple requirement for a large number of people.

Now, that's not to say that many, or even any, of those people are happy with using Oracle. But not many of us are going to put our job on the line because we hate the Oracle installer with a passion. There are far worse applications out there, ones that you use regularly instead of just during one-time installs, if you're going to find things to risk your job for.

I have a nasty habit of doing dirty things with generics and dynamic types, and writing a ton of lambdas in my C#, to the point where I should actually start writing F#. Writing C# in the .NET 2.0-era style isn't giving me anything in productivity terms, and I've re-prioritized my appetite for platform-agnostic code. Shame, because I really liked what the MvvmCross guys did in terms of x-platform development.

Xamarin has a lot of promise, but they over-hyped their product.

I want to believe

I'm not familiar with the restrictions placed on the iOS/Android versions, but they have alphas using Mono 3 that will be finalized around the time of the Evolve conference in mid-April. That gives you capabilities far closer to .NET 4.5 than Mono 2 did (async/await, mainly).

(There are a few good reasons to use it, but not many relative to the number of Oracle users, if you get my drift.)

I think you lead a bit of a sheltered life at work.

I make technical decisions and aggressively avoid stuff that looks like it will ruin my day in the future. (Of course, the legacy shit at this company is on the Microsoft stack, which might be even worse...)

And, sure, there is software that requires Oracle--but I would bet money that the number of people who need it, plus the number of people who have the sort of super-massive RDBMSes that benefit from running Oracle, is probably kind of small compared to the number of Oracle-by-way-of-legacy-bullshit users. No proof, but a hunch.

LordHunter317 wrote:

Blacken00100 wrote:

with their only means of escape being the use of a standard GNU system

I make technical decisions and aggressively avoid stuff that looks like it will ruin my day in the future. (Of course, the legacy shit at this company is on the Microsoft stack, which might be even worse...)

I make technical decisions like that, too, but we have the weight of decades of legacy stuff to support on our backs. My decision to use tool X today means it might be in production in 2 years, and there will be at least a year - possibly more - of overlap until whatever it is replacing is gone. If ever. We're still supporting some servers that went EOL and EOS 5+ years ago because of paying customers who won't get off them. According to the bigwigs, we're not going to give up 5 years of income because of a "mere" issue like lack of support. And it's kinda hard to argue with them, because it's cost us maybe one trip to eBay for a few hundred dollars in all that time. Yay, Solaris 7!

On the other hand, I've been seriously thinking of hiring that guy with the chainsaw from the "What I learned Today" thread in the server room to come "fix" my servers once and for all.