I don't think nitpicking, saying "this is just a slider, you can't patent it" or "MS copied all these concepts from Xerox and the Apple Lisa", is rational in this situation, because this patent is not about those subjects. Just look at the bigger picture: Corel copied the Office suite apps almost pixel by pixel. I would be pretty angry after seeing these two screenshots:

This seems like an entirely reasonable design patent to me. There are lots of ways of drawing such a slider; the patent does not cover any function, merely the ornamental design.

Now, the idea that this should entitle Microsoft to all of Corel's profits for the entire product is clearly absurd; as the article points out, that's the current legal precedent but is being appealed. Lawyers are hardly going to not take advantage of precedents which favour them; nor would it even be good if they did -- the fastest way to overturn bad law is to apply it strictly and make obvious its failings.

"If Corel is found to infringe even one of Microsoft's design patents through even the smallest part of Corel Home Office, current Federal Circuit law entitles Microsoft to all of Corel's profits for the entire product. Not the profits that can be attributed to the design. Not the value that the design adds to a product. All of the profit from Corel Home Office."

With such a crap "design patent" on a generic slider, is Microsoft trying to extinguish another Office competitor?

"For example, Samsung explains that under the Federal Circuit's ruling, profits on an entire car (or even an eighteen-wheel tractor trailer) must be awarded based on an undetachable infringing cup-holder."

At some point companies are just going to stop doing business in the United States. I get that it's the world's largest economy, but little by little this nonsense will fix that.

The purpose of a patent is to allow the inventor to openly share their idea without fear of it being stolen. Inventors are also allowed and encouraged to profit financially from their idea so they can continue inventing.

A silly patent (even by design-patent standards), but the complaint illustrates that this patent is a pretty small part of a [smallish] thicket of patents on the ribbon concept, as well as Corel pretty brazenly trying to capitalize on the Office UI (whether that is a legit claim or not is up for debate) back when MS was pushing it (and of course MS allegedly meeting with Corel to "resolve" it after finding out). My personal opinion is that it may have been a compelling claim in 2007/2008, but now it just gives you that slimy feeling again. Another good fact (probably irrelevant) is that Microsoft is now (and has been) encouraging people to use the ribbon metaphor (as far as I can tell).

Putting aside whether Microsoft's design was actually new and not obvious in 2006 (when Microsoft filed its application)

UI sliders were certainly not novel in 2006. And I'd be surprised if Corel did anything other than use the stock slider in Microsoft's UI library. Maybe they implemented their own and it looks too much like Microsoft's?

>Putting aside whether Microsoft's design was actually new and not obvious in 2006 (when Microsoft filed its application), whether Microsoft needed the patent incentive in order to come up with this design

Honestly as stupid as this (and many others) are, it's the fault of the patent system, not the companies. The companies are reacting rationally to the system as it's been created, i.e. patent everything possible so you have more patents covering more things than your competitors. The only reason patent troll companies exist is because some smart investors realized you didn't need to do the risky bit, make products, to hold and enforce patents.

The system needs to be fixed, and then shit like this will go away naturally.

I get how we all intuitively want the PTO to not issue "stupid patents", but is there an objective, mechanical way to determine what is stupid? Do we care more about false positives (as in this case) or false negatives (a hypothetical alternative universe where too few, rather than too many, patents are awarded)?

Now, before you start arguing against patents entirely, what I really want to know is: without patents, how are individuals supposed to profit from their inventiveness when a larger entity could trivially copy the idea and profit based on its superior connections and capitalization? And yes, I'm talking about software. Are software inventors just supposed to give away their inventions as open source and feed themselves by working for someone else who a) got the ask, and b) defended their IP?

I'm anxious to see what angle the Microsoft supporters use to justify this truly stupid patent. They proclaimed Microsoft had changed. LOL, Microsoft is still the same old Microsoft. They're still using old, prior-art-ridden patents to shake down companies and either make them sign ridiculous patent licensing terms or take them to court in a battle of attrition.

I've been using Wagon (wagonhq.com) lately and have been really impressed by it. I enjoy having a universal interface regardless of whether I'm connecting to postgres or mySQL. It's awesome to be able to visualize the results of a query with a couple clicks. There are a lot of little details like auto-complete which make it more fun to use than other editors I've tried in the past. I'm eager to try SQLTabs too and see how it compares.

Looks cool. I've tried most clients and been disappointed. PgAdmin was very unstable on Mac; the object browser is good when it works, but the query editor is poor. Postico is pretty but has very limited functionality, not suitable for large complex scripts. The best solution I found was to use Sublime Text: with its build system you can run scripts and get results easily, it has amazing text manipulation features, snippets are powerful, and it's rock solid and blazingly fast. http://blog.code4hire.com/2014/04/sublime-text-psql-build-sy...
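A psql build system along these lines is only a few lines of JSON saved as a `.sublime-build` file in your `Packages/User` folder. The host and database names below are placeholders, and the linked post has the author's actual setup:

```json
{
    "cmd": ["psql", "-h", "localhost", "-d", "mydb", "-f", "$file"],
    "selector": "source.sql",
    "file_regex": "psql:(.*?):([0-9]+)"
}
```

With that in place, Ctrl+B (Cmd+B on Mac) runs the current `.sql` file and shows the results in the build output panel; `file_regex` lets you jump to the line of a reported error.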

Hell yes! Intel chips are about to get exciting again. SGI put FPGA's on nodes connected to its NUMA interconnect with great results. Intel will likely put it on its network on chip with more bandwidth and integration while pushing latency down further. 90's era tools that automatically partitioned an app between a CPU and FPGA can be revived now once Intel knocks out those obstacles that held them back.

Combine that with OSS developments by Clifford Wolf and Synflow in synthesis that can be connected to OSS FPGA tools to show even more potential here. Exciting time in HW field.

Economic historian Niall Ferguson beautifully describes fractional reserves as one of the greatest innovations of humankind in Ascent of Money.

The miseries from bank runs and crashes are real, but even so, the total impact on human welfare from fractional reserves has been overwhelmingly positive.

How do we know?

There were holdouts in Europe originally, some central banks were slow to adopt fractional reserves. They were trounced by the economies of other European nations and their superior access to liquid capital.

We've essentially done the laboratory tests here, it's one of the few areas where econ plays like a hard science. We know the answer here, and it's not 100% reserve requirements.

It's a tragic topic to put to a referendum though, because I don't expect every random person to be an expert on the technical nuances of financial history. And this is a topic where intuitions are a pretty poor guide, one where I'd expect the wisdom of crowds to fail hard. PhantomGremlin's quote from George Bailey is a good example. Most people aren't bankers, and don't really understand how it works as a system, beyond their own account, so it's weird to ask them to make key decisions here.

There is a website [0] and book [1] on this topic, "Up until about forty years ago a good percentage of the money in circulation was produced by the Bank of England and the seignorage went to the Treasury. Today 97% of all the money in circulation is created as debt by the banks and the seignorage profit goes to them. The result is that between 2000 and 2009 the state has foregone a trillion pounds. How many public services could that have funded? ... Modernising Money shows how a UK law implemented in 1844 can be updated and combined with reform proposals from the Great Depression, to provide the UK with a stable monetary and banking system, much lower levels of personal and national debt, and a thriving economy."

A survey [2] was done as part of a master's thesis at Zurich University, which found that, "Only 13 percent know that private commercial banks provide the majority of the money in circulation. However, 78 percent of the Swiss population would like money to be produced and distributed solely by a public organisation working for the common good, such as the National Bank. Only 4 percent preferred the system we actually have today, in which money is mostly created by private, for-profit companies such as commercial banks."

I don't really understand how this isn't a proposal to effectively end lending.

If banks are required to hold 100% reserves against their deposits, where exactly are they supposed to get money for lending?

The article says "they'll only be able to lend money that they have from savers or other banks", but that seems inconsistent with the 100% reserve requirement. If I'm required to hold on to 100% of deposits, I can't lend them out.

Can someone provide an explanation of how this works or provide a reference with an overview? The article is light on details. I assume this means preventing banks extending credit against deposits that don't necessarily exist?

It is important to distinguish between what we call "creating money", which is really fraudulently introducing multiple claims on the same monetary unit at the same point in time (and which remain unexercised until a panic), and loaning money, which can be done perfectly safely so long as the money being loaned is available for the duration of the loan.

So, when a bank issues an honest loan for a million dollars for 10 years, it needs to have 1 million dollars pledged to it by savers for at least ten years (in the form of CDs or what have you.) This money may be spent in the economy and thereby returned to the bank as a deposit of some duration, which can be loaned against again safely, so long as it is reloaned under the same constraint: the bank must ensure that any loans made against the re-deposited money are done in a shorter term than the deposit is for. You can have very rapid loan growth in this manner, as time horizons expand during growth phases, and suffer none of the issues with multiple parties claiming the same monetary unit at the same time, commonly called bank runs.

Money can be safely "fractionally reserved" so long as intentional duration mismatch (which is really fraud) is not allowed. That's the core issue in the banking system. The production of the underlying money is a separate question, but I'm less and less convinced it matters all that much, as long as it isn't insane.
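The duration-matching constraint described above can be sketched in a few lines. The `Deposit` and `Loan` types here are hypothetical, purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class Deposit:
    amount: float
    term_years: int   # how long the saver has pledged the money

@dataclass
class Loan:
    amount: float
    term_years: int

def can_fund(deposit: Deposit, loan: Loan) -> bool:
    """Under the constraint described above, a loan is 'honest' only if it
    is fully covered by money pledged for at least as long as the loan runs."""
    return (loan.amount <= deposit.amount
            and loan.term_years <= deposit.term_years)

# A 10-year, $1M loan backed by a 10-year, $1M CD is fine...
print(can_fund(Deposit(1_000_000, 10), Loan(1_000_000, 10)))  # True
# ...but funding it with a short-term, demand-like deposit is the
# duration mismatch the comment calls fraud.
print(can_fund(Deposit(1_000_000, 1), Loan(1_000_000, 10)))   # False
```

Re-lending re-deposited money stays safe under the same rule: each new loan's term must fit inside its funding deposit's remaining term.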

If I have a windows machine or VM, I simply don't run anti-virus. There's no point. At Kiwicon last year, some French researcher showed how most anti-virus scanners were so badly written, he could exploit their scanning engines with basic malformed PDFs and JPEGs. Most of those scanners run as the SYSTEM user, so you basically can control a system with a PDF.

...but I hesitate to tell non-developers to uninstall their anti-virus. I don't want to be responsible for them getting exploited, but I usually do tell them why I don't run anti-virus and that the choice is up to them.

I always emphasize the biggest thing you need to do as far as security goes is to run all updates. Never skip or delay updates. The moment Chrome/FF wants you to restart, you restart them. Run Windows update (even though Windows 10 is another beast/debate entirely, if you chose to run it, you should run updates).

At one point it looked like Microsoft was going to kill the scammy Windows "security" industry by releasing their own anti-virus. But then they backed off, and now MSE seems to be purposefully curtailed.

I remember using AVG many years ago when it was a decent product. I recently had the displeasure of having to install it again. AVG Free right now is malware, plain and simple. It hijacks your home page in every browser, changes your search page, and silently installs an extension. And if you go and switch the home page back, it shows you a popup asking you to set it back to AVG. This is pure malware behavior.

I'm primarily a Windows user on the desktop/laptop side (though I do also use a lot of Linux/Unix/embedded systems) and my advice to everyone who asks (as the token 'IT advice guy' to lots of friends and family) is just don't install anti-virus software. Modern Windows is better off without it. As far back as XP the best option was to install Microsoft's own Windows Defender and uninstall everything else, now just use what the OS already comes with.

Microsoft's goal with virus elimination is to make Windows work better; 3rd party vendors' goal with virus elimination is to upsell you on a lot of crap you don't need. It isn't difficult to see why the 3rd party stuff is all a bunch of crap that floods you with false positives while bogging your system down in an attempt to seem like it is doing something useful.

Yes, there are occasionally exceptions to the rule, but they all eventually follow a logical progression from useful lightweight tool to bloated piece of shit that is worse than most viruses they could possibly save you from.

I'm currently taking the Coursera course "The Global Financial Crisis" [1] (from Yale) and quite enjoying it. Their answer to what caused the crisis is bubble thinking from all involved, not moral hazard on sub-prime loans or government failure. They back their findings up with various studies, which seem convincing to me (but I'm a beginner, so don't take my word for it).

Mathematically speaking, though, Michael Burry's success may have had more to do with luck than skill:

1. Nowadays, options are priced in such a manner that the expected value of buying OTM hedges is almost always negative. This is due to the inflated implied volatility for out-of-the-money put options and other factors.

2. It just so happened that the subprime mortgage meltdown coincided with his career; for the period between 1935 and 2007, his strategy would not have worked. It also would not have worked between 2008 and the present. (Maybe it would have worked in 1987, so we're talking three instances out of a century.) Right now, the evidence suggests banks have reformed their lending practices, with higher credit scores for new homeowners, so the odds of another financial meltdown are slim.

3. The Fed's balance sheet is big, but interest rates are very low.
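Point 1 above can be illustrated with a toy Monte Carlo: pay for a put quoted at an inflated implied volatility, then simulate the payoff at a lower realized volatility. All the numbers here (spot 100, strike 80, 30% implied vs. 20% realized vol) are made up for illustration:

```python
import math
import random

def norm_cdf(x: float) -> float:
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def bs_put(S, K, T, r, sigma):
    """Black-Scholes European put price."""
    d1 = (math.log(S / K) + (r + sigma ** 2 / 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return K * math.exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)

def expected_pnl(S=100, K=80, T=1.0, r=0.0,
                 implied_vol=0.30, realized_vol=0.20, n=200_000):
    """Buy at the market price (inflated implied vol), collect the
    average payoff under the lower realized vol."""
    random.seed(0)
    premium = bs_put(S, K, T, r, implied_vol)
    total_payoff = 0.0
    for _ in range(n):
        z = random.gauss(0, 1)
        ST = S * math.exp((r - realized_vol ** 2 / 2) * T
                          + realized_vol * math.sqrt(T) * z)
        total_payoff += max(K - ST, 0)
    return total_payoff / n - premium

# Negative on average: the OTM hedge bleeds money in normal years.
print(expected_pnl() < 0)
```

When realized volatility stays below what was implied at purchase, the buyer systematically overpays; the trade only wins in the rare year the tail event actually arrives.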

The Fed has posted a large profit, a 30% gain in 2015 for a profit of over $100 billion, which is sent back to the Treasury.

"Short-term interest rates would have to rise rapidly to quite high levels, in the neighborhood of 7%, for the Fed's interest expenses to surpass its interest income. Such an outcome appears very unlikely," the paper said. In the event that the Fed did face a loss, it could simply hand no money back to the Treasury and, in the most extreme case, future remittances would also be reduced (and recorded as a change in deferred credit), but the Fed's capital base and financial position still would remain completely secure.

Japan has a much bigger debt, and they seem to be doing fine. Low interest rates and a strong dollar are due to reserve currency status, flight to safety, emerging market weakness, commodity weakness, the petrodollar, the large size of the US economy, and other factors.

So what's the next crisis? I actually read the article, and it does not really expound upon any sort of next crisis. I guess the crisis is that interest rates are so low? Awesome click-bait, apparently.

"If The Big Short, Adam McKay's adaptation of Michael Lewis's book about the 2008 financial crisis and the subject of last month's Vulture cover story, got you all worked up over the holidays, you're probably wondering what Michael Burry, the economic soothsayer portrayed by Christian Bale who's always just a few steps ahead of everyone else, is up to these days"

"The next crash looks to be housing-related. Fannie Mae is in trouble. But not because of their accounting irregularities. The problem is more fundamental. They borrow short, lend long, and paper over the resulting interest rate risk with derivatives. In a credit crunch, the counterparties will be squeezed hard. The numbers are huge. And there's no public record of who those counterparties are.

Derivatives allow the creation of securities with a low probability of loss coupled with a very high but unlikely loss. When unlikely events are uncorrelated, as with domestic fire insurance, this is a viable model. When unlikely events are correlated, as with interest rate risk, everything breaks at once. Remember "portfolio insurance"? Same problem.

Mortgage financing is so tied to public policy that predictions based on fundamentals are not possible. All we can do is to point out that huge stresses are accumulating in that sector. At some point, as interest rates increase, something will break in a big way. The result may look like the 1980s S&L debacle."

In 2006 I wrote:

"Interest-only loans as a percentage of new loans. (Graph) (Data from Loan performance)

People with these loans are not homeowners. They're renters, with an option to buy. One interest-rate spike and they're out on the street."

It took longer for this to crash than I expected, mostly because the Fed pushed out cheap money to delay the crash. I was expecting an interest rate spike, but that didn't happen. I also didn't expect the thing to cascade so badly; I thought the problem was going to be comparable to the 1980s S&L mess, not worse.

A key number to watch is the ratio of median house price to median income. Historically, that runs around 2.7. Above 3, trouble begins; people can't make their house payments. In the last bubble, it passed 4 for the US nationwide, and 10 for California. For that graph, see [1]. Click on "Price to Income". Look at the line for "United States". It's about 3.3 now; it hit about 4.1 before the last crash, then dropped to 3.0. Look at the graphs for California cities; they're much higher. But not as high as last time, yet. Keep watching those numbers; they're a leading indicator of a housing bubble/crash.
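The rule of thumb above is easy to encode. The thresholds and figures below come straight from the comment and are approximate, illustrative values:

```python
def price_to_income(median_price: float, median_income: float) -> float:
    """The ratio the comment tracks as a leading bubble indicator."""
    return median_price / median_income

def bubble_signal(ratio: float) -> str:
    # Thresholds from the comment: ~2.7 is the historical norm,
    # above 3 trouble begins, ~4+ preceded the last crash.
    if ratio < 3.0:
        return "near historical norm"
    if ratio < 4.0:
        return "elevated: trouble begins"
    return "bubble territory"

# Illustrative figures: the comment puts the US at ~3.3 today
# versus ~4.1 before the last crash.
print(bubble_signal(price_to_income(330_000, 100_000)))  # elevated: trouble begins
print(bubble_signal(price_to_income(410_000, 100_000)))  # bubble territory
```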

The problem with speculating on this is market timing. You can see the stresses building up, but it's hard to tell when the system will break. Those way-out-of-the-money option strategies hit this - you lose money every year as you wait for the big event. Burry almost ran out of time and money that way. Taleb, the "black swan" guy, doesn't release the results for his Empirica hedge fund for years other than the one year he hit the jackpot. His people got really upset when someone leaked them, they made it into Wikipedia, and he didn't look so good.

(On other fronts, I expected peak oil to be a bigger problem than it was; fracking fixed that, for now. And I thought Bitcoin would collapse long ago; instead, its use case for getting money out of China made it valuable.)

There are also diffs adding lambda support, tweaking various classes for compatibility with applications that use reflection to access internal capabilities, and fixing lots of OpenJDK compatibility bugs.

Android still needs to run dex bytecode somehow, so there are two possibilities for how N will work.

Option one is that Android sticks with ART and replaces Harmony with OpenJDK: from a technical perspective, that wouldn't be the end of the world, especially since the Harmony implementation is rather inefficient.

Option two is that Google ports Hotspot to run on Android and then has PackageManager convert dex bytecode back to Java bytecode on device. That would be awful, since ART is built for low-end devices and, well, Hotspot isn't.

In favor of option one is that Google is still developing ART. In favor of option two is Oracle being Satan incarnate.

I also wouldn't be surprised if Oracle has compelled Google to simply ship a copy of Hotspot, allowing developers to ship "authentic Java" APKs instead of dex-bytecode ones, with the two environments running in parallel, with two different zygotes.

To explain, if you are just joining in: this pretty much means Oracle v Google, a case with major ramifications for the industry, has been settled out of court. I don't see how this can be interpreted any other way.

The commit references ojluni. This relates to "luni" in the Android source, which stands for lang, util, net, io. Sounds like there are plans to replace the Harmony implementation with the OpenJDK one. License differences are definitely interesting, but it could also be related to performance and completeness; luni is a fairly small set of classes, whereas the ojluni import brings in a ton more.

But what I don't understand is why they're importing the full AWT API! That's nuts.

In the context of the recent juniper attack where some unauthorized code was committed without anybody noticing for years, it seems like it would be easy to hide a backdoor in such a big commit.

How do you go about checking the integrity of the code when you have so many files?

8902 files were changed, most of them added, and the commit says it's just importing OpenJDK files. Is there anybody checking that the imported source files haven't been modified to include some kind of backdoor?
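One mechanical way to audit such an import, assuming you have the pristine upstream OpenJDK drop to compare against, is to hash both trees and list every imported file that differs. This is a sketch only; the workflow is hypothetical, not what Google actually does:

```python
import hashlib
from pathlib import Path

def tree_digest(root: str) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    digests = {}
    root_path = Path(root)
    for path in sorted(root_path.rglob("*")):
        if path.is_file():
            rel = str(path.relative_to(root_path))
            digests[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
    return digests

def diff_trees(upstream_root: str, imported_root: str) -> list:
    """List files present in both trees whose contents differ from upstream.

    Anything in this list was touched after the import and deserves a
    line-by-line review; identical files can be trusted as far as the
    upstream drop itself can be trusted.
    """
    upstream = tree_digest(upstream_root)
    imported = tree_digest(imported_root)
    return sorted(p for p in imported
                  if p in upstream and upstream[p] != imported[p])
```

This shrinks the review from thousands of files to only the deliberately modified ones; it obviously cannot catch a backdoor already present upstream.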

It's exciting to see all the goodies from Java Sound finally coming in. I could care less about CORBA, though; I'm just glad we get better support for audio coding from Java without having to resort to JNI and the native layer in C/C++.

This isn't a great representation of the issue. Everyone loves to hate the drug companies.

First of all, the basic science is lacking. The vast majority of animal models in no way resemble the normal process by which humans develop cancer. The models are designed to grow aggressive cancers extremely fast so you can test drugs rapidly, publish your paper and get the next grant. Even more than that, what actually takes place in the years before a cancer becomes clinically relevant is quite speculative. So a long term commitment to funding prevention research is needed from governments and funding bodies, because pharma can't fund this kind of basic science.

The other problem is that to do a good prevention trial, you need to identify people at risk. We aren't really that good at doing this (e.g. we screen ALL women above the age of 50 for breast cancer, and we screen EVERYONE above the age of 50 for bowel cancer). There is some debate about whether screening in this manner is actually as worthwhile as it has been made out to be, because it leads to overdiagnosis without a survival benefit. So you have to be careful not to design an intervention that only prevents cancer in those who weren't going to be affected by it anyway. This is why you can't use surrogate endpoints: at the end of the day, for prevention, mortality is what matters. Are we going to give hundreds of millions of women drug X and make them suffer niggling side effects for 20 years because it reduces the rate of diagnosis of breast cancer? Definitely not. Overall mortality is what matters (and quality of life).

It's not even a given that you can actually identify people at risk - some researchers think cancer is just bad luck (ie it is not predictable at all, or the predicting factors are unknowable). Even if this is wrong however, cancer risk probably involves weak effects from hundreds or even thousands of factors. We are very far from working all of this out.

A drug is probably not what we want. Actually a drug is really not what we want. They are too expensive and too much effort to distribute and monitor. Drugs also always produce health inequalities within and between nations. A holistic approach to improving our health and well being would be preferable, at least incorporating nutrition, exercise and psychology. Unfortunately the nutritional, exercise and psychological sciences are not hitting home runs in that regard, which is not surprising because they miss out on the billions we spend fighting an incredibly wasteful war on cancer.

It is hundreds of different diseases related to mutations in the function of cell division. Cells that don't need to divide start dividing, but the daughter cells are not the same as the parent.

In order to treat cancer with a drug, you need to identify which of the hundreds of subtypes it is and give the right drug. Either that, or give people cocktails of lots of drugs and hope that you can identify the right one before the other ones kill the patient.

At this point there do not appear to be any magic bullets that will cure cancer, just lots of work and research to chip away at the problem, one subtype or one patient at a time.

And cancer appears to be inherent in what we are, animals. It seems unlikely that we will be able to change what we are sufficiently to avoid cancer. No vaccines are likely.

On the other hand, all the investment in cancer research is paying off and is chipping away at this family of diseases. There are lots of reasons, not just new drugs, but some new drugs are part of the solution.

Maybe the human race is at a point where we have solved all the easy problems and now have a hard slog to chip away at the rest of them, bit by bit.

A drug to treat cancer only has to be taken by people diagnosed with that cancer, for as long as they're diagnosed with it. A drug to prevent cancer has to be taken by everyone at risk for the cancer (which might be the entire population), for as long as they're at risk (likely decades). Hence, a treatment drug can be expensive and can have some nasty side-effects, while a prevention drug would have to be cheap with minimal side-effects. That's a much tougher constraint to work under.

Well, preventing cancer has been a priority for evolution since the first multicellular organisms arose. It's just not that easy.

Developing a drug is an economically difficult thing: Spend billions for the chance to win billions. The only alternatives to big pharma in developing these drugs are big government and big charity (Gates' Foundation et al).

"There's more money to be made investing in drugs that will extend cancer patients' lives by a few months than in drugs that would prevent cancer in the first place.

That's one of the findings from the work of Heidi Williams, an M.I.T. economics professor and recent MacArthur Foundation genius grant winner, who studied the problem along with Eric Budish, a University of Chicago economics professor, and Ben Roin, assistant professor of technological innovation, entrepreneurship and strategic management at M.I.T."

I'm damn near 40, and the phrase "there's no money in a cure" has been around as long as I can remember. Heck, Chris Rock did a whole bit about it. [0]

Is this just putting good science behind that phrase? Is there anything new here?

The same, often perverse, incentives operate on the potential treatment of aging. Lots of work on tinkering with end states, meaning age-related disease in advanced stages, and next to nothing on prevention, meaning periodic repair of the molecular damage that causes aging.

That makes it work well for reliability and data durability. I have never had it corrupt data over the years. You can always pull the plug on the server (or kill -9 the process) knowing it will recover to a consistent state, or you can take a filesystem snapshot at any time and get a consistent database image (I've used that for backups).

"Haldeman, Joe. The Forever War. 1974, Ballantine. An interstellar war is fought using black holes for travel between battles."

I've read Forever War half a dozen times. Black holes aren't a focus of the novel, but rather the effects of relativity on soldiers fighting far flung battles. By the time soldiers reach a battle, everyone they know on Earth is dead and several generations have passed due to time dilation.

"More or less accurate science", with rather significant error towards "less" in many cases.

Now, don't get me wrong; I love SF, and will happily suspend disbelief for wildly implausible fictional technologies (as long as an author doesn't then go on to flagrantly violate the physics of their own universe out of sloppiness or as a plot device).

That said, for example, I'm not entirely sure what part of current science suggests that flinging a spaceship at a black hole ("collapsar") at relativistic velocities will cause said vessel to pop out of another black hole, near-instantaneously, elsewhere in the galaxy. ("The Forever War", as much as I love you, I'm looking squarely at you.)

I happened across this book, which seems to be little heard of and really enjoyed it. Premise is great and the science seems solid for the most part. Author seems like he tried hard to stick with things that are not denied by the laws of physics as we know it.

So the past week and a half I have been on holiday back in Italy (I live in Germany). At least here in the South, your status still greatly depends on 1) whether you managed to raise a family (single 40-year-old ultra-rich corporate managers don't get much credit); 2) whether you have a good network of friends (and people here don't look down on blue-collar workers). My father was a blue-collar worker and he gets a lot of respect for doing well at 1 and 2.

While this is not ideal (I think 1 and 2 are part of the reason many Italians lack the ambition Americans have), it's a very good way to live a very decent life as an individual. At a macro level I believe pushing on 1 and 2 will kill your economy; on a micro level (i.e. if I were a shrink), I would strongly advise them.

P.S. Of course, 1 and 2 work well in a world where you have the USA, where people do kill themselves to deliver loads of innovation to the whole planet (see: internet, space exploration, etc.).

P.P.S. Of course the USA works well because it attracted talent from across the world to deliver loads of innovation (see: immigrants).

If someone leaves high school "not suited" for academics, then their high school has failed them. No student should feel they are in any way 'not the type' for higher education. No 16/17/18-year-old should be so pigeonholed.

How many of these kids are making these decisions based on third-hand accounts of what higher education actually entails, without even sitting in on a post-high-school lecture? How many are told not to bother simply because they don't look/act/sound the part?

Given the state of high school education and the lack of basic literacy, nearly everyone would do well to attend some sort of 'academic' post-secondary education. Reading books, listening to lectures and writing about what you have learned teaches the basic communication skills useful in every field.

At least in Europe there is, and has been for years, a chronic shortage of tradesmen (alleviated by migration). You will not earn as much as a doctor, but the jobs are there for those who want them. I think the complaints stem from the fact that there are simply not that many labourer jobs (and that's a good thing).

I attempted to find a company I know, and six searches later I still couldn't find it. In the first two attempts I had part of the name wrong, i.e. "X Resources" instead of "X Energy", but I corrected that, continued, and still couldn't find it even though it came back with dozens of results.

Google got it on the first attempt even using the wrong name.

Too much data with poor search algorithms can be worse than a phonebook.

A great resource in that I suspect much of this data was otherwise buried deep in some other database...but is also an example of how essential filtering/ranking algorithms are for search, at least for general user friendliness.

For example, a search for Apple, even when limited to California, brings up a ton of junk that the user has to filter out visually:

For me the most interesting thing here is the rejection of direct compilation to native code via LLVM. This team has concluded that the safest and most practical way to target Apple's platform is using one of Apple's compilers (in this case, clang).

I'm not sure if I agree with that conclusion. After all, RubyMotion and the Elements Compiler (www.elementscompiler.com) have kept up with all the new compiler requirements from Apple, including the requirement to emit bitcode for watchOS and tvOS. So clearly it's possible. And using LLVM directly, rather than using C as an intermediate representation, yields more control over debug information. Still, it's possible that at the next WWDC, or even next September, Apple will drop something new that will force us to use one of their compilers.

...and they won't be the last, as more and more later-stage "startups" begin to show they won't ever make the returns their valuations would suggest, and start begging for cash as they burn through their existing investments and meagre revenue.

>The North Sea is small, but its cultural influence spread from Dublin in the west to Poland in the east, says author.

Well, even further east: Kievan Rus', the starting point of the Russian state, was founded by Vikings as a kind of forward operating base for their dealings with the Black Sea and the Eastern Mediterranean. Before the Vikings, it was just some Slavic tribes in the forest; under the Vikings it became a small state making raids south on Constantinople and to the Caspian Sea/Iran. Basically it was put on the map of Europe, and the rest is history :)

Anyway, the Vikings and the Caliphates (as well as the Roman Empire before them) are great examples of how climate changes affect, sometimes to the point of really driving, history.

What's the open-source driver situation like for these things? Is it worth supporting Nvidia by buying one, or are they likely to crack down on developers to keep the thing closed? It is an awfully interesting device, but I'd hate to buy one and find I could only really use it as an Android device, especially after reading reviews about its lackluster functionality.

To dovetail off another HN thread (Free Springer math books)... after reading Bostock's piece, I suddenly realized that the classic Grammar of Graphics by Leland Wilkinson, which he quotes, might be free... and it is! The 1999 version, anyway:

I personally make the better part of my living off being able to do custom D3 visualizations with some degree of speed and flexibility, so thank you Mike for your continued contributions and making it free for the rest of us. I'm seriously looking forward to D3v4 ;)

What you see is a set of charts built with D3 under the hood and placed on a grid layout, with a mix of configuration settings and basic control structures. What's unusual about this approach is that it introduces end users to programmable visualization through a simplified DSL. It helps users step outside the GUI-editor sandbox, yet it doesn't expose them directly to JavaScript and SVG.

I'd appreciate your critique of this approach in general, not necessarily our implementation of it :) Is this level of "tyranny" acceptable in your view?

1. There is no great JavaScript library for manipulating data, like pandas for Python. Are you interested in starting one as a d3 module? It could benefit from d3 being a standard and from your experience writing JavaScript code.

2. Do you regret the original .enter()/.exit() functions? They were powerful, but they were a barrier to entry because they were hard to grasp. Do you wish you'd gone fully functional from the start?

3. How do you get paid? Are you sponsored? Are you looking for sponsors?
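For readers unfamiliar with the pattern asked about in question 2: d3's .enter()/.exit() expose a key-based data join between already-bound data and a new data array. Here is a minimal plain-JavaScript sketch of that concept (the `dataJoin` helper is hypothetical and not d3's actual implementation, which operates on DOM selections):

```javascript
// Sketch of the data-join idea behind d3's selection.data()/.enter()/.exit().
// Given the keys currently bound to nodes and a new data array, partition the
// new data into: enter (no existing node), update (node already exists), and
// exit (keys whose data is gone, so the node should be removed).
function dataJoin(boundKeys, data, key) {
  const newKeys = new Set(data.map(key));
  const old = new Set(boundKeys);
  return {
    enter: data.filter(d => !old.has(key(d))),    // needs a new node
    update: data.filter(d => old.has(key(d))),    // rebind to existing node
    exit: boundKeys.filter(k => !newKeys.has(k))  // node to remove
  };
}
```

For example, with nodes bound to keys ['a', 'b'] and new data [{id: 'b'}, {id: 'c'}] keyed by `d.id`, the join yields enter = [{id: 'c'}], update = [{id: 'b'}], and exit = ['a'], which is exactly the mental model the enter/exit API asks you to hold.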

1. Genuine candidates: to quote the extreme, people like Satya Nadella, Sundar Pichai, and Vinod Khosla came to the US on H1Bs, and there are plenty of other "superstars" who are on H1Bs or are naturalized citizens. More practically, there are thousands of other talented people from all countries who work on H1Bs. So the argument that the H1B should be done away with is absurd, unless the US doesn't care about staying atop the technology sector. Also, H1B workers get paid the same as local workers; there is a minimum-wage requirement that has to be satisfied by the company applying for the H1B. So if you buy the allegation that wages are being depressed by H1B workers, why don't you take a peek at the bulletin board in your break room (which has H1B applications with salaries posted) or look online, and lodge a complaint with the state department of labor, rather than posting bigoted comments?

2. Candidates hired by "staffing" companies: these are blanket applications made by "staffing" companies that may or may not have actual jobs behind them. These are the applications that need to be stopped or scrutinized by USCIS (it did tighten this during the 2008-2010 recession). It's arguable that there is a genuine need for staffing companies, but these applications should be scrutinized thoroughly.

Yet the article conveniently uses cherry-picked data to argue against the H1B. The majority of H1Bs are being awarded in the above categories, not for electrical engineering.

The simple explanation is that it's cheaper to manufacture components in Taiwan, Korea, and China. The H1B is not causing electrical engineers to lose jobs; the fundamental economics of production are stacked against the occupation.

In response to a comment claiming outsourcing was the issue, I wrote the following and thought it was relevant to post here:

FWIW, the cap on H-1Bs, plus the decreasing attraction of coming to the US, is the very reason ALL companies are going overseas.

I once did research on outsourcing, and the pivotal reason I found for it was not the "cheapness" of foreign markets but the decreasing flexibility and opportunity to continue development at home, a decrease driven mainly by the inability to find people to do the work. It may seem hard to believe, but it isn't once you add up all the pieces. While factory workers may seem abundant and easy to find, finding upper-level factory operators across disciplines from IT to management to operations was very difficult. Conversely, the two places where they were easiest to find were China and India, in that order. After all, who wants to work in a factory when they grow up, anyway?

So what do companies do? They go above and beyond, and bend over backwards, to make overseas operations feasible. This problem is not limited to factory workers or electrical engineering; it affects any industry that has a hard time finding good-quality workers. My industry, civil engineering, is probably going next.

I have been an EE for 20 years. I think one of the things driving this is the increase in productivity: I can do a lot more with my RF/microwave design software than I could 20 years ago, so I am now probably doing the work of two people.

There is also consolidation alongside complexity. How many 5G chipsets do you need when 95% of the phone market is held by two manufacturers? Hardware is expensive to develop and takes time; investors and companies want cheap and fast, and are unwilling to fund R&D.

People with engineering or computer science degrees are not in short supply. Exceptional engineers and programmers are. It's almost like the difference between kids who want to play football and the ones with the talent to make it.

I wonder if this is a side-effect of the current silicon-valley investment scene. There are lots of low-capital startup opportunities, so who wants to fund the high-capital startups in computer hardware? There have been very few computer hardware startups the past few years, and that might have an effect on overall innovation and growth.

I'm optimistic about this issue. I'd say software is a good occupation while everything is booming, but you are more secure during a downturn if you are an EE. Furthermore, you can train yourself to be a good software developer if you're next to a computer and have internet access (of course you need dedication), but you cannot be a good EE without knowing how to operate very expensive equipment (e.g. a PNA, MOCVD, etc.). Disclaimer: I am a software developer with a background in EE.

H1Bs should have to be paid at least 20% more than the local market average for qualified workers. If people with their qualifications are truly THAT scarce, then businesses should have to pony up more than the going rate given their specializations.

The IT/CS industry is the major consumer of H1B visas. IT/CS is also the industry that relies most heavily on outsourcing. Guess how healthy this industry is, guess which college major pays the highest salary in the USA today, and so on; consumers, meanwhile, are happiest with their laptops, iPhones, apps, and Uber.

Now compare it with the heavily regulated healthcare industry. No one likes paying hospital bills in the USA, almost everyone operates on wafer-thin margins and in constant fear of lawsuits, and only the doctors, and only a few of them at that, manage to earn a fortune, while everyone else is worse off. Far more importantly, the USA has a very inefficient and expensive healthcare system.