Posted by BeauHD on Saturday May 12, 2018 @12:16PM
from the sign-of-the-times dept.

Earlier this week, Carnegie Mellon University announced plans to offer an undergrad degree in artificial intelligence. The news may be especially attractive for students given how much tech giants have been ramping up their AI efforts in recent years, and how U.S. News & World Report ranked Carnegie Mellon University as the No. 1 graduate school for AI. An anonymous reader shares the announcement with us: Carnegie Mellon University's School of Computer Science will offer a new undergraduate degree in artificial intelligence beginning this fall, providing students with in-depth knowledge of how to transform large amounts of data into actionable decisions. SCS has created the new AI degree, the first offered by a U.S. university, in response to extraordinary technical breakthroughs in AI and the growing demand by students and employers for training that prepares people for careers in AI.

The bachelor's degree program in computer science teaches students to think broadly about methods that can accomplish a wide variety of tasks across many disciplines, said Reid Simmons, research professor of robotics and computer science and director of the new AI degree program. The bachelor's degree in AI will focus more on how complex inputs -- such as vision, language and huge databases -- are used to make decisions or enhance human capabilities, he added. AI majors will receive the same solid grounding in computer science and math courses as other computer science students. They will also take additional coursework in AI-related subjects such as statistics and probability, computational modeling, machine learning, and symbolic computation. Simmons said the program also would include a strong emphasis on ethics and social responsibility. This will include independent study opportunities in using AI for social good, such as improving transportation, health care or education.

During the dot-com boom we had a flood of computing students who graduated after the bust, resulting in a glut of overqualified people with no jobs, buried in student debt. Any time a university sets up a "fad degree," you know it's time to get out of the field. There are probably cloud and blockchain degrees as well.

I was in college after the dot com bust. Everyone and their grandparents switched from computers to healthcare. Healthcare became the new money major that guaranteed a high-paying job, if you didn't mind the ass wiping and bedpan swapping that went with it.

Look at a list of the world's biggest companies [wikipedia.org]. Seven of the top ten are tech companies, and five of those did not exist before the web. Those five have a combined value of $3 trillion. So saying that AI is going to "fail just like the web" is a bit silly.

I suspect rather than being on the "back burner," implying it will be dead, it'll just not require much investment and be "boring" but still there. There is a fad aspect to it resembling the dot-com boom (there's a commercial trying hard to show the value of voice assistant + IoT by having someone feed their pet while driving around, which is really trying hard to solve a problem that pet owners don't have), but I don't think it's so burdensome as to warrant killing off even if the fad subsides and people stop

Those, I think, illustrate the point. Of the companies that are entirely "web native" (5 of them), only 2 were around during the dot-com boom (Amazon and Google), and Google was a much less overwhelming business presence then. The other 3 web-native companies all came later.

In the fullness of time, the place of the web is clear; it permeates our world. But in the dot-com boom, the vast majority of the industry went bust: the technology had a lot of promise, but all these companies were blindly applying it.

A lot of those companies invested exclusively in a web-only presence or depended on other companies for content. Those companies went bust when the brick-and-mortar companies created their own websites.

There was a company called getgooey whose idea was to let users run overlays over another company's website. You added their plugin to your browser and everyone could just add their comments through their servers. That never took off.

Right. Similarly, companies that incorporate AI into their business will more likely still have AI 10 years from now (some of those will fail to find an application, for sure, but some of them will see success). Startups that "just do AI" are probably going to be gone within 5 years, because it's the sort of function that will be embedded into other businesses rather than being a business in and of itself.

I studied AI for my bachelor's in 2009-2013 at Utrecht University. My programme was competing with 3 other AI bachelor's programmes within a 100 km radius. It's pretty rare nowadays for a university that's reasonably developed in technical fields to NOT offer a bachelor's programme in AI.

> I studied AI for my bachelor's in 2009-2013 at Utrecht University. My programme was competing with 3 other AI bachelor's programmes within a 100 km radius. It's pretty rare nowadays for a university that's reasonably developed in technical fields to NOT offer a bachelor's programme in AI.

Back when I was an undergraduate, it was more common for AI to be offered as a "concentration" within the computer science degree program, if it was even offered at all. In those days it was really more of an area for graduate studies, because frankly the field just wasn't as developed as it is now and we didn't know as much. However, in the years since, we have seen the pace of research and discoveries in this area increase as the tools have finally become somewhat more equal to the tasks. It does not surprise me.

> Back when I was an undergraduate, it was more common for AI to be offered as a "concentration" within the computer science degree program, if it was even offered at all.

There was a Cybernetics and AI programme at CTU FEE in the 1990s (with its own department). It seemed only logical that AI would be tied to cybernetics, probably more so than to CS.

I was thinking along similar lines before I read your post. I'm about to head back to school for my masters and put some thought into which area I wanted to study. Partly, I want to retire as early as I can, which means making good money first. There is a huge demand for AI professionals, leading to high salaries. It just doesn't interest me much, though.

In my case I think it's partially because I've been on a software quality kick the last few years. If aerospace engineering was done like software engineering, planes would crash every day. It doesn't have to be like that. We can do it right, the first time. The attitude of "it seems like it pretty much worked when I tried it, let's ship it" gets on my nerves.

While AI isn't exactly "it seems like it pretty much works", it tends to lean much more in that direction than the systems I want to create, systems about which I can say "this is known to be absolutely correct; it has been mathematically proven correct".

"If we want to be serious about quality, it is time to get tired of finding bugs and start preventing their happening in the first place." - Alan Page

"Beware of bugs in the above code; I have only proved it correct, not tried it." - Donald Knuth

The whole 'provably correct code' disappeared from reality as soon as I was half a step beyond academia.

I think I get his sentiment, though: AI isn't programming so much as it is a data-science thing. This is one of the interesting challenges of the technology: the vast majority of folks with deep engagement in it are not programmers, but currently the tools require a bit of programmer sensibility.

>The whole 'provably correct code' disappeared from reality as soon as I was half a step beyond academia.

It did at one point. Maybe around 1988 or so. In the 1970s programmers were people with degrees in math, so there was a lot more correctness. As math majors, they had done plenty of mathematical proofs, so the idea of knowing that you're getting the right answer made sense to my mom's generation.

We've had a phase of "sloppy" programming for a while now, but over that time our tools have improved im

I went off on a tangent at the end there. Obviously code review doesn't guarantee or prove anything.

Other techniques CAN guarantee, or prove, certain things about the code, and it doesn't have to be time-consuming or expensive. Heck, just using a strongly typed language guarantees certain things that aren't guaranteed in languages without strong typing.
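A minimal sketch of the kind of guarantee meant here, using Python (which is strongly, if dynamically, typed); `add_cents` and the amounts are hypothetical examples, not anyone's real code. In a weakly typed language, `"100" + 1` might silently coerce to something plausible-looking; a strongly typed language refuses to guess, so the mistake surfaces immediately instead of becoming a latent bug.

```python
def add_cents(balance, deposit):
    """Add two amounts in cents; assumes both are numbers."""
    return balance + deposit

try:
    # A value accidentally left as a string, e.g. read straight from a form.
    add_cents("100", 1)
    coercion_bug_possible = True
except TypeError:
    # Strong typing rejects the mixed-type operation outright.
    coercion_bug_possible = False

print(coercion_bug_possible)  # False: the error cannot pass silently
```

The guarantee is narrow (it proves nothing about the arithmetic being what you wanted), but it is a real, free proof that one whole class of bugs cannot occur.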

They were researching "formal verification methods" in the 1990s. Using techniques like temporal logic and automated deduction engines, they could formally verify that a CPU would be correct in all state transitions. There wouldn't be a point where an instruction could return in the wrong security ring.
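The core idea can be sketched in a few lines: enumerate every state and input of a model and check a safety property over all of them. This is a toy illustration, not real CPU semantics; the two-ring model, `step` transition function, and the "only enter ring 0 via syscall" property are all hypothetical stand-ins for the kind of invariant those tools verified.

```python
from itertools import product

RINGS = (0, 3)                              # toy model: kernel and user mode
INSTRUCTIONS = ("nop", "syscall", "sysret")

def step(ring, instr):
    """Transition relation of the toy CPU model."""
    if instr == "syscall" and ring == 3:
        return 0          # enter kernel mode through the gate
    if instr == "sysret" and ring == 0:
        return 3          # return to user mode
    return ring           # everything else preserves the ring

# Safety property: no transition reaches ring 0 except from ring 0
# itself or via the syscall gate. Exhaustively check every state/input.
violations = [
    (ring, instr)
    for ring, instr in product(RINGS, INSTRUCTIONS)
    if step(ring, instr) == 0 and ring != 0 and instr != "syscall"
]

assert not violations  # the property holds in every case
```

Real model checkers do essentially this over astronomically larger state spaces, using symbolic representations instead of brute-force enumeration.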

I have hope that better and better tools and processes will be developed, and I'd like to help develop them. So far I've started by applying practices such as code review in organizations that didn't previously do it. We've found that code review / peer review reduces bugs enough to make it worthwhile.

This stands out in this day and age. I'm glad you've successfully introduced peer review, but I haven't heard of a shop in the last decade that doesn't implement peer review. Sounds like a change at the top is needed if you're at a shop that far behind.

There has in fact been change at the top. It was a tiny company. About a year before I joined, they had one "programmer" who wrote all the code. He wasn't trained as a programmer. A family started the business together, and the brother who was "good at computers" did all the code. Since then, it's been bought by a larger company with more mature processes, but headquarters still mostly leaves us alone and lets us do things our own way.

In the last two years I've implemented code review, introduced test scripts,

What other practices have you seen used a lot, practical processes which really provide clear value?

Measure key metrics like cycle time, quality (bugs), code coverage, test stability, and escapes. Review weekly as a team, however small that may be. You generally don't optimize or improve what isn't measured and regularly reviewed. Release small changes often rather than big changes less frequently, and automate as much of the release process as possible.

The concept of bugs in "proven correct" functions doesn't make sense. If a function is proven correct, it is correct, or else a mistake was made in the proof, so it really isn't proven correct. There's a relatively small domain of functions that are both useful and provable, which is why proving a function correct rarely makes sense in the real world, though it can within a college curriculum.

The concept of a bug in code that you can be 100% certain that only impact a single function is also not r

> Have a job paying 60k, save 30k, and for each day you work you earn one day of retirement.

That's true! It's something I'm working on.

> rather than spending $$$ going back to school in the hopes of a higher paying job.

After the tax credit, my masters from Georgia Tech will only cost me about $4,500. Maybe less if I can get my employer to pitch in or something. Conservatively, my masters should bump my income by *at least* $5K / year, so it'll pay for itself the first year. After that, it's an extr

I just noticed a typo in what I wrote. My bachelor's was INexpensive, not expensive. It should read: "My bachelor's was also an inexpensive online program offered by a respectable university. The degree program increased my income enough to pay for the school even BEFORE I graduated."

Most of that machine-learning stuff is really image processing. The research papers were going as far as gradient aligned anisotropic sampling filters before they suddenly jumped into neural networks and machine learning. That stuff is/was necessary for the movie production industry because they normally hired qualified animators to spend their days airbrushing out wires and props, doing lip-sync, and fixing just about anything else. Even car driving is basically matching what the sensors detect with the co

Just comparing it with my undergrad curriculum, which made sure that at least half of my classes were NOT related to my major, I'd say this gives a solid foundation. I would add some more stats courses beyond regression and intro to probability, though.

I have a Masters in CS with a focus on AI, graduated last year. There haven't been "extraordinary technical breakthroughs in AI." There have been slow, incremental algorithm improvements, as is natural, and a large increase in hardware capacity. AI isn't advancing itself. Hardware is doing all the primary advancement, and farming out tasks to the general public to generate your massive training set is the sourc

Imagine if some hip college had started offering a degree in 3-D printing 5 years ago and you had invested the time and money to get one. Where would you be now? And don't forget, 3-D printing was just as big back then as AI is now; it was going to fundamentally change the world in ways no one could've even dreamed of.

Learn to live with AI as a marketing term, as opposed to a technical term. Under this definition, anything that performs human-like tasks is AI. No question this definition is broad, clumsy, and fails to capture various forms of AI that are wildly different. That said, the term does not belong to the tech community anymore. It's part of the mainstream discourse. The word isn't defined by engineers, but by business people.

It does seem like Carnegie Mellon is jumping on the AI hype bandwagon, but they have such a good program that I would be interested in riding along too. No one graduating from Carnegie Mellon CS is dumb. They rightly or wrongly get extra leeway to try out unconventional areas of study. Can't see that as a bad thing.

Boston University already requires all their engineers to take a course in data science (http://www.bu.edu/today/2018/new-eng-curriculum-requires-data-science/). This makes more sense than an entire degree in the subject.

I took ECE at Cornell University. Of course everyone talked about AI, but it was always an application of disciplines. Taking an AI major is not that different from taking a self-driving car major... relevant at this very moment, but not much beyond.