I really find it hard to believe that work in these areas will ever be more than 1-10% of all IT work.

It's the same old story of keeping IT workers off balance and in their place.

"You're not good enough. Do you have any data analytics experience? No? Didn't think so."

BTW, Udacity's AI course this fall led by Peter Norvig and Sebastian Thrun has signed up 80,000+ people. AI might be interesting, but a huge waste of time in terms of it being an employable skill for the average person.

This is a really interesting opinion.

I felt exactly the same about Arpanet and TCP/IP in the mid-1980s when my DoD employer was hiring some specialists to implement a networking stack in our system. I felt it was too distant to be relevant in any way conceivable. I stuck with my character mode RS232/RS422 protocols where real men coded and sheep ran scared.

Less than about seven years later (1993-ish), the first browsers were appearing (the earliest built at CERN), and you could get TCP/IP over dial-up from anywhere if you paid a subscription. Just a couple of years later it was a gold rush.

So I think at least one of those technologies has the potential to be a game changer and profitable for an early adopter.

But the REAL problem at our level is which one? These f***ing bastards aren't going to tell you! They may not even know. Or they do but they want to kill off thousands of careers of underlings in the process of migration so they won't be honest.

Where I agree with you is that for the IT worker it's impossible to know where the best career position is, so the churn of multiple new technologies is a smokescreen that makes it impossible to build or plan a career.

Gornix is protected by the GPL. *

* Gorn Public License. Duplication by inferior sentient species prohibited.

Note that AI has been around for a very long time. It was one of the specializations in the program where I got my MSCS degree in 1996. That was one of the earlier AI-Is-The-Next-Big-Thing hypes.

As far as process control and robotics go, I think those will be handled by industrial engineers and people already steeped in manufacturing. There will be some new demand for programmers.

The biggest growth in automation will be to automate software development.

Big data analytics will be helpful in learning consumer behavior. But it has the problem -- at least for machine learning -- of not being able to explain why something is as it is. For that reason, it will never fly in decision support areas such as medicine where you need to explain why, not just that something matched a pattern. Humans will use these DSSes as tools though. And there will be some new demand for programmers to build the tools.

Watson is in the health care decision support system business now, and we'll see how it goes. But I don't think Watson and its ilk will need more than maybe hundreds or thousands of engineers, and they will all be top-tier-school types, including IIT (Indian Institute of Technology) types.

But I've been very wrong about guessing technology before. I turned down a Unix admin job in 1987 because I wanted to stick with VAX system administration. Wanted no part of this upstart Unix thing.

I understand your rationales, but you're making the same overly conservative judgment calls, an individual techie relying on his own judgment, that I made in my career. I say you and I and others at our level are incapable of predicting future trends and growth opportunities.

In exactly the same manner as you, I rationalized in the mid-1980s, when I could have had some OTJ experience with Ethernet, modern networking stacks, and TCP/IP, that computer-to-computer communications had been around FOREVER. Arpanet was a closed DoD system with no possibility of commercialization.

Therefore TCP/IP was NO BIG DEAL; it merely succeeded technologies that performed more or less the same tasks.

I was utterly wrong to disregard this buzzword that everyone at work was yapping about. TCP/IP started to be applied in ways that created commercial opportunities only a few years later. I had plenty of warning.

There was really no trade news to indicate to me that I should jump on that bandwagon at that time, either.

I just don't think you or I have a handle on these things and where they will lead which is more important than where they are now.

And it may not be any of these things either.

« Last Edit: September 02, 2017, 09:33:42 am by The Gorn »

A technology person can do a better job of forecasting than we did by spending more time doing research into emerging technologies. It was difficult to do prior to the WWW, but now there's no excuse for not spending at least 10% of one's time doing market and technology research.

Might take another 10-20% of one's time to experiment with the technologies, which doesn't leave much time for actually doing your current job.

Easy to say, EXTREMELY hard to do in a truly productive way. I'm not denigrating you, just reflecting on the same issue that has been in my own thoughts: "how do I make sure I have the right tech under my belt?" Answer: absolutely no way possible unless you have personal connections to big-league players. One way of doing this: knowing venture capitalists socially who are willing to share ideas.

Your Unix-avoidance example is a PERFECT illustration of this.

You don't understand something thoroughly until it stops being an abstraction to you. Studying tech trends in isolation is too abstract to make any difference in one's insight.

AI and robotics are the hottest things to get into now, IMO. That is the future of technology. If I were a kid first starting out, that's what I would study. What exciting, up-and-coming fields, with all kinds of possibilities for future jobs!

Right now, there are more upper-level jobs available in those areas, but once it gets going there will be millions of jobs available. It will be like the birth of the PC era back in the '80s. Who wouldn't want their own robot? I know I do.

I always loved the idea of Rosey the Robot from The Jetsons becoming a reality, and I think we will get to see it in our lifetimes. How cool is that! I know you laugh, but she had personality and was very helpful. I'd like that in my home, especially now that I'm getting older.

Robotics/AI is the new hardware/software field, like the PC world of the '80s. I remember back then we had MS-DOS. Loved it, and we had WordPerfect. It was great, and you could make it greater by using macros! We had Lotus 1-2-3 and Harvard Graphics. I thought they would last forever. I was a Novell network admin; I think most people today don't even know what that is. It was all wasted education for me in the long run, but not at the time. I guess I wasn't focused on the long term.

I don't think the Indians realized they had a shelf life. It was just like what they did to Americans. Maybe they're shocked because they thought they were better than us, but in reality they're expendable just like the rest of us. We're just ordinary people trying to make our way in life and maybe be preferred for employment one day in our own country.

I'm totally NOT sold on the career potential of pursuing AI or robotics or other leading edge technologies for two basic reasons:

#1. It is NOT a career-building decision. It's serfdom. No matter what technical knowledge you have, you ALWAYS drift toward being a serf, because work culture and business culture have both deteriorated substantially in the last 30 years.

#2. "AI" and "robotics" are quite broad. You can't compare them to PCs, because with the PC revolution of the '80s there was enough standardization of platforms (basically MS-DOS) that it was fairly straightforward to find a sub-niche to exploit. With either AI or robotics, where is the bubble of dependable market demand that will let you learn something reasonably stable that you can find work in? Both fields encompass huge, broad swaths of buzzwords and technologies.

There are lots of hot areas to pursue, no doubt, but the sexier stuff is AI.

It's probably sexy, but I've seen this AI hype before in 1992-1996. It was the hot thing to do then. What's changed? Why did it go dark for the last 20 years and then emerge locust-like to grab everyone's attention?

Udacity's AI course this fall led by Peter Norvig and Sebastian Thrun has signed up 80,000+ people so anyone who wants to specialize in this is going to have a lot of company.

AI has been around a long time, but the timing wasn't right for it in the '90s. That timing is now, IMO. There is no robust market in the US yet; it's an emerging field that will boom. If you study AI/robotics now, you will be in on the ground floor of something as big as the PC movement, if not bigger.

AI/Robotics isn't about any specific software, it's about integrating robots into our society. There will be plenty of different jobs created to deal with the robots. I see this as a great opportunity to get in on the ground floor. If you do that you can be more competitive in the years to come.

There is a lot of competition in every field; I wouldn't let that stand in my way. Gorn, serfdom is fine if you're making money and treated OK at a job. Not everyone wants anything more than that. I always had good jobs until I got older. After that, I got to pick from the trash.