This video covers computer science careers as well as the different areas within computer science you can dive into. Computer science is a broad and diverse field that includes software development, cryptography, cyber security, computer graphics, bioinformatics, and much more.
Although many people imagine a software developer when it comes to computer science, there are many more job titles you can have including security analyst, web developer, network systems administrator, etc.
Many of the jobs listed in this video are among those with the highest employment and projected growth rates tracked by the Bureau of Labor Statistics. This major and career path isn't necessarily for everyone, but it is definitely a good choice.
Lastly for many of the careers discussed in this video, you don't HAVE to have a degree in computer science to get into them. For some career paths you can have a degree in computer engineering, software engineering, information systems, and more. There are even people who are self taught and have landed software developer jobs. Be sure to keep an open mind, but hopefully this provides some insight.
Artificial Intelligence Video (Part 1): https://www.youtube.com/watch?v=JRHAM1nAuD4
***************************************************
► For more information on math, science, and engineering majors, check us out at https://majorprep.com
***************************************************
► Patreon: https://patreon.com/majorprep
► PayPal: https://www.paypal.me/majorprep
► Facebook: https://facebook.com/majorprep

published: 21 Aug 2017

views: 34284

published: 24 Feb 2010

views: 443

We spoke to Marwa Salayma about her journey from Palestine to studying for a PhD at Edinburgh Napier.
Encouraged by her parents to pursue a career in computing, she discusses the challenges and encouragement she has encountered along the way and explains why it isn't just for men.
Find out more on our website:
www.napier.ac.uk/computing

published: 14 Sep 2017

views: 880

The field of computer science summarised. Learn more at this video's sponsor https://brilliant.org/dos
Computer science is the subject that studies what computers can do and investigates the best ways you can solve the problems of the world with them. It is a huge field overlapping pure mathematics, engineering and many other scientific disciplines. In this video I summarise as much of the subject as I can and show how the areas are related to each other.
A couple of notes on this video:
1. Some people have commented that I should have included computer security alongside hacking, and I completely agree, that was an oversight on my part. Apologies to all the computer security professionals, and thanks for all the hard work!
2. I also failed to mention interpreters alongside compilers in the compiler section. Again, I’m kicking myself because of course this is an important concept for people to hear about. Also, the layers of languages being compiled to other languages is overly convoluted; in practice it is simpler than this. I guess I should have picked one simple example.
3. NP-complete problems are possible to solve, they just become very difficult to solve very quickly as they get bigger. When I said NP-complete and then "impossible to solve", I meant that the large NP-complete problems that industry is interested in solving were thought to be practically impossible to solve.
You can buy this poster here: https://www.redbubble.com/people/dominicwalliman/works/27929629-map-of-computer-science?p=poster&finish=semi_gloss&size=small
Get all my other posters here: https://www.redbubble.com/people/dominicwalliman
And free downloadable versions of this and the other posters here. If you want to print them out for educational purposes please do! https://www.flickr.com/photos/95869671@N08/
Thanks so much to my supporters on Patreon. If you enjoy my videos and would like to help me make more this is the best way and I appreciate it very much. https://www.patreon.com/domainofscience
I also write a series of children’s science books called Professor Astro Cat. These links are to the publisher, but they are available in all good bookshops around the world, in 18 languages and counting:
Frontiers of Space (age 7+): http://nobrow.net/shop/professor-astro-cats-frontiers-of-space/
Atomic Adventure (age 7+): http://nobrow.net/shop/professor-astro-cats-atomic-adventure/
Intergalactic Activity Book (age 7+): http://nobrow.net/shop/professor-astro-cats-intergalactic-activity-book/
Solar System Book (age 3+, available in UK now, and rest of world in spring 2018): http://nobrow.net/shop/professor-astro-cats-solar-system/?
Solar System App: http://www.minilabstudios.com/apps/professor-astro-cats-solar-system/
And the new Professor Astro Cat App: https://itunes.apple.com/us/app/galactic-genius-with-astro-cat/id1212841840?mt=8
Find me on Twitter, Instagram, and my website:
http://dominicwalliman.com
https://twitter.com/DominicWalliman
https://www.instagram.com/dominicwalliman
https://www.facebook.com/dominicwalliman

Check out http://www.engineer4free.com for more free engineering tutorials and math lessons!
Project Management Tutorial: Use forward and backward pass to determine project duration and critical path.
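The forward/backward-pass technique the tutorial describes can be sketched in a few lines of Python. This is an illustrative sketch only: the activities, durations, and dependencies below are made up, and it assumes each task's predecessors appear earlier in the dictionary (i.e., tasks are listed in topological order).

```python
# Critical Path Method sketch: a forward pass computes earliest start/finish
# times, a backward pass computes latest start/finish times, and activities
# with zero slack (earliest start == latest start) form the critical path.

# Hypothetical project: activity -> (duration, list of predecessor activities)
tasks = {
    "A": (3, []),
    "B": (2, ["A"]),
    "C": (4, ["A"]),
    "D": (1, ["B", "C"]),
}

def critical_path(tasks):
    # Forward pass: earliest start (ES) and earliest finish (EF).
    es, ef = {}, {}
    for t in tasks:  # assumes predecessors are listed before their successors
        dur, preds = tasks[t]
        es[t] = max((ef[p] for p in preds), default=0)
        ef[t] = es[t] + dur
    duration = max(ef.values())  # project duration = latest earliest finish
    # Backward pass: latest finish (LF) and latest start (LS).
    ls, lf = {}, {}
    for t in reversed(list(tasks)):
        succs = [s for s in tasks if t in tasks[s][1]]
        lf[t] = min((ls[s] for s in succs), default=duration)
        ls[t] = lf[t] - tasks[t][0]
    # Critical activities have zero slack.
    return duration, [t for t in tasks if ls[t] == es[t]]

duration, critical = critical_path(tasks)
print(duration, critical)  # 8 ['A', 'C', 'D']
```

Here A-C-D is the critical path: delaying any of those activities delays the whole project, while B has two units of slack.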

published: 28 Dec 2014

views: 386142

Surprisingly enough, one of the most challenging parts of computer graphics is simulating light transport. In particular, mathematical tricks are required to allow for fast and realistic imaging. The video features Wenzel Jakob, Assistant Professor at the IC School at EPFL.
https://people.epfl.ch/wenzel.jakob

published: 17 May 2017

views: 425

There is much speculation as to how the flight computer aboard the Apollo missions managed to get men to the moon when it had just a tiny fraction of the computing power of something like a modern smartphone.
Patreon : https://www.patreon.com/curiousdroid
You can now translate this and other curious droid videos, see my video about it here https://www.youtube.com/watch?v=xLPVgIytKyg
But this is quite misleading, as there was not one solitary computer controlling the Apollo craft: there were four computers, and no fancy touch screens, GUIs or other features of a typical computer of today to waste resources on.
The first of the four computers was the Saturn Launch Vehicle Digital Computer, or LVDC; this got the rocket from the launch pad to earth orbit.
Then there was the Apollo Guidance Computer, or AGC; this is the one most people think of. There were in fact two of them: one in the Command Module to get from earth orbit to the moon and back again, and a second in the lunar module that would control the landing and then the ascent back to the command module and docking.
The fourth computer was one which was never used on any mission: it would control an emergency abort and ascent should something happen during the descent to the moon's surface, like the landing computer failing or the crew running out of fuel.
The Apollo Guidance Computer wasn’t as dumb as many make it out to be. As time went by in Apollo’s development, the tasks it was meant to do increased in both number and sophistication, and this in turn created ever more issues with the limited resources available.
One of the biggest problems was the limited amount of memory due to the technological limitations of the time; this meant that the programmers had to make use of every single byte available.
The AGC also had a unique operating system. Systems like UNIX, Linux, Windows and Apple iOS are in control and share time out to the programs. In the AGC, the programs controlled how much time they got depending on how important they were, so that in the case of an emergency the highest-priority programs would get most of the time and non-essential operations were dropped to free up resources. This became the basis of mission-critical systems for all manned missions afterwards.
The computer had performance somewhere around that of the first generation of personal computers, like the Apple II, Commodore 64 and ZX Spectrum, that would arrive 10 years later in the late 70s.
It had 2K of RAM and 36K of fixed-storage magnetic rope core memory, which was woven by hand and took months to assemble, so any software bugs were literally woven into the system.
A comparison between the Apollo Guidance Computer and say an iPhone 6 is tricky because the AGC was not a general purpose computer. It was built for a very specific task, had a unique operating system and with the 48-year gap in the technologies used, we can only really get very rough estimates.
The Apple iPhone 6 uses the ARM A8 processor, which has about 1.6 billion transistors; the AGC had just 12,300. The iPhone 6 has 1 GB of RAM, about 488,000 times that of the AGC, and in this one, 128 GB of non-volatile storage, or about 3.5 million times that of the AGC.
As for performance, the iPhone 6 is somewhere between maybe 4 and 30 million times faster than the AGC, depending on what type of calculations are being done; if you include the iPhone’s GPU it would be even more.
So, if you had to fly back to the moon in an Apollo craft and were given the choice, would you trust your life to a couple of iPhones in place of the AGCs? You would actually have more computing power in just one of them than the whole of NASA had during all the Apollo missions.
Title: Adam Are You Free?
Author: P C III
Source: www.pipechoir.com
Nightingale sounds from Gerry Gutteridge flic.kr/ps/Mk2zU
License: Creative Commons Attribution 4.0

In this video from SC17, Ian Buck from NVIDIA presents: Accelerated Computing: The Path Forward.
Ian Buck is vice president of NVIDIA's Accelerated Computing business unit, which includes all hardware and software product lines, third-party enablement, and marketing activities for GPU computing. Buck joined NVIDIA in 2004 and created CUDA, which remains the established leading platform for accelerated parallel computing. Before joining NVIDIA, he was the development lead on Brook, which was the forerunner to generalized computing on GPUs. Buck holds a Ph.D. in computer science from Stanford University and a Bachelor of Science in computer science from Princeton University.
Learn more: http://nvidia.com
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter

History

Around 1970, Unix introduced the forward slash character ("/") as its directory separator. In 1981, when the original version of Microsoft DOS (MS-DOS 1.0) was released, it did not support directories. A major portion of the utilities packaged with DOS came from IBM, and the command-line interfaces of these IBM-written utilities used the forward slash character as a "switch", a convention that still exists today (as in dir /w, which tells the dir command to run with the wide list format option). On Unix, however, the dash ("-") character is used for switches. When directory support was introduced in MS-DOS 2.0, IBM wanted to keep compatibility with the original DOS utilities and a host of other programs that had been written to use the forward slash as a switch character. Since the forward slash already served as a switch character, Microsoft chose the backslash ("\"), which looks very similar to the forward slash ("/"), to indicate directory separation.
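The two separator conventions are easy to see with Python's pathlib module, which models both path flavors (the paths below are made-up examples):

```python
from pathlib import PurePosixPath, PureWindowsPath

# Unix-style paths use "/" as the directory separator.
p = PurePosixPath("/usr/local/bin")
print(p.parts)  # ('/', 'usr', 'local', 'bin')

# Windows paths use "\", a legacy of "/" already being taken for switches.
w = PureWindowsPath(r"C:\Users\demo")
print(w.parts)  # ('C:\\', 'Users', 'demo')

# Windows path handling (and pathlib) also accepts "/" and normalizes it.
print(str(PureWindowsPath("C:/Users/demo")))  # C:\Users\demo
```

The Pure* classes only manipulate path text, so this runs the same on any operating system.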

Cloud computing

Cloud computing, also known as 'on-demand computing', is a kind of Internet-based computing in which shared resources, data and information are provided to computers and other devices on demand. It is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort. Cloud computing and storage solutions provide users and enterprises with various capabilities to store and process their data in third-party data centers. It relies on sharing of resources to achieve coherence and economies of scale, similar to a utility (like the electricity grid) over a network. At the foundation of cloud computing is the broader concept of converged infrastructure and shared services.

Computer science

Computer science is the scientific and practical approach to computation and its applications. It is the systematic study of the feasibility, structure, expression, and mechanization of the methodical procedures (or algorithms) that underlie the acquisition, representation, processing, storage, communication of, and access to information. An alternate, more succinct definition of computer science is the study of automating algorithmic processes that scale. A computer scientist specializes in the theory of computation and the design of computational systems.

Computer Science Careers and Subfields


3:58

Understanding File Directories and Paths

3:08

Forging a path into Computing


10:58

Map of Computer Science


Shortest Path using Dijkstra's Algorithm
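The listing gives no description for this video, so as a minimal illustrative sketch of the shortest-path technique the title names, here is Dijkstra's algorithm over a made-up weighted graph (the graph and node names are hypothetical):

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source in a graph given as
    {node: [(neighbor, edge_weight), ...]} with non-negative weights."""
    dist = {source: 0}
    heap = [(0, source)]  # (distance-so-far, node) priority queue
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd  # relax the edge and re-queue the neighbor
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical weighted directed graph
graph = {
    "A": [("B", 1), ("C", 4)],
    "B": [("C", 2), ("D", 6)],
    "C": [("D", 3)],
}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 6}
```

Note that C is reached via B (1 + 2 = 3) rather than directly (4), and D via C (3 + 3 = 6) rather than via B (1 + 6 = 7).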

Use forward and backward pass to determine project duration and critical path


10:47

Computing Light Paths (ft. Wenzel Jakob)


9:05

How did the Apollo flight computers get men to the moon and back?


Accelerated Computing: The Path Forward


11:01

History of Computing - The Path to Modern Computers

Google IT Support Professional Certificate
https://www.coursera.org/specializations/google-it-support
This six-course certificate, developed exclusively by Google, includes an innovative curriculum designed to prepare you for an entry-level role in IT support. A job in IT can mean in-person or remote help desk work, either in a small business or at a global company, like Google. Whether you’ve been tinkering with IT or are completely new to the field, you’ve come to the right place.
If you’re looking for a job, upon completion of the certificate you can share your information with top employers, like Bank of America, Walmart, Sprint, GE Digital, PNC Bank, Infosys, TEKsystems, UPMC, and, of course, Google.
Through a dynamic mix of video lectures, quizzes, and hands-on labs and widgets, this program will introduce you to troubleshooting and customer service, networking, operating systems, system administration, automation, and security.
Along the way, you’ll hear from Googlers with unique backgrounds and perspectives, whose own foundation in IT support served as a jumping off point for their careers. They’re excited to go on this journey with you, as you invest in your future by achieving an end-of-program certificate.
Course 1 - IT Technical Support Fundamentals
In this course, you’ll be introduced to the world of Information Technology, or IT. This course is the first of a series that aims to prepare you for a role as an entry-level IT Support Specialist. You’ll learn about the different facets of Information Technology, like computer hardware, the Internet, computer software, and job-related skills. You’ll also learn about the history of computers, and the pioneers who shaped the world of computing that we know today. This course covers a wide variety of topics in IT that are designed to give you an overview of what’s to come in this IT Support Professional Certificate. By the end of this course, you’ll be able to: - understand how the binary system works. - assemble a computer from scratch. - choose and install an operating system on a computer. - understand what the Internet is, how it works, and the impact it has in the modern world. - learn how applications are created and how they work under the hood of our computer. - utilize common problem-solving methodologies and soft skills in an Information Technology setting.
Who is this class for: This program is intended for beginners who are interested in developing the skills necessary to perform entry-level IT support. No pre-requisite knowledge is required. However, if you do have some familiarity with IT, you can skip through

3:21

Career Opportunities in Cloud Computing

This course describes six different positions in the cloud computing industry, all of which are in high demand as established companies move to cloud technologies, companies expand IT departments, and new companies form. There is great opportunity for fulfilling work with good pay. Overall, it’s a great time to be in the cloud computing industry (which is rapidly becoming the IT industry).
To enjoy the entire learning path: https://goo.gl/4VajyB
You can also have access to the entire Cloud Academy content library with a 7-day free trial at https://goo.gl/8lu43D.

This presentation was recorded at GOTO Aarhus 2013: http://gotocon.com
Jeff Hawkins - Brain-Inspired Computing
ABSTRACT
Understanding how the brain works and building machines that work on the same principles is one of the greatest quests of our time. In this talk I will describe recent advances in neocortical theory, including why the brain uses sparse distributed representations and how the brain makes predictions from high velocity sensory data streams.
I will demonstrate a product called Grok that uses a detailed model of neocortical memory to act on machine-generated data, and show how developers can contribute to the development of intelligent machines via the NuPIC open source project (www.numenta.org).
https://twitter.com/gotocon
https://www.facebook.com/GOTOConference
http://gotocon.com

Computing a path to more profits with Bell Labs self-tuned adaptive routing

Computing a path to more profits with Bell Labs self-tuned adaptive routing

Computing a path to more profits with Bell Labs self-tuned adaptive routing

Carrier SDN platforms such as the Nokia Network Services Platform enable new and exciting possibilities for wide area network optimization.
This video explains the application of the Self-Tuned Adaptive Routing Algorithm developed by Nokia Bell Labs for centralized path computation in IP/MPLS networks.
For more information, please visit our IP/optical integration solution page https://networks.nokia.com/solutions/ip-optical-integration

10:23

Computer Science Vs. Software Development Degree

► FREE COURSE - 5 Learning Mistakes Software Developers Make ◄
https://simpleprogrammer.com/learn-faster
SUBSCRIBE TO THIS CHANNEL: vid.io/xokz
Interview With Matt From Engineered Truth: https://www.youtube.com/watch?v=a-Nba_akPBk&list=PLjwWT1Xy3c4Vd9P7BekMCpqmA49uqAZpx&index=6
Learn Programming By Going To College: https://simpleprogrammer.com/2016/08/15/learning-programming-by-going-to-college/
Other Important Links:
SUPPORT THIS YOUTUBE CHANNEL: vid.io/xokw
Visit: http://simpleprogrammer.com/
Computer Science Vs. Software Development Degree
So, there is a lot of debate about the difference between computer science and a software development degree. A lot of people want to start a career in software development, but they don't know if they should aim for a computer science degree or a software development degree.
I often receive a lot of emails from people asking me about a computer science degree. They want to become software developers, but they don't know which path they should pursue. Is computer science a good degree for becoming a software developer?
What can a computer science degree offer you that a software development degree can't? What important things would you get with a computer science degree that you wouldn't get with a regular software development degree?
Watch this video and find out!
If you have a question, email me at john@simpleprogrammer.com
If you liked this video, share, like and, of course, subscribe!
Subscribe To My YouTube Channel: http://bit.ly/1zPTNLT
Visit Simple Programmer Website: http://simpleprogrammer.com/
Connect with me on social media:
Facebook: https://www.facebook.com/SimpleProgrammer
Twitter: https://twitter.com/jsonmez
Other Links:
Sign up for the Simple Programmer Newsletter: http://simpleprogrammer.com/email
Simple Programmer blog: http://simpleprogrammer.com/blog
Learn how to learn anything quickly: http://10stepstolearn.com
Boost your career now: http://devcareerboost.com

7:35

Is Data Science A Viable Career Path?

I am doing a bachelor of computer science and have not yet chosen a discipline to major in. My lecturer mentioned the disciplines and one grabbed my attention: Data Science. He explained how data science is the future of technology and gave some examples. Do you believe that data science is the future and can you please explain what it is in a little more detail?
-Abraham A.
Schedule a Skype Meeting with Eli: https://silicondiscourse.com
Podcasts of New Videos at SoundCloud: https://soundcloud.com/elithecomputerguy
To Ask Questions, Email: Question@EliTheComputerGuy.com
For Classes, Class Notes and Blog Posts:
http://www.EliTheComputerGuy.com
Visit the Main YouTube Channel at:
http://www.YouTube.com/EliTheComputerGuy
Follow us on Twitter at:
http://www.Twitter.com/EliComputerGuy
**********

Computer Science Careers and Subfields

This video will cover computer science careers as well as different areas within computer science you can dive into. Computer science is a very broad and diverse field that includes software development, cryptography, cyber security, computer graphics, bioinformatics, and much more.
Although many people imagine a software developer when it comes to computer science, there are many more job titles you can have, including security analyst, web developer, network systems administrator, etc.
Many of the jobs listed in this video are among those with the highest employment and growth rates tracked by the Bureau of Labor Statistics. This major and career path isn't necessarily for everyone, but it definitely is a good choice.
Lastly, for many of the careers discussed in this video, you don't HAVE to have a degree in computer science to get into them. For some career paths you can have a degree in computer engineering, software engineering, information systems, and more. There are even people who are self-taught and have landed software developer jobs. Be sure to keep an open mind, but hopefully this provides some insight.
Artificial Intelligence Video (Part 1): https://www.youtube.com/watch?v=JRHAM1nAuD4
***************************************************
► For more information on math, science, and engineering majors, check us out at https://majorprep.com
***************************************************
► Patreon: https://patreon.com/majorprep
► PayPal: https://www.paypal.me/majorprep
► Facebook: https://facebook.com/majorprep

published: 21 Aug 2017

Understanding File Directories and Paths

published: 24 Feb 2010

Forging a path into Computing

We spoke to Marwa Salayma about her journey from Palestine to studying for a PhD at Edinburgh Napier.
Encouraged by her parents to pursue a career in computing, she discusses the challenges and encouragement she has encountered along the way and explains why it isn't just for men.
Find out more on our website:
www.napier.ac.uk/computing

published: 14 Sep 2017

Map of Computer Science

The field of computer science summarised. Learn more at this video's sponsor https://brilliant.org/dos
Computer science is the subject that studies what computers can do and investigates the best ways you can solve the problems of the world with them. It is a huge field overlapping pure mathematics, engineering and many other scientific disciplines. In this video I summarise as much of the subject as I can and show how the areas are related to each other.
A couple of notes on this video:
1. Some people have commented that I should have included computer security alongside hacking, and I completely agree; that was an oversight on my part. Apologies to all the computer security professionals, and thanks for all the hard work!
2. I also failed to mention interpreters alongside compilers in the compiler section. Again, I'm kicking myself, because of course this is an important concept for people to hear about. Also, the layers of languages being compiled to other languages is overly convoluted; in practice it is simpler than this. I guess I should have picked one simple example.
3. NP-complete problems are possible to solve; they just become very difficult to solve very quickly as they get bigger. When I said NP-complete and then "impossible to solve", I meant that the large NP-complete problems that industry is interested in solving were thought to be practically impossible to solve.
You can buy this poster here: https://www.redbubble.com/people/dominicwalliman/works/27929629-map-of-computer-science?p=poster&finish=semi_gloss&size=small
Get all my other posters here: https://www.redbubble.com/people/dominicwalliman
And free downloadable versions of this and the other posters here. If you want to print them out for educational purposes, please do! https://www.flickr.com/photos/95869671@N08/
Thanks so much to my supporters on Patreon. If you enjoy my videos and would like to help me make more, this is the best way, and I appreciate it very much. https://www.patreon.com/domainofscience
I also write a series of children's science books called Professor Astro Cat. These links are to the publisher, but they are available in all good bookshops around the world in 18 languages and counting:
Frontiers of Space (age 7+): http://nobrow.net/shop/professor-astro-cats-frontiers-of-space/
Atomic Adventure (age 7+): http://nobrow.net/shop/professor-astro-cats-atomic-adventure/
Intergalactic Activity Book (age 7+): http://nobrow.net/shop/professor-astro-cats-intergalactic-activity-book/
Solar System Book (age 3+, available in UK now, and rest of world in spring 2018): http://nobrow.net/shop/professor-astro-cats-solar-system/?
Solar System App: http://www.minilabstudios.com/apps/professor-astro-cats-solar-system/
And the new Professor Astro Cat App: https://itunes.apple.com/us/app/galactic-genius-with-astro-cat/id1212841840?mt=8
Find me on Twitter, Instagram, and my website:
http://dominicwalliman.com
https://twitter.com/DominicWalliman
https://www.instagram.com/dominicwalliman
https://www.facebook.com/dominicwalliman
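One theme from the map is computational complexity: NP-complete problems are solvable in principle, but the brute-force work roughly doubles with every extra input item, which is why large industrial instances are considered practically intractable. A toy illustration (hypothetical code, not from the video), brute-force subset-sum:

```python
from itertools import combinations

def subset_sum(nums, target):
    """Brute-force subset-sum: try every subset (2**n of them)."""
    for r in range(len(nums) + 1):
        for combo in combinations(nums, r):
            if sum(combo) == target:
                return True
    return False

# The work grows as 2**n: 10 items -> 1,024 subsets; 40 items -> ~1.1 trillion.
print(subset_sum([3, 34, 4, 12, 5, 2], 9))   # True (4 + 5)
print(subset_sum([3, 34, 4, 12, 5, 2], 30))  # False
```

Exact solvers and heuristics do much better in practice, but no known algorithm escapes the worst-case blow-up.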

Use forward and backward pass to determine project duration and critical path

Check out http://www.engineer4free.com for more free engineering tutorials and math lessons!
Project Management Tutorial: Use forward and backward pass to determine project duration and critical path.
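The forward/backward-pass computation from the tutorial can be sketched in a few lines. This is a hypothetical minimal implementation (the task names, durations, and example network are invented for illustration):

```python
def topo_order(tasks):
    """Order tasks so every predecessor comes first (simple Kahn-style sort)."""
    order, placed = [], set()
    while len(order) < len(tasks):
        for name, (_, preds) in tasks.items():
            if name not in placed and all(p in placed for p in preds):
                order.append(name)
                placed.add(name)
    return order

def critical_path(tasks):
    """Forward/backward pass over {name: (duration, [predecessors])}."""
    # Forward pass: earliest start (ES) and earliest finish (EF).
    es, ef = {}, {}
    for name in topo_order(tasks):
        es[name] = max((ef[p] for p in tasks[name][1]), default=0)
        ef[name] = es[name] + tasks[name][0]
    duration = max(ef.values())
    # Backward pass: latest finish (LF) and latest start (LS).
    lf, ls = {}, {}
    for name in reversed(topo_order(tasks)):
        succs = [s for s in tasks if name in tasks[s][1]]
        lf[name] = min((ls[s] for s in succs), default=duration)
        ls[name] = lf[name] - tasks[name][0]
    # Critical activities have zero slack (ES == LS).
    critical = [n for n in topo_order(tasks) if es[n] == ls[n]]
    return duration, critical

# Hypothetical four-activity network: {name: (duration, predecessors)}.
network = {"A": (3, []), "B": (2, ["A"]), "C": (4, ["A"]), "D": (2, ["B", "C"])}
print(critical_path(network))  # (9, ['A', 'C', 'D'])
```

Activity B has slack (it can start as late as day 5 without delaying the project), so it is not on the critical path.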

published: 28 Dec 2014

Computing Light Paths (ft. Wenzel Jakob)

Surprisingly enough, one of the most challenging parts of computer graphics is simulating light transport. In particular, mathematical tricks are required to allow for fast and realistic imaging. The video features Wenzel Jakob, Assistant Professor of the IC School at EPFL.
https://people.epfl.ch/wenzel.jakob
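One of the "mathematical tricks" behind fast light transport is Monte Carlo integration: estimate the light arriving at a point by averaging random samples instead of integrating exactly. A toy sketch (not Jakob's method; a hypothetical estimator for a uniform sky of radiance 1, where the exact irradiance is π):

```python
import math
import random

def sample_hemisphere():
    """Uniformly sample a direction on the unit upper hemisphere."""
    z = random.random()                    # cos(theta), uniform in [0, 1)
    phi = 2.0 * math.pi * random.random()
    r = math.sqrt(max(0.0, 1.0 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def irradiance_estimate(n=200_000):
    """Monte Carlo estimate of E = integral of L*cos(theta) d(omega)
    over the hemisphere, for a uniform sky of radiance L = 1.
    The exact answer is pi."""
    pdf = 1.0 / (2.0 * math.pi)            # uniform hemisphere pdf
    total = 0.0
    for _ in range(n):
        _, _, cos_theta = sample_hemisphere()
        total += 1.0 * cos_theta / pdf     # radiance * cos(theta) / pdf
    return total / n

print(irradiance_estimate())  # converges to pi as n grows
```

Real renderers build on exactly this idea, adding importance sampling and other variance-reduction tricks so far fewer samples are needed per pixel.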

published: 17 May 2017

How did the Apollo flight computers get men to the moon and back?

There is much speculation by some as to how the flight computer aboard the Apollo missions managed to get men to the moon when it had just a tiny fraction of the computing power of something like a modern smartphone.
Patreon: https://www.patreon.com/curiousdroid
You can now translate this and other Curious Droid videos; see my video about it here https://www.youtube.com/watch?v=xLPVgIytKyg
But this is quite misleading, as there was not one solitary computer controlling the Apollo craft; there were 4 computers and no fancy touch screens, GUIs or other things in a typical computer of today to waste resources on.
The first of the 4 computers was the Saturn Launch Vehicle Digital Computer, or LVDC; this got the rocket from the launch pad to earth orbit.
Then there was the Apollo Guidance Computer, or AGC; this is the one which most people think of. There were in fact two of them: one in the Command Module, to get from earth orbit to the moon and back again. The second was in the lunar module and would control the landing and then the ascent back to the command module and docking.
The fourth computer was one which was never used on any mission, because it would control an emergency abort and ascent should something happen during the descent to the moon's surface, like the landing computer failing or the crew running out of fuel.
The Apollo Guidance Computer wasn't as dumb as many make it out to be. As time went by in Apollo's development, the tasks that it was meant to do increased in both number and sophistication; this in turn created ever more issues with the limited resources available.
One of the biggest problems was the limited amount of memory due to the technological limitations of the time; this meant that the programmers had to make use of every single byte available.
The AGC also had a unique operating system. Systems like UNIX, Linux, Windows and Apple iOS are in control and share time out to the programs. In the AGC, the programs controlled how much time they got depending on how important they were, so that in the case of an emergency the highest-priority programs would get most of the time and non-essential operations were dropped to free up resources. This became the basis of mission-critical systems for all manned missions afterwards.
The computer had a performance somewhere around that of the first generation of personal computers like the Apple II, Commodore 64 and ZX Spectrum that would arrive 10 years later in the late 70's.
It had 2k of RAM and 36k of fixed-storage magnetic rope core memory, which was woven by hand and took months to assemble, so any software bugs were literally woven into the system.
A comparison between the Apollo Guidance Computer and, say, an iPhone 6 is tricky, because the AGC was not a general-purpose computer. It was built for a very specific task, had a unique operating system, and with the 48-year gap in the technologies used we can only really get very rough estimates.
The Apple iPhone 6 uses the ARM A8 processor, which has about 1.6 billion transistors; the AGC had just 12,300. The iPhone 6 has 1 GB of RAM, about 488,000 times the AGC, and in this one, 128 GB of non-volatile storage, or about 3.5 million times the AGC.
As for performance, the iPhone 6 is somewhere between maybe 4 and 30 million times faster than the AGC, depending on what type of calculations are being done, and if you include the iPhone's GPU it would be even more.
So, if you had to fly back to the moon in an Apollo craft and were given the choice, would you trust your life to a couple of iPhones in place of the AGCs? You would actually have more computing power in just one of them than the whole of NASA had during all the Apollo missions.
Title: Adam Are You Free?
Author: P C III. Source: www.pipechoir.com
Nightingale sounds from Gerry Gutteridge flic.kr/ps/Mk2zU
License: Creative Commons Attribution 4.0
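The AGC's executive scheduled work by priority and shed non-essential jobs when resources ran short. A hypothetical toy model of that idea (the job names, costs, and capacity are invented; the real AGC executive was far more sophisticated):

```python
def schedule(jobs, capacity):
    """Toy priority executive loosely inspired by the AGC's approach.
    jobs = [(priority, name, cost)]; higher priority runs first, and
    low-priority work is shed once the cycle's capacity is exhausted."""
    run, shed = [], []
    used = 0
    for priority, name, cost in sorted(jobs, reverse=True):
        if used + cost <= capacity:
            run.append(name)
            used += cost
        else:
            shed.append(name)
    return run, shed

# Hypothetical overloaded cycle (capacity 10): guidance and engine
# control survive; the display update is dropped to free resources.
jobs = [(9, "guidance", 5), (8, "engine-control", 4), (2, "display-update", 3)]
print(schedule(jobs, 10))  # (['guidance', 'engine-control'], ['display-update'])
```

This is the behaviour that saved Apollo 11 during the famous 1202 program alarms: overloaded, the computer kept the landing-critical jobs running and dropped the rest.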

published: 11 Mar 2017

Feynman's Infinite Quantum Paths | Space Time

How to predict the path of a quantum particle. Part 3 in our Quantum Field Theory Series.
You can further support us on Patreon at https://www.patreon.com/pbsspacetime
Get your own Space Time t­shirt at http://bit.ly/1QlzoBi
Tweet at us! @pbsspacetime
Facebook: facebook.com/pbsspacetime
Email us! pbsspacetime [at] gmail [dot] com
Comment on Reddit: http://www.reddit.com/r/pbsspacetime
Help translate our videos! https://www.youtube.com/timedtext_cs_panel?tab=2&c=UC7_gcs09iThXybpVgjHZ_7g
Previous Episode:
The First Quantum Field Theory
https://www.youtube.com/watch?v=ATcrrzJFtBY
There is a fundamental limit to the knowability of the universe. The Heisenberg uncertainty principle tells us that the more precisely we try to define one property, the less definable is its counterpart. Knowi...

published: 07 Jul 2017

LHC animation: The path of the protons

This animation shows how the Large Hadron Collider (LHC) works.
The film begins with an aerial view of CERN near Geneva, with outlines of the accelerator complex, including the underground Large Hadron Collider (LHC), 27 km in circumference. The positions of the four largest LHC experiments, ALICE, ATLAS, CMS and LHCb, are revealed before we see protons travelling around the LHC ring.
The proton source is a simple bottle of hydrogen gas. An electric field is used to strip hydrogen atoms of their electrons to yield protons. Linac 2, the first accelerator in the chain, accelerates the protons to the energy of 50 MeV. The beam is then injected into the Proton Synchrotron Booster (PSB), which accelerates the protons to 1.4 GeV, followed by the Proton Synchrotron (PS), which pushes the beam t...

published: 03 Jun 2015

Accelerated Computing: The Path Forward

In this video from SC17, Ian Buck from NVIDIA presents: Accelerated Computing: The Path Forward.
Ian Buck is vice president of NVIDIA's Accelerated Computing business unit, which includes all hardware and software product lines, third-party enablement, and marketing activities for GPU computing. Buck joined NVIDIA in 2004 and created CUDA, which remains the established leading platform for accelerator-based parallel computing. Before joining NVIDIA, he was the development lead on Brook, which was the forerunner to generalized computing on GPUs. Buck holds a Ph.D. in computer science from Stanford University and a Bachelor of Science in computer science from Princeton University.
Learn more: http://nvidia.com
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter

published: 18 Jan 2018

History of Computing - The Path to Modern Computers

Google IT Support Professional Certificate
https://www.coursera.org/specializations/google-it-support
This six-course certificate, developed exclusively by Google, includes innovative curriculum designed to prepare you for an entry-level role in IT support. A job in IT can mean in-person or remote help desk work, either in a small business or at a global company, like Google. Whether you’ve been tinkering with IT or are completely new to the field, you’ve come to the right place.
If you’re looking for a job, upon completion of the certificate, you can share your information with top employers, like Bank of America, Walmart, Sprint, GE Digital, PNCBank, Infosys, TEKsystems, UPMC, and, of course, Google.
Through a dynamic mix of video lectures, quizzes, and hands-on labs and widgets, this program will introduce you to troubleshooting and customer service, networking, operating systems, system administration, automation, and security.
Along the way, you’ll hear from Googlers with unique backgrounds and perspectives, whose own foundation in IT support served as a jumping-off point for their careers. They’re excited to go on this journey with you, as you invest in your future by achieving an end-of-program certificate.
Course 1 - IT Technical Support Fundamentals
In this course, you’ll be introduced to the world of Information Technology, or IT. This course is the first of a series that aims to prepare you for a role as an entry-level IT Support Specialist. You’ll learn about the different facets of Information Technology, like computer hardware, the Internet, computer software, and job-related skills. You’ll also learn about the history of computers, and the pioneers who shaped the world of computing that we know today. This course covers a wide variety of topics in IT that are designed to give you an overview of what’s to come in this IT Support Professional Certificate. By the end of this course, you’ll be able to:
- understand how the binary system works
- assemble a computer from scratch
- choose and install an operating system on a computer
- understand what the Internet is, how it works, and the impact it has in the modern world
- learn how applications are created and how they work under the hood of our computer
- utilize common problem-solving methodologies and soft skills in an Information Technology setting
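One foundational topic in such a course is how the binary system works. A brief hypothetical sketch of converting between decimal and binary:

```python
def to_binary(n):
    """Repeatedly divide by 2; the remainders, read in reverse, are the bits."""
    if n == 0:
        return "0"
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits
        n //= 2
    return bits

def from_binary(bits):
    """Each bit contributes bit * 2**position, counting from the right."""
    value = 0
    for bit in bits:
        value = value * 2 + int(bit)
    return value

print(to_binary(42))          # "101010"
print(from_binary("101010"))  # 42
```

Python's built-ins `bin(42)` and `int("101010", 2)` do the same job; the loop form just makes the place-value arithmetic visible.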

published: 15 Feb 2018

Career Opportunities in Cloud Computing

This course describes 6 different positions in the cloud computing industry, all of which are in high demand as established companies move to cloud technologies, companies expand IT departments, and new companies form. There is real opportunity for fulfilling work with excellent pay. Overall, it's a great time to be in the cloud computing industry (which is rapidly becoming the IT industry).
To enjoy the entire learning path: https://goo.gl/4VajyB
You can also have access to the entire CloudAcademy content library with a 7-day free trial at https://goo.gl/8lu43D.

This presentation was recorded at GOTO Aarhus 2013. http://gotocon.com
Jeff Hawkins - Brain-Inspired Computing
ABSTRACT
Understanding how the brain works and building machines that work on the same principles is one of the greatest quests of our time. In this talk I will describe recent advances in neocortical theory, including why the brain uses sparse distributed representations and how the brain makes predictions from high velocity sensory data streams.
I will demonstrate a product called Grok, which uses a detailed model of neocortical memory to act on machine-generated data, and show how developers can contribute to the development of intelligent machines via the NuPIC open source project (www.numenta.org).
https://twitter.com/gotocon
https://www.facebook.com/GOTOConference
http://gotoco...

Computing a path to more profits with Bell Labs self-tuned adaptive routing

Carrier SDN platforms such as the Nokia Network Services Platform enable new and exciting possibilities for wide area network optimization.
This video explains the application of the Self-tuned Adaptive Routing Algorithm developed by Nokia Bell Labs for centralized path computation in IP/MPLS networks.
For more information, please visit our IP/optical integration solution page https://networks.nokia.com/solutions/ip-optical-integration

published: 06 Dec 2016

Computer Science Vs. Software Development Degree

► FREE COURSE - 5 Learning Mistakes Software Developers Make ◄
https://simpleprogrammer.com/learn-faster
SUBSCRIBE TO THIS CHANNEL: vid.io/xokz
Interview With Matt From Engineered Truth: https://www.youtube.com/watch?v=a-Nba_akPBk&list=PLjwWT1Xy3c4Vd9P7BekMCpqmA49uqAZpx&index=6
Learn Programming By Going To College: https://simpleprogrammer.com/2016/08/15/learning-programming-by-going-to-college/
Other Important Links:
SUPPORT THIS YOUTUBE CHANNEL: vid.io/xokw
Visit: http://simpleprogrammer.com/
Computer Science Vs. Software Development Degree
So, there is a lot of debate about the difference between a computer science degree and a software development degree. A lot of people want to start a career in software development, but they don't know if they should aim for a computer science de...

published: 23 Sep 2016








GoogleIT SupportProfessionalCertificate
https://www.coursera.org/specializations/google-it-support
This six-course certificate, developed exclusively by Google, includes an innovative curriculum designed to prepare you for an entry-level role in IT support. A job in IT can mean in-person or remote help desk work, either at a small business or at a global company like Google. Whether you’ve been tinkering with IT or are completely new to the field, you’ve come to the right place.
If you’re looking for a job, upon completion of the certificate you can share your information with top employers, like Bank of America, Walmart, Sprint, GE Digital, PNC Bank, Infosys, TEKsystems, UPMC, and, of course, Google.
Through a dynamic mix of video lectures, quizzes, and hands-on labs and widgets, this program will introduce you to troubleshooting and customer service, networking, operating systems, system administration, automation, and security.
Along the way, you’ll hear from Googlers with unique backgrounds and perspectives, whose own foundation in IT support served as a jumping off point for their careers. They’re excited to go on this journey with you, as you invest in your future by achieving an end-of-program certificate.
Course 1 - IT Technical Support Fundamentals
In this course, you’ll be introduced to the world of Information Technology, or IT. This course is the first of a series that aims to prepare you for a role as an entry-level IT Support Specialist. You’ll learn about the different facets of Information Technology, like computer hardware, the Internet, computer software, and job-related skills. You’ll also learn about the history of computers and the pioneers who shaped the world of computing that we know today. This course covers a wide variety of topics in IT designed to give you an overview of what’s to come in this IT Support Professional Certificate. By the end of this course, you’ll be able to:
- understand how the binary system works
- assemble a computer from scratch
- choose and install an operating system on a computer
- understand what the Internet is, how it works, and the impact it has on the modern world
- understand how applications are created and how they work under the hood of our computers
- utilize common problem-solving methodologies and soft skills in an Information Technology setting
Who is this class for: This program is intended for beginners who are interested in developing the skills necessary to perform entry-level IT support. No prerequisite knowledge is required. However, if you do have some familiarity with IT, you can skip through
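The first course objective, understanding how the binary system works, is easy to get a feel for even before enrolling. The sketch below is my own illustration, not course material:

```python
# Quick illustration of the binary system: converting between bit
# strings and integers by hand, without using int(x, 2) or bin().

def bits_to_int(bits: str) -> int:
    """Interpret a string of 0s and 1s as an unsigned binary number."""
    value = 0
    for bit in bits:
        value = value * 2 + (1 if bit == "1" else 0)
    return value

def int_to_bits(value: int) -> str:
    """Render a non-negative integer as its binary digit string."""
    if value == 0:
        return "0"
    digits = []
    while value > 0:
        digits.append(str(value % 2))
        value //= 2
    return "".join(reversed(digits))

print(bits_to_int("1011"))  # 8 + 0 + 2 + 1 = 11
print(int_to_bits(42))      # "101010"
```

Each position is worth twice the one to its right, which is all "how the binary system works" really means.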


Career Opportunities in Cloud Computing

This course describes 6 different positions in the cloud computing industry, all of which are in high demand as established companies move to cloud technologies, companies expand IT departments, and new companies form. There’s real opportunity for fulfilling, well-paid work. Overall it’s a good time to be in the cloud computing industry (which is rapidly becoming the IT industry).
To access the entire learning path: https://goo.gl/4VajyB
You can also access the entire Cloud Academy content library with a 7-day free trial at https://goo.gl/8lu43D.


This presentation was recorded at GOTO Aarhus 2013: http://gotocon.com
Jeff Hawkins - Brain-Inspired Computing
ABSTRACT
Understanding how the brain works and building machines that work on the same principles is one of the greatest quests of our time. In this talk I will describe recent advances in neocortical theory, including why the brain uses sparse distributed representations and how the brain makes predictions from high-velocity sensory data streams.
I will demonstrate a product called Grok, which uses a detailed model of neocortical memory to act on machine-generated data, and show how developers can contribute to the development of intelligent machines via the NuPIC open source project (www.numenta.org).
https://twitter.com/gotocon
https://www.facebook.com/GOTOConference
http://gotocon.com
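The sparse distributed representations mentioned in the abstract can be illustrated with a toy sketch (my own, not Numenta's code): a representation is a long, mostly-zero bit vector, and two representations are compared by the overlap of their active bits, which makes matching robust to noise.

```python
# Toy sketch of sparse distributed representations (SDRs): a few active
# bits out of many, compared by overlap. Sizes are typical of the HTM
# literature but chosen here only for illustration.

import random

N = 2048  # total bits in the representation
W = 40    # active bits (~2% sparsity)

def random_sdr(rng):
    """A random set of W active bit positions out of N."""
    return set(rng.sample(range(N), W))

def overlap(a, b):
    """Similarity score: number of active bits the two SDRs share."""
    return len(a & b)

rng = random.Random(0)
a = random_sdr(rng)

# Corrupt a copy of `a` by moving 4 of its 40 active bits elsewhere.
noisy = set(a)
for bit in list(a)[:4]:
    noisy.discard(bit)
    noisy.add(rng.randrange(N))

b = random_sdr(rng)  # an unrelated representation

print(overlap(a, noisy))  # high: the noisy copy still matches `a`
print(overlap(a, b))      # low: unrelated SDRs share almost no bits
```

With ~2% sparsity, two unrelated SDRs share almost no active bits, so even a noisy copy of a pattern remains easily distinguishable from everything else.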


Carrier SDN platforms such as the Nokia Network Services Platform enable new and exciting possibilities for wide area network optimization.
This video explains the application of the Self-tuned Adaptive Routing Algorithm developed by Nokia Bell Labs for centralized path computation in IP/MPLS networks.
For more information, please visit our IP/optical integration solution page https://networks.nokia.com/solutions/ip-optical-integration
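The description doesn't spell out the algorithm itself, and the sketch below is not Nokia's Self-tuned Adaptive Routing Algorithm; it only shows the baseline that centralized path computation builds on: shortest-path search over a graph of routers and link costs. The topology and costs here are made up.

```python
# Minimal centralized path computation over an IP/MPLS-style topology:
# a generic Dijkstra shortest-path search with hypothetical link costs.

import heapq

def shortest_path(graph, src, dst):
    """graph: {node: {neighbor: link_cost}} -> (total_cost, [path])."""
    pq = [(0, src, [src])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, {}).items():
            if nbr not in seen:
                heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
    return float("inf"), []  # dst unreachable

# Hypothetical four-router topology with IGP-style link costs.
topology = {
    "R1": {"R2": 10, "R3": 5},
    "R2": {"R1": 10, "R4": 10},
    "R3": {"R1": 5, "R4": 20},
    "R4": {"R2": 10, "R3": 20},
}

print(shortest_path(topology, "R1", "R4"))  # (20, ['R1', 'R2', 'R4'])
```

A central controller with full topology visibility can run this kind of computation for every demand at once, which is what lets SDN platforms optimize paths across the whole wide area network instead of hop by hop.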


► FREE COURSE - 5 Learning Mistakes Software Developers Make ◄
https://simpleprogrammer.com/learn-faster
SUBSCRIBE TO THIS CHANNEL: vid.io/xokz
Interview With Matt From Engineered Truth: https://www.youtube.com/watch?v=a-Nba_akPBk&list=PLjwWT1Xy3c4Vd9P7BekMCpqmA49uqAZpx&index=6
Learn Programming By Going To College: https://simpleprogrammer.com/2016/08/15/learning-programming-by-going-to-college/
Other Important Links:
SUPPORT THIS YOUTUBE CHANNEL: vid.io/xokw
Visit: http://simpleprogrammer.com/
Computer Science Vs. Software Development Degree
So, there is a lot of debate about the difference between computer science and a software development degree. A lot of people want to start a career in software development, but they don't know if they should aim for a computer science degree or a software development degree.
I often receive a lot of emails from people asking me about a computer science degree. They want to become software developers, but they don't know which path they should pursue. Is computer science a good degree for becoming a software developer?
What can a computer science degree offer you that a software development degree can't? What important things would you get with a computer science degree that you wouldn't get with a regular software development degree?
Watch this video and find out!
If you have a question, email me at john@simpleprogrammer.com
If you liked this video, share, like and, of course, subscribe!
Subscribe To My YouTube Channel: http://bit.ly/1zPTNLT
Visit Simple Programmer Website: http://simpleprogrammer.com/
Connect with me on social media:
Facebook: https://www.facebook.com/SimpleProgrammer
Twitter: https://twitter.com/jsonmez
Other Links:
Sign up for the Simple Programmer Newsletter: http://simpleprogrammer.com/email
Simple Programmer blog: http://simpleprogrammer.com/blog
Learn how to learn anything quickly: http://10stepstolearn.com
Boost your career now: http://devcareerboost.com


I am doing a bachelor of computer science and have not yet chosen a discipline to major in. My lecturer mentioned the disciplines and one grabbed my attention: Data Science. He explained how data science is the future of technology and gave some examples. Do you believe that data science is the future and can you please explain what it is in a little more detail?
-Abraham A.
Schedule a Skype Meeting with Eli: https://silicondiscourse.com
Podcasts of New Videos at SoundCloud: https://soundcloud.com/elithecomputerguy
To Ask Questions, Email: Question@EliTheComputerGuy.com
For Classes, Class Notes and Blog Posts:
http://www.EliTheComputerGuy.com
Visit the Main YouTube Channel at:
http://www.YouTube.com/EliTheComputerGuy
Follow us on Twitter at:
http://www.Twitter.com/EliComputerGuy
**********


Transform yourself and build your IT cloud career path

Want to evolve your career to a role in cloud technologies? Learn how to transition from on-premises IT to cloud computing, what technology skills you will need, as well as the soft skills required to succeed. Receive practical advice on mapping out your career journey, including industry input and success stories from IT folks just like you who have transitioned. We explore resources available to help you on this career journey, such as the IT Pro Career Center, IT Pro Cloud Essentials, community, and more. https://ignite.microsoft.com/


published: 08 Apr 2014


Computing Beyond Turing - Jeff Hawkins

Coaxing computers to perform basic acts of perception and robotics, let alone high-level thought, has been difficult. No existing computer can recognize pictures, understand language, or navigate through a cluttered room with anywhere near the facility of a child. Hawkins and his colleagues have developed a model of how the neocortex performs these and other tasks. The theory, called Hierarchical Temporal Memory, explains how the hierarchical structure of the neocortex builds a model of its world and uses this model for inference and prediction. To turn this theory into a useful technology, Hawkins has created a company called Numenta. In this talk, Hawkins will describe the theory, its biological basis, and a software platform created by Numenta that allows anyone to apply this theory to a...

Password Cracking - Computerphile

'Beast' cracks billions of passwords a second; Dr Mike Pound demonstrates why you should probably change your passwords... Please note, at one point during the video Mike suggests using SHA512. Please check whatever the recommended process is at the time you view the video.
How NOT to Store Passwords: https://youtu.be/8ZtInClXe1Q
Password Choice: https://youtu.be/3NjQ9b3pgIg
Deep Learning: https://youtu.be/l42lr8AlrHk
Cookie Stealing: https://youtu.be/T1QEs3mdJoc
http://www.facebook.com/computerphile
https://twitter.com/computer_phile
This video was filmed and edited by Sean Riley.
Computer Science at the University of Nottingham: http://bit.ly/nottscomputer
Computerphile is a sister project to Brady Haran's Numberphile. More at http://www.bradyharan.com
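To make the SHA512 caveat concrete, here is a minimal sketch (my own, not from the video) of the usual advice: a single fast hash is cheap for an attacker to guess against, so password storage instead combines a random salt with a deliberately slow key derivation function. The iteration count below is illustrative only; check current guidance before using any of this for real.

```python
# Contrast a fast hash with a salted, deliberately slow key derivation.
# Parameters are illustrative, not a security recommendation.

import hashlib
import os

password = b"correct horse battery staple"

# Fast hash: hardware like 'Beast' can try billions of these per second.
fast = hashlib.sha512(password).hexdigest()

# Slow, salted derivation: each guess costs many HMAC-SHA256 rounds,
# and the per-user random salt defeats precomputed (rainbow) tables.
salt = os.urandom(16)
slow = hashlib.pbkdf2_hmac("sha256", password, salt, 200_000)

# Store the salt and derived key, never the password or a bare hash.
print(len(fast))        # 128 hex characters (SHA-512 digests are 64 bytes)
print(slow.hex()[:16])  # first bytes of the derived key
```

The attacker's cost per guess scales with the iteration count, which is exactly the knob a bare SHA512 call doesn't have.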

Understanding how the brain works and building machines that work on the same principles is one of the greatest quests of our time. In this talk I will describe recent advances in neocortical theory, including why the brain uses sparse distributed representations and how the brain makes predictions from high-velocity sensory data streams.
I will demonstrate a product called Grok, which uses a detailed model of neocortical memory to act on machine-generated data, and show how developers can contribute to the development of intelligent machines via the NuPIC open source project (www.numenta.org).
I will end my talk with a discussion of the future of machine intelligence.
Jeff Hawkins is co-founder of Grok (formerly known as Numenta). Jeff is an entrepreneur, scientist, and author. He was a found...

Atlantis Computing: A Simpler Path to VDI Success

published: 10 Sep 2013

"Uncle" Bob Martin - "The Future of Programming"

How did our industry start, what paths did it take to get to where we are, and where is it going? What big problems did programmers encounter in the past? How were they solved? And how do those solutions impact our future? What mistakes have we made as a profession, and how are we going to correct them? In this talk, Uncle Bob describes the history of software, from its beginnings in 1948 up through the current day, and then beyond. By looking at our past trajectory, we try to plot out where our profession is headed, and what challenges we’ll face along the way.
Robert C. Martin (Uncle Bob) has been a programmer since 1970. He is the Master Craftsman at 8th Light, Inc., an acclaimed speaker at conferences worldwide, and the author of many books including: The Clean Coder, Clean Code, Agile...

EE380: Computer Systems Colloquium Seminar
Multiscale Dataflow Computing: Competitive Advantage at the Exascale Frontier
Speaker: Brian Boucher, Maxeler Technologies
Maxeler Multiscale Dataflow computing is at the leading edge of energy-efficient high performance computing, providing competitive advantage in industries from energy to finance to defense. Maxeler builds the computer around the problem to maximize performance density, eliminating the elaborate caching and decoding machinery occupying most silicon in a standard processor. This talk will explain the motivation behind dataflow computing to escape the end of frequency scaling in the push to exascale machines, introduce the Maxeler dataflow ecosystem including MaxJ code and DFE hardware, and demonstrate the application of data...

published: 28 Sep 2017

Ray Kurzweil - The Path to The Singularity

Ray Kurzweil is an American author, computer scientist, inventor and futurist. Aside from futurism, he is involved in fields such as optical character recognition (OCR), text-to-speech synthesis, speech recognition technology, and electronic keyboard instruments. He has written books on health, artificial intelligence (AI), transhumanism, the technological singularity, and futurism. Kurzweil is a public advocate for the futurist and transhumanist movements, and gives public talks to share his optimistic outlook on life extension technologies and the future of nanotechnology, robotics, and biotechnology.
Recorded: 2016




Computing Beyond Turing - Jeff Hawkins

Coaxing computers to perform basic acts of perception and robotics, let alone high-level thought, has been difficult. No existing computer can recognize pictures, understand language, or navigate through a cluttered room with anywhere near the facility of a child. Hawkins and his colleagues have developed a model of how the neocortex performs these and other tasks. The theory, called Hierarchical Temporal Memory, explains how the hierarchical structure of the neocortex builds a model of its world and uses this model for inference and prediction. To turn this theory into a useful technology, Hawkins has created a company called Numenta. In this talk, Hawkins will describe the theory, its biological basis, and a software platform created by Numenta that allows anyone to apply this theory to a variety of problems. Part of this theory was described in Hawkins' 2004 book, "On Intelligence".
This talk is by the Chairman of the Redwood Neuroscience Institute and co-founder of Palm Computing and Handspring, and is co-sponsored by Calit2 at UCSD, the Jacobs School's Computer Science and Engineering (CSE) department, and the Institute for Neural Computation (INC).



2016 Isaac Asimov Memorial Debate: Is the Universe a Simulation?

What may have started as a science fiction speculation—that perhaps the universe as we know it is a computer simulation—has become a serious line of theoretical and experimental investigation among physicists, astrophysicists, and philosophers.
Neil deGrasse Tyson, the Frederick P. Rose Director of the Hayden Planetarium, hosts and moderates a panel of experts in a lively discussion about the merits and shortcomings of this provocative and revolutionary idea. The 17th annual Isaac Asimov Memorial Debate took place at the American Museum of Natural History on April 5, 2016.
2016 Asimov Panelists:
David Chalmers, Professor of philosophy, New York University
Zohreh Davoudi, Theoretical physicist, Massachusetts Institute of Technology
James Gates, Theoretical physicist, University of Maryland
Lisa Randall, Theoretical physicist, Harvard University
Max Tegmark, Cosmologist, Massachusetts Institute of Technology
The late Dr. Isaac Asimov, one of the most prolific and influential authors of our time, was a dear friend and supporter of the American Museum of Natural History. In his memory, the Hayden Planetarium is honored to host the annual Isaac Asimov Memorial Debate — generously endowed by relatives, friends, and admirers of Isaac Asimov and his work — bringing the finest minds in the world to the Museum each year to debate pressing questions on the frontier of scientific discovery. Proceeds from ticket sales of the Isaac Asimov Memorial Debates benefit the scientific and educational programs of the Hayden Planetarium.
2017 Isaac Asimov Memorial Debate: De-Extinction
https://www.youtube.com/watch?v=_LnAtMeSVeY&index=1&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ
2016 Isaac Asimov Memorial Debate: Is the Universe a Simulation?
https://www.youtube.com/watch?v=wgSZA3NPpBs&index=2&t=25s&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ
2015 Isaac Asimov Memorial Debate: Water, Water
https://www.youtube.com/watch?v=FSF79uS3t04&index=3&t=25s&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ
2014 Isaac Asimov Memorial Debate: Selling Space
https://www.youtube.com/watch?v=GbmFeEIKBFI&index=4&t=25s&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ
2013 Isaac Asimov Memorial Debate: The Existence of Nothing
https://www.youtube.com/watch?v=1OLz6uUuMp8&index=5&t=25s&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ
2012 Isaac Asimov Memorial Debate: Faster Than the Speed of Light
https://www.youtube.com/watch?v=5qlLW60wOjo&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ&index=6
2011 Isaac Asimov Memorial Debate: The Theory of Everything
https://www.youtube.com/watch?v=Eb8_3BUHcuw&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ&index=7
Rose Center Anniversary Isaac Asimov Debate: Is Earth Unique?
https://www.youtube.com/watch?v=9Ji_GdAk9vU&index=8&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ&t=25s
***
Subscribe to our channel:
http://www.youtube.com/subscription_c...
Check out our full video catalog:
http://www.youtube.com/user/AMNHorg
Facebook: http://fb.com/naturalhistory
Twitter: http://twitter.com/amnh
Tumblr: http://amnhnyc.tumblr.com/
Instagram: http://instagram.com/amnh


Understanding how the brain works and building machines that work on the same principles is one of the greatest quests of our time. In this talk I will describe recent advances in neocortical theory, including why the brain uses sparse distributed representations and how the brain makes predictions from high velocity sensory data streams.
I will demonstrate a product called Grok, which uses a detailed model of neocortical memory to act on machine-generated data, and show how developers can contribute to the development of intelligent machines via the NuPIC open source project (www.numenta.org).
I will end my talk with a discussion of the future of machine intelligence.
Jeff Hawkins is co-founder of Grok (formerly known as Numenta). Jeff is an entrepreneur, scientist, and author. He was a founder of two mobile computing companies, Palm and Handspring, and was the architect of many computing products such as the PalmPilot and Treo smartphone. Throughout his life Jeff has also been active in neuroscience and modeling the neocortex. In 2002 he founded the Redwood Neuroscience Institute, a scientific institute focused on understanding how the neocortex processes information. The institute is now located at U.C. Berkeley. In 2004 he wrote the book On Intelligence, which describes progress on understanding the neocortex. In 2005 he co-founded Numenta (now Grok), a start-up company building a technology based on neocortical theory. It is his hope that Grok will play a catalytic role in the emerging field of machine intelligence.
Jeff Hawkins earned his B.S. in electrical engineering from Cornell University in 1979. He was elected to the National Academy of Engineering in 2003
For more on YOW! Conference, visit http://www.yowconference.com.au


"Uncle" Bob Martin - "The Future of Programming"

How did our industry start, what paths did it take to get to where we are, and where is it going? What big problems did programmers encounter in the past? How were they solved? And how do those solutions impact our future? What mistakes have we made as a profession, and how are we going to correct them? In this talk, Uncle Bob describes the history of software, from its beginnings in 1948 up through the current day, and then beyond. By looking at our past trajectory, we try to plot out where our profession is headed, and what challenges we'll face along the way.
Robert C. Martin (Uncle Bob) has been a programmer since 1970. He is the Master Craftsman at 8th Light, Inc., an acclaimed speaker at conferences worldwide, and the author of many books including The Clean Coder, Clean Code, Agile Software Development: Principles, Patterns, and Practices, and UML for Java Programmers.


EE380: Computer Systems Colloquium Seminar
Multiscale Dataflow Computing: Competitive Advantage at the Exascale Frontier
Speaker: Brian Boucher, Maxeler Technologies
Maxeler Multiscale Dataflow computing is at the leading edge of energy-efficient high performance computing, providing competitive advantage in industries from energy to finance to defense. Maxeler builds the computer around the problem to maximize performance density, eliminating the elaborate caching and decoding machinery occupying most silicon in a standard processor. This talk will explain the motivation behind dataflow computing to escape the end of frequency scaling in the push to exascale machines, introduce the Maxeler dataflow ecosystem including MaxJ code and DFE hardware, and demonstrate the application of dataflow principles to a specific HPC software package (Quantum ESPRESSO).
About the Speaker:
Brian Boucher holds a B.S. in Mathematics and Chemistry from Jacksonville University and an M.S. in Mathematics from the University of Florida, and is a credentialed Fellow of the Society of Actuaries. Brian left a career in finance last year to join Maxeler as a dataflow architect, where his current projects include real-time risk and high-performance computing for PRACE (the Partnership for Advanced Computing in Europe).
For more information about this seminar and its speaker, you can visit http://ee380.stanford.edu/Abstracts/170927.html
Support for the Stanford Colloquium on Computer Systems Seminar Series provided by the Stanford Computer Forum.
Colloquium on Computer Systems Seminar Series (EE380) presents the current research in design, implementation, analysis, and use of computer systems. Topics range from integrated circuits to operating systems and programming languages. It is free and open to the public, with new lectures each week.
Learn more: http://bit.ly/WinYX5


Ray Kurzweil - The Path to The Singularity

Ray Kurzweil is an American author, computer scientist, inventor and futurist. Aside from futurism, he is involved in fields such as optical character recognition (OCR), text-to-speech synthesis, speech recognition technology, and electronic keyboard instruments. He has written books on health, artificial intelligence (AI), transhumanism, the technological singularity, and futurism. Kurzweil is a public advocate for the futurist and transhumanist movements, and gives public talks to share his optimistic outlook on life extension technologies and the future of nanotechnology, robotics, and biotechnology.
Recorded: 2016


Computer Science Careers and Subfields

This video will cover computer science careers, as well as the different areas within computer science you can dive into. Computer science is a broad and diverse field that includes software development, cryptography, cyber security, computer graphics, bioinformatics, and much more.
Although many people imagine a software developer when it comes to computer science, there are many more job titles you can hold, including security analyst, web developer, network systems administrator, and more.
Many of the jobs listed in this video rank among the highest in employment and projected growth of the occupations tracked by the Bureau of Labor Statistics. This major and career path isn't necessarily for everyone, but it definitely is a good choice.
Lastly, for many of the careers discussed in this video, you don't HAVE to have a degree in computer science to get into them. For some career paths you can have a degree in computer engineering, software engineering, information systems, and more. There are even people who are self-taught and have landed software developer jobs. Be sure to keep an open mind, but hopefully this provides some insight.
Artificial Intelligence Video (Part 1): https://www.youtube.com/watch?v=JRHAM1nAuD4
***************************************************
► For more information on math, science, and engineering majors, check us out at https://majorprep.com
***************************************************
► Patreon: https://patreon.com/majorprep
► PayPal: https://www.paypal.me/majorprep
► Facebook: https://facebook.com/majorprep

Forging a path into Computing

We spoke to Marwa Salayma about her journey from Palestine to studying for a PhD at Edinburgh Napier.
Encouraged by her parents to pursue a career in computing, she discusses the challenges and encouragement she has encountered along the way and explains why it isn't just for men.
Find out more on our website:
www.napier.ac.uk/computing

Map of Computer Science

The field of computer science summarised. Learn more at this video's sponsor https://brilliant.org/dos
Computer science is the subject that studies what computers can do and investigates the best ways you can solve the problems of the world with them. It is a huge field overlapping pure mathematics, engineering and many other scientific disciplines. In this video I summarise as much of the subject as I can and show how the areas are related to each other.
A couple of notes on this video:
1. Some people have commented that I should have included computer security alongside hacking, and I completely agree, that was an oversight on my part. Apologies to all the computer security professionals, and thanks for all the hard work!
2. I also failed to mention interpreters alongside compilers in the compiler section. Again, I'm kicking myself because of course this is an important concept for people to hear about. Also, the layers of languages being compiled to other languages is overly convoluted; in practice it is simpler than this. I guess I should have picked one simple example.
3. NP-complete problems are possible to solve, they just become very difficult to solve very quickly as they get bigger. When I said NP-complete and then "impossible to solve", I meant that the large NP-complete problems that industry is interested in solving were thought to be practically impossible to solve.
You can buy this poster here: https://www.redbubble.com/people/dominicwalliman/works/27929629-map-of-computer-science?p=poster&finish=semi_gloss&size=small
Get all my other posters here: https://www.redbubble.com/people/dominicwalliman
And free downloadable versions of this and the other posters here. If you want to print them out for educational purposes please do! https://www.flickr.com/photos/95869671@N08/
Thanks so much to my supporters on Patreon. If you enjoy my videos and would like to help me make more this is the best way and I appreciate it very much. https://www.patreon.com/domainofscience
I also write a series of children's science books called Professor Astro Cat. These links are to the publisher, but they are available in all good bookshops around the world in 18 languages and counting:
Frontiers of Space (age 7+): http://nobrow.net/shop/professor-astro-cats-frontiers-of-space/
Atomic Adventure (age 7+): http://nobrow.net/shop/professor-astro-cats-atomic-adventure/
Intergalactic Activity Book (age 7+): http://nobrow.net/shop/professor-astro-cats-intergalactic-activity-book/
Solar System Book (age 3+, available in UK now, and rest of world in spring 2018): http://nobrow.net/shop/professor-astro-cats-solar-system/?
Solar System App: http://www.minilabstudios.com/apps/professor-astro-cats-solar-system/
And the new Professor Astro Cat App: https://itunes.apple.com/us/app/galactic-genius-with-astro-cat/id1212841840?mt=8
Find me on twitter, Instagram, and my website:
http://dominicwalliman.com
https://twitter.com/DominicWalliman
https://www.instagram.com/dominicwalliman
https://www.facebook.com/dominicwalliman

The Archive (Centre For Computing History) - Computerphile

A rare chance to look at the archives behind the Centre for Computing History (this is pro...

Computing Light Paths (ft. Wenzel Jakob)

Surprisingly enough, one of the most challenging parts of computer graphics is simulating light transport. In particular, mathematical tricks are required to allow for fast and realistic imaging. The video features Wenzel Jakob, Assistant Professor of the IC School at EPFL.
https://people.epfl.ch/wenzel.jakob

How did the Apollo flight computers get men to the moon and back ?

There is much speculation as to how the flight computers aboard the Apollo missions managed to get men to the moon when they had just a tiny fraction of the computing power of something like a modern smartphone.
Patreon : https://www.patreon.com/curiousdroid
You can now translate this and other curious droid videos, see my video about it here https://www.youtube.com/watch?v=xLPVgIytKyg
But this is quite misleading, as there was not one solitary computer controlling the Apollo craft: there were four computers, and no fancy touch screens, GUIs or other features of a typical computer of today to waste resources on.
The first of the four computers was the Saturn Launch Vehicle Digital Computer, or LVDC; this got the rocket from the launch pad to Earth orbit.
Then there was the Apollo Guidance Computer, or AGC; this is the one most people think of. There were, in fact, two of them: one in the Command Module to get from Earth orbit to the moon and back again, and a second in the Lunar Module to control the landing and then the ascent back to the Command Module and docking.
The fourth computer was one which was never used on any mission, because it would control an emergency abort and ascent should something happen during the descent to the moon's surface, such as the landing computer failing or the crew running out of fuel.
The Apollo Guidance Computer wasn't as dumb as many make it out to be. As time went by in Apollo's development, the tasks it was meant to do increased in both number and sophistication, and this in turn created ever more issues with the limited resources available.
One of the biggest problems was the limited amount of memory due to the technological limitations of the time; this meant that the programmers had to make use of every single byte available.
The AGC also had a unique operating system. Systems like UNIX, Linux, Windows and Apple iOS are in control and share time out to the programs. In the AGC, the programs controlled how much time they got depending on how important they were, so that in the case of an emergency the highest-priority programs would get most of the time and non-essential operations were dropped to free up resources. This became the basis of mission-critical systems for all manned missions afterwards.
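The priority-driven executive described above can be sketched in miniature (a toy illustration in Python, not actual AGC code; the class, priorities and job names here are invented):

```python
import heapq

# Toy sketch of a priority-driven executive, loosely inspired by the
# AGC's design: jobs carry priorities, and when the job table is full,
# the lowest-priority work is shed so critical tasks keep running.

class Executive:
    def __init__(self, max_jobs):
        self.max_jobs = max_jobs
        self.jobs = []  # min-heap of (-priority, name): highest priority pops first

    def schedule(self, priority, name):
        """Queue a job; on overload, shed the lowest-priority job."""
        heapq.heappush(self.jobs, (-priority, name))
        if len(self.jobs) > self.max_jobs:
            lowest = max(self.jobs)   # largest negated value = lowest priority
            self.jobs.remove(lowest)
            heapq.heapify(self.jobs)
            return lowest[1]          # name of the shed job
        return None

    def run_next(self):
        """Run (pop) the highest-priority pending job."""
        return heapq.heappop(self.jobs)[1] if self.jobs else None

executive = Executive(max_jobs=2)
executive.schedule(30, "landing guidance")
executive.schedule(20, "display update")
shed = executive.schedule(10, "telemetry")  # overload: lowest priority shed
print(shed)                  # telemetry
print(executive.run_next())  # landing guidance
```

This kind of load-shedding is what famously let the real AGC ride out the 1202 program alarms during the Apollo 11 landing: low-priority jobs were dropped while guidance kept running.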
The computer had performance somewhere around that of the first generation of personal computers, like the Apple II, Commodore 64 and ZX Spectrum, which would arrive 10 years later in the late '70s.
It had 2K of RAM and 36K of fixed-storage magnetic rope core memory, which was woven by hand and took months to assemble, so any software bugs were literally woven into the system.
A comparison between the Apollo Guidance Computer and say an iPhone 6 is tricky because the AGC was not a general purpose computer. It was built for a very specific task, had a unique operating system and with the 48-year gap in the technologies used, we can only really get very rough estimates.
The Apple iPhone 6 uses the ARM A8 processor, which has about 1.6 billion transistors in it; the AGC had just 12,300. The iPhone 6 has 1 GB of RAM, about 488,000 times the AGC, and in this one, 128 GB of non-volatile storage, or about 3.5 million times the AGC.
As for performance, the iPhone 6 is somewhere between maybe 4 and 30 Million times faster than the AGC depending on what type of calculations are being done and if you include the iPhone’s GPU it would be even more.
So, if you had to fly back to the moon in an Apollo craft and were given the choice, would you trust your life to a couple of iPhones in place of the AGCs? Because you would actually have more computing power in just one of them than the whole of NASA had during all the Apollo missions.
Title: Adam Are You Free?
Author: P C III
Source: www.pipechoir.com
Nightingale sounds from Gerry Gutteridge flic.kr/ps/Mk2zU
License: Creative Commons Attribution 4.0

Feynman's Infinite Quantum Paths | Space Time

How to predict the path of a quantum particle. Part 3 in our Quantum Field Theory Series. ...

Accelerated Computing: The Path Forward

In this video from SC17, Ian Buck from NVIDIA presents: Accelerated Computing: The Path Forward.
Ian Buck is vice president of NVIDIA's Accelerated Computing business unit, which includes all hardware and software product lines, third-party enablement, and marketing activities for GPU computing. Buck joined NVIDIA in 2004 and created CUDA, which remains the established leading platform for accelerator-based parallel computing. Before joining NVIDIA, he was the development lead on Brook, the forerunner to generalized computing on GPUs. Buck holds a Ph.D. in computer science from Stanford University and a Bachelor of Science in computer science from Princeton University.
Learn more: http://nvidia.com
Sign up for our insideHPC Newsletter: http://insidehpc.com/newsletter

History of Computing - The Path to Modern Computers

Google IT Support Professional Certificate
https://www.coursera.org/specializations/google-it-support
This six-course certificate, developed exclusively by Google, includes innovative curriculum designed to prepare you for an entry-level role in IT support. A job in IT can mean in-person or remote help desk work, either in a small business or at a global company, like Google. Whether you’ve been tinkering with IT or are completely new to the field, you’ve come to the right place.
If you’re looking for a job, upon completion of the certificate, you can share your information with top employers, like Bank of America, Walmart, Sprint, GE Digital, PNC Bank, Infosys, TEKsystems, UPMC, and, of course, Google.
Through a dynamic mix of video lectures, quizzes, and hands-on labs and widgets, this program will introduce you to troubleshooting and customer service, networking, operating systems, system administration, automation, and security.
Along the way, you’ll hear from Googlers with unique backgrounds and perspectives, whose own foundation in IT support served as a jumping off point for their careers. They’re excited to go on this journey with you, as you invest in your future by achieving an end-of-program certificate.
Course 1 - IT Technical Support Fundamentals
In this course, you’ll be introduced to the world of Information Technology, or IT. This course is the first of a series that aims to prepare you for a role as an entry-level IT Support Specialist. You’ll learn about the different facets of Information Technology, like computer hardware, the Internet, computer software, and job-related skills. You’ll also learn about the history of computers, and the pioneers who shaped the world of computing that we know today. This course covers a wide variety of topics in IT that are designed to give you an overview of what’s to come in this IT Support Professional Certificate. By the end of this course, you’ll be able to: - understand how the binary system works. - assemble a computer from scratch. - choose and install an operating system on a computer. - understand what the Internet is, how it works, and the impact it has in the modern world. - learn how applications are created and how they work under the hood of our computer. - utilize common problem-solving methodologies and soft skills in an Information Technology setting.
Who is this class for: This program is intended for beginners who are interested in developing the skills necessary to perform entry-level IT support. No pre-requisite knowledge is required. However, if you do have some familiarity with IT, you can skip through

Career Opportunities in Cloud Computing

This course describes six different positions in the cloud computing industry, all of which are in high demand as established companies move to cloud technologies, companies expand IT departments, and new companies form. There’s real opportunity for fulfilling, well-paid work. Overall it’s a great time to be in the cloud computing industry (which is rapidly becoming the IT industry).
To enjoy the entire learning path: https://goo.gl/4VajyB
You can also have access to the entire CloudAcademy content library with a 7-day free trial at https://goo.gl/8lu43D.

Transform yourself and build your IT cloud career path

Want to evolve your career to a role in cloud technologies? Learn how to transition from on-premises IT to cloud computing, what technology skills you will need, as well as the soft skills required to succeed. Receive practical advice on mapping out your career journey, including industry input and success stories from IT folks just like you who have transitioned. We explore resources available to help you on this career journey, such as the IT Pro Career Center, IT Pro Cloud Essentials, community and more. https://ignite.microsoft.com/

This presentation was recorded at GOTO Aarhus 2013: http://gotocon.com
Jeff Hawkins - Brain-Inspired Computing
ABSTRACT
Understanding how the brain works and building machines that work on the same principles is one of the greatest quests of our time. In this talk I will describe recent advances in neocortical theory, including why the brain uses sparse distributed representations and how the brain makes predictions from high velocity sensory data streams.
I will demonstrate a product called Grok, which uses a detailed model of neocortical memory to act on machine-generated data, and show how developers can contribute to the development of intelligent machines via the NuPIC open source project (www.numenta.org).
https://twitter.com/gotocon
https://www.facebook.com/GOTOConference
http://gotocon.com


Computing Beyond Turing - Jeff Hawkins

Coaxing computers to perform basic acts of perception and robotics, let alone high-level thought, has been difficult. No existing computer can recognize pictures, understand language, or navigate through a cluttered room with anywhere near the facility of a child. Hawkins and his colleagues have developed a model of how the neocortex performs these and other tasks. The theory, called Hierarchical Temporal Memory, explains how the hierarchical structure of the neocortex builds a model of its world and uses this model for inference and prediction. To turn this theory into a useful technology, Hawkins has created a company called Numenta. In this talk, Hawkins will describe the theory, its biological basis, and a software platform created by Numenta that allows anyone to apply this theory to a variety of problems. Part of this theory was described in Hawkins' 2004 book, "On Intelligence".
This talk is by the Chairman of the Redwood Neuroscience Institute and co-founder of Palm Computing and Handspring, and is co-sponsored by Calit2 at UCSD, the Jacobs School's Computer Science and Engineering (CSE) department, and the Institute for Neural Computation (INC).

32:01

The Archive (Centre For Computing History) - Computerphile

A rare chance to look at the archives behind the Centre for Computing History (this is pro...

Password Cracking - Computerphile

'Beast' cracks billions of passwords a second; Dr Mike Pound demonstrates why you should probably change your passwords... Please note, at one point during the video Mike suggests using SHA512. Please check whatever the recommended process is at the time you view the video.
How NOT to Store Passwords: https://youtu.be/8ZtInClXe1Q
Password Choice: https://youtu.be/3NjQ9b3pgIg
Deep Learning: https://youtu.be/l42lr8AlrHk
Cookie Stealing: https://youtu.be/T1QEs3mdJoc
http://www.facebook.com/computerphile
https://twitter.com/computer_phile
This video was filmed and edited by Sean Riley.
Computer Science at the University of Nottingham: http://bit.ly/nottscomputer
Computerphile is a sister project to Brady Haran's Numberphile. More at http://www.bradyharan.com
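The description's caveat about SHA512 comes down to speed: a fast general-purpose hash lets an attacker who steals the database test billions of guesses per second, while a slow, salted key-derivation function makes every guess deliberately expensive. Here is a minimal sketch of the contrast using Python's standard library; the password string and the iteration count are illustrative assumptions, so check current guidance before deploying anything.

```python
# Fast hash vs. slow salted KDF for password storage.
import hashlib
import os

password = b"correct horse battery staple"  # example password, not a recommendation

# Fast: a single SHA-512 call -- just as cheap for the attacker.
fast = hashlib.sha512(password).hexdigest()

# Slow: salted PBKDF2-HMAC-SHA256 with many iterations, so each
# guess costs the attacker the same large amount of work.
salt = os.urandom(16)                       # unique random salt per password
slow = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)

print(len(fast))  # 128 hex characters of SHA-512 digest
print(len(slow))  # 32 bytes of derived key (SHA-256 digest size)
```

The salt additionally defeats precomputed rainbow tables, because identical passwords no longer produce identical stored values.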

2016 Isaac Asimov Memorial Debate: Is the Universe a Simulation?

What may have started as a science fiction speculation—that perhaps the universe as we know it is a computer simulation—has become a serious line of theoretical and experimental investigation among physicists, astrophysicists, and philosophers.
Neil deGrasse Tyson, Frederick P. Rose Director of the Hayden Planetarium, hosts and moderates a panel of experts in a lively discussion about the merits and shortcomings of this provocative and revolutionary idea. The 17th annual Isaac Asimov Memorial Debate took place at The American Museum of Natural History on April 5, 2016.
2016 Asimov Panelists:
David Chalmers, Professor of philosophy, New York University
Zohreh Davoudi, Theoretical physicist, Massachusetts Institute of Technology
James Gates, Theoretical physicist, University of Maryland
Lisa Randall, Theoretical physicist, Harvard University
Max Tegmark, Cosmologist, Massachusetts Institute of Technology
The late Dr. Isaac Asimov, one of the most prolific and influential authors of our time, was a dear friend and supporter of the American Museum of Natural History. In his memory, the Hayden Planetarium is honored to host the annual Isaac Asimov Memorial Debate — generously endowed by relatives, friends, and admirers of Isaac Asimov and his work — bringing the finest minds in the world to the Museum each year to debate pressing questions on the frontier of scientific discovery. Proceeds from ticket sales of the Isaac Asimov Memorial Debates benefit the scientific and educational programs of the Hayden Planetarium.
2017 Isaac Asimov Memorial Debate: De-Extinction
https://www.youtube.com/watch?v=_LnAtMeSVeY&index=1&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ
2016 Isaac Asimov Memorial Debate: Is the Universe a Simulation?
https://www.youtube.com/watch?v=wgSZA3NPpBs&index=2&t=25s&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ
2015 Isaac Asimov Memorial Debate: Water, Water
https://www.youtube.com/watch?v=FSF79uS3t04&index=3&t=25s&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ
2014 Isaac Asimov Memorial Debate: Selling Space
https://www.youtube.com/watch?v=GbmFeEIKBFI&index=4&t=25s&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ
2013 Isaac Asimov Memorial Debate: The Existence of Nothing
https://www.youtube.com/watch?v=1OLz6uUuMp8&index=5&t=25s&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ
2012 Isaac Asimov Memorial Debate: Faster Than the Speed of Light
https://www.youtube.com/watch?v=5qlLW60wOjo&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ&index=6
2011 Isaac Asimov Memorial Debate: The Theory of Everything
https://www.youtube.com/watch?v=Eb8_3BUHcuw&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ&index=7
Rose Center Anniversary Isaac Asimov Debate: Is Earth Unique?
https://www.youtube.com/watch?v=9Ji_GdAk9vU&index=8&list=PLrfcruGtplwGKzxDI_Ne06NlpOKt-yonZ&t=25s
***
Subscribe to our channel:
http://www.youtube.com/subscription_c...
Check out our full video catalog:
http://www.youtube.com/user/AMNHorg
Facebook: http://fb.com/naturalhistory
Twitter: http://twitter.com/amnh
Tumblr: http://amnhnyc.tumblr.com/
Instagram: http://instagram.com/amnh

Stanford Seminar - Multiscale Dataflow Computing: ...

Ray Kurzweil - The Path to The Singularity...

Gizmodo reported on Wednesday that a former Google engineer is suing the company for discrimination, harassment, retaliation, and wrongful termination ... Chevalier's posts had been quoted in Damore's lawsuit against Google; Damore is also suing the company for alleged discrimination against conservative white men ... "Firing the employee who pushed back against the bullies was exactly the wrong step to take." ... But the effect is the same....

OSLO. Sea levels will rise between 0.7 and 1.2 metres in the next two centuries even if governments end the fossil fuel era as promised under the Paris climate agreement, scientists said on Tuesday ... Ocean levels will rise inexorably because heat-trapping industrial gases already emitted will linger in the atmosphere, melting more ice, the study said. In addition, water naturally expands as it warms above four degrees Celsius (39.2F) ... ....

The woman tasked with caring for accused Florida shooter Nikolas Cruz and his brother moved quickly to file court papers seeking control of their inheritance the day after the massacre at Marjory Stoneman Douglas High School, Newsweek reported. When the mother of Nikolas and Zachary Cruz died from flu-related pneumonia last November, their lives were entrusted to Roxanne Deschamps, the report said....

Special Counsel Robert Mueller's probe is prepared to accept a guilty plea from the London-based son-in-law of a Russian businessman after he made false statements during the investigation into alleged Russian interference in the 2016 U.S. presidential election, according to the Washington Post ... Tymoshenko was later imprisoned by former president Viktor Yanukovych after signing a controversial deal with Russia for natural gas ... ....

Article by WN.com correspondent Dallas Darling. To this day it's something my aunt hardly mentions, let alone discusses. And like a few other families living in the United States, it's taboo and completely off limits ... Neither was it as widespread, since Japan had nearly conquered most of East Asia, including parts of China. But still, U.S. authorities continued the comfort station system absent formal slavery ... ....


Shahbaz Ansari appeared for his exam on a stretcher laid out at the exam centre ... "And, education will help me find that path. I plan to pursue a computer course that will allow me to work from home and help my mother live a better life," said Shahbaz, who has only ever been schooled at home with the help of tuition teachers ... Thane ... The NGO has also started looking for computer courses that will help Shahbaz earn a livelihood from home ... ....

Jake Easter, who often rides past the UNC School of the Arts on his biking commute, said he often hits downhill speeds of around 35 miles an hour, but worries that someone getting out of a parked car could open a door into his path ... "I enjoy having an input into the process," she said, adding that increasing the connectivity of biking paths is important as well....

These, it seems, need 'deep neural processing', 'convolutional neural processing' and 'computing in memory' ... And computing in memory means storing data in RAM rather than disc. And what is a neural network? It can be almost anything you like which computes ... The excitement is that neural nets promise the 'ultimate' computer that can think, see, hear, smell, taste and feel....

Colorado Department of Transportation employees resorted to pen and paper on Wednesday after nasty ransomware hijacked computer files and demanded payment in bitcoin for their safe return. Security officials didn’t flinch and shut down more than 2,000 employee computers while they investigated the attack ... Only employee computers — running Windows and equipped with McAfee security software — were impacted....

Scientists have been looking for ways to make computers act as neuromorphic or brain-like for years ... Hersam explained that computers have separate processing and memory storage units, unlike the brain which ......

The next computer-generated animals in King Kong or The Lion King could look a lot more realistic thanks to a breakthrough by computer scientists. The researchers developed a method that dramatically improves the way computers simulate fur, and more specifically, the way light bounces within an animal's pelt ... ....

In the blockchain world, Ethereum, NEO, EOS and other blockchain-based computing platforms exploit land to run applications ... The operating system, as the interface between the user and the computer, manages the scheduling, operation, and processing of the computer's hardware resources ... Bitcoinist. ....

elevators, lights, computers, refrigerators ... The CEA study provides many examples of computer breaches ... In 2014, hackers penetrated Home Depot's computers through the network of a supplier, compromising more than 50 million accounts ... Just how much of a burden computer crime now imposes on the United States is hard to know....

The cloud provider launched the Serverless Application Repository out of beta, which provides customers with an app store to try out different applications built using Lambda, its event-driven computing service ... It allows developers to write short computing functions that run in response to trigger events, without managing the underlying computing infrastructure used to run those functions ... ....
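The event-driven model described above reduces, on the developer's side, to writing a function that the platform invokes when a trigger fires. In AWS Lambda's Python runtime, that function takes an event and a context argument. The handler below is a minimal sketch; the event payload shape and the field names are made-up examples, not a real AWS event format.

```python
# Sketch of a Lambda-style event handler: the platform, not the developer,
# provisions the compute and calls this function per trigger event.
import json

def handler(event, context):
    # Pull a value out of the trigger event; no servers to manage.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Locally simulating what the platform would do when a trigger fires:
response = handler({"name": "Lambda"}, None)
print(response["body"])
```

Because the function holds no state between invocations, the platform can run as many copies in parallel as there are pending events, which is what makes the billing and scaling model work.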

REXBURG—Doug Ricks announced on Thursday, Feb ... He later graduated from Ricks College and opened Rexburg's first computer store, according to Ricks' campaign website ... "I taught myself everything I needed to know about computers to better serve my customers." Ricks is currently a tech administrator at Brigham Young University-Idaho, where he manages the computers in the university's computer labs ... ....

ordinary cameras, open-source software and low-cost computer chips ... That makes them vulnerable to manipulation; today’s computer vision algorithms, for example, can be fooled into seeing things that are not there ... The rapid evolution of AI is creating new security holes. If a computer-vision system can be fooled into seeing things that are not there, for example, miscreants can circumvent security cameras or compromise a driverless car....
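The "fooled into seeing things that are not there" attack has a simple toy version: for a linear model, nudging every input feature a small step against the sign of its weight drives the score down and flips the prediction. This is the core idea behind gradient-sign attacks on vision systems; the weights, inputs, and step size below are illustrative assumptions, not a real deployed model.

```python
# Toy adversarial perturbation against a linear classifier.
w = [1.0, -2.0, 0.5]          # model weights: score = w . x
x = [0.3, -0.2, 0.4]          # input that scores positive (correct class)

def score(features):
    """Linear classifier score; positive -> class A, negative -> class B."""
    return sum(wi * xi for wi, xi in zip(w, features))

eps = 0.5                     # perturbation budget per feature
# Step each feature against the sign of its weight to push the score down.
x_adv = [xi - eps * (1 if wi > 0 else -1) for wi, xi in zip(w, x)]

print(score(x))      # positive -> classified correctly
print(score(x_adv))  # negative -> classification flipped by a small nudge
```

In a deep network the same trick uses the sign of the loss gradient with respect to the input, which is why a perturbation invisible to humans can make a camera-based system misclassify what it sees.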