
Month: March 2014

I’ve just finished testing my Foundation neural network function with OpenCL C and can confirm that it works! As you can see above, it runs through its selection process in batches of 7,000 individuals each (due to scalability constraints), selects the best from each batch, and then holds a “battle royale” of the top performers.

The results you see are based on my training data of the XOR table (i.e., input 0,0 = 0; 0,1 = 1; 1,0 = 1; 1,1 = 0), and the result for 0,0 is displayed so it can be compared against its fitness data.
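For reference, the XOR table is tiny, which makes it a handy smoke test. Here is a minimal C++ sketch of how a candidate network’s fitness over that table might be scored before picking the best of a batch (the struct and function names are my own, not from the Foundation code):

```cpp
#include <array>
#include <cassert>

// XOR truth table: two inputs, one expected output.
struct Sample { std::array<float, 2> in; float out; };

const std::array<Sample, 4> kXorSet = {{
    {{0.f, 0.f}, 0.f},
    {{0.f, 1.f}, 1.f},
    {{1.f, 0.f}, 1.f},
    {{1.f, 1.f}, 0.f},
}};

// Fitness of one candidate: sum of squared errors over the XOR table
// (lower is better). `Net` is any callable taking the two inputs.
template <typename Net>
float fitness(const Net& net) {
    float err = 0.f;
    for (const Sample& s : kXorSet) {
        float d = net(s.in[0], s.in[1]) - s.out;
        err += d * d;
    }
    return err;
}
```

Selecting the batch winner is then just a matter of keeping the candidate with the lowest fitness value.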

I’m about halfway through converting the extremely obtuse and unwieldy code into C++ class-based structures so that I and others can use it. However, I’m not entirely sure I still want to make my first practice project trading in bitcoins after my exchange of choice (Mt. Gox) decided to fold last month, so if anyone has any good ideas, I’m all ears!

I’ve also been in an interview process for a really amazing job! There is a very real possibility that I might become a PLM (product lifecycle management) engineering consultant for a really neat tech startup that an internet friend of mine referred me to. As you can imagine, after looking for work for six months, it’s a breath of fresh air to finally find someone who values your skills, knowledge, and determination!

Let me know of any projects that could use NNs that you’d like me to talk about! As you can tell, I’m pretty enthusiastic about the topic.

Material selection can at times be very tricky: sometimes you want certain aspects of a material, such as high strength, but you don’t want the low ductility that generally trends along with it.
(For information on the techniques discussed in this article, please see the wiki links below.)

To solve this problem, materials scientists and engineers develop alloys and composite materials with particular characteristics well suited to a particular application. However, this process is very time-consuming and could take months, if not years, to completely map the strengths and weaknesses of a new alloy or composite design.

There are ways around this time sink. One of them is, instead of empirically measuring the resultant properties of a material after each change to its chemistry or micro-structure, to predict the resultant structure using non-destructive analysis methods and neural networks.

The first step with any neural network is to prepare the training data so the system can adapt and successfully understand the underlying natural laws that govern the kinetics. Input data would be generated from Electrochemical Impedance Spectroscopy (EIS) and/or Ultrasonic Testing (UT) of a sample training material, and the output data would come from slicing the specimen into very thin (~10 micron) plates, which would then be analysed with EDS to determine the real material parameters.

The data would then be fed into a neural network designed to handle 3D input parameters (i.e., it would need at least two hidden neuron layers), and a trained neural network would be generated. This network would then be able to take NDT data from a new material and generate a 3D finite element model of what it predicts the composition and micro-structure to be.
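To make the training-pair layout concrete, here is one hypothetical way the EIS/UT inputs and EDS outputs described above could be packaged for such a network. Every name and field here is illustrative, not drawn from any real dataset:

```cpp
#include <cstddef>
#include <vector>

// One hypothetical training pair for the materials network sketched above.
struct MaterialSample {
    std::vector<float> eisSpectrum;     // input: EIS impedance vs. frequency
    std::vector<float> utEchoes;        // input: UT echo amplitudes
    std::vector<float> edsComposition;  // output: EDS composition per
                                        // ~10-micron slice, flattened slice
                                        // by slice into one target vector
};

// The full input vector fed to the net is the two NDT spectra concatenated.
std::vector<float> makeInput(const MaterialSample& s) {
    std::vector<float> in = s.eisSpectrum;
    in.insert(in.end(), s.utEchoes.begin(), s.utEchoes.end());
    return in;
}
```

The output vector’s slice-by-slice layout is what gives the network its 3D character: each block of targets corresponds to one depth plane of the specimen.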

Imagine for a second that you had the power to simulate atoms and molecules and how they interact with each other to create a material. Sounds like it would make material selection pretty easy, right?

Unfortunately, the technology doesn’t yet exist to model so many individual elements, and FEMAP (Finite Element Modeling And Postprocessing) can only do so much; it still requires something to interpret the resultant data.

Now imagine if you were able to teach a computer to look for the “natural laws” that govern the mechanical/chemical/electronic properties of a material based on its alloy (or composite) content.

This idea brings us to machine learning and neural nets, which are capable of discovering patterns within current materials and using those results as “training data” to try to predict the material, chemical, and electronic properties of a yet-to-be-tested material.

It took us almost a millennium to completely comprehend steel-making; imagine if we were able to cut that time down to a few days via material simulations and predictive modelling!

It won’t take long before we’re able to build space elevators and room-temperature superconductors once this technology comes of age.

“You can be sure that when you have come up with a great idea, at least 10 other people in the world came up independently with the same great idea. The difference will be who is going to make it happen first”.

Building your start-up infrastructure can be a daunting task. Here is a list of proven, free, and easy to deploy tools that can help you save time and money.


There was an idea I had last night relating to neural nets that I thought was really cool. I was watching a documentary about global economics being “too complex” for any person to understand or manage, which immediately made me think of automated neural networks.

I believe our solution to these global financial meltdowns is sovereign wealth funds like those used in the Nordic economies; however, they aren’t perfect yet.

Imagine just for a second the Nordic economic nations (Denmark, Holland, etc.) that have sovereign wealth funds, and how they’re currently managed.
These funds have fund managers, who are people, and as people they’re fallible and corruptible. In my opinion, people should not be entrusted with such important infrastructure. The reason we as a global economy go through boom-bust cycles (and are affected by the fallout) is individual fallibility. My current neural network framework could quite possibly enable nations to trade their commodities and stocks, nationally and internationally, at very close to the real value of the commodity or stock, which would make speculation bubbles essentially impossible.

This economic policy change would slowly spread, nation by nation. Each nation that enabled such an infrastructure would be able to communicate directly with other national smart agents, and the process would continue due to the obvious and extreme interest in global economic stability.

At this point I hope you can imagine a world where speculator, trader, and hedge fund owner would very rapidly become obsolete professions. Obviously, when you hit the current power structure in their bank accounts, it’ll start a war, and I’m not quite sure how that would be avoidable.

I would love to hear your opinions on the matter. We’re rapidly approaching a time when this will not only be feasible (which it is today) but nearly mandatory, as the stability it would give our global economy would be a much-needed benchmark.

What is the definition of machine learning? The Wikipedia definition of the term is as follows:

machine learning

Machine learning, a branch of artificial intelligence, concerns the construction and study of systems that can learn from data.

What does that mean? How does it work? This article is going to go over both of these fundamental questions and talk specifically about Artificial Neural Networks (ANNs) and why they are such a powerful tool in the hands of the savvy developer.

Imagine you have a “smart key” lock on your door. These locking mechanisms have a special feature that allows them to reset the tumblers and form-fit to the teeth of a new key. This new key is now the only key that will open the lock: the lock “knows” what key is valid and what isn’t, because it was taught to acknowledge the current key as the only valid one, and a key with even a slight variation in its teeth would be considered invalid.

What I just described is a layman’s view of what a neural network does and how machine learning works. A basic Artificial Neural Network functions by taking a set of input parameters (they could be any variables that could even remotely impact the output value), putting them through a series of plastic (malleable) coefficients, and plugging the result into an activation function, which spits out what it thinks the output value should be.

Y = A( 1·C1 · A( 1·C2 + I1·C3 + I2·C4 + … + In·C(n+2) ) )

(Here A is the activation function, the Ii are the inputs, the Ci are the plastic coefficients, i.e. the weights, and the leading 1·C terms are bias weights fed a constant input of 1.)

The power of this system comes from the randomization of the weight choices. Weights are initially randomized and then refined, by any of a multitude of methods, to produce less and less error per iteration. (The specific reduction methods will be discussed in further detail in another talk.)
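The forward pass and the random initialization described above can be sketched in plain C++. This is a minimal one-hidden-layer net with a sigmoid activation; the weight layout and names are my own illustration, not the Foundation code:

```cpp
#include <cmath>
#include <cstddef>
#include <random>
#include <vector>

// One-hidden-layer forward pass: each neuron applies the activation A to a
// bias plus a weighted sum of its inputs, matching the formula above.
struct TinyNet {
    std::size_t nIn, nHidden;
    std::vector<float> hiddenW;  // nHidden rows of (1 bias + nIn weights)
    std::vector<float> outW;     // 1 bias + nHidden weights

    static float A(float x) { return 1.f / (1.f + std::exp(-x)); }  // sigmoid

    float run(const std::vector<float>& in) const {
        std::vector<float> h(nHidden);
        for (std::size_t j = 0; j < nHidden; ++j) {
            const float* w = &hiddenW[j * (nIn + 1)];
            float sum = w[0];  // bias weight (the "1·C" term)
            for (std::size_t i = 0; i < nIn; ++i) sum += w[i + 1] * in[i];
            h[j] = A(sum);
        }
        float sum = outW[0];  // output bias
        for (std::size_t j = 0; j < nHidden; ++j) sum += outW[j + 1] * h[j];
        return A(sum);
    }
};

// Weights start out random; the training loop then refines them
// iteration by iteration to shrink the error.
TinyNet randomNet(std::size_t nIn, std::size_t nHidden, unsigned seed) {
    std::mt19937 rng(seed);
    std::uniform_real_distribution<float> dist(-1.f, 1.f);
    TinyNet net{nIn, nHidden,
                std::vector<float>(nHidden * (nIn + 1)),
                std::vector<float>(nHidden + 1)};
    for (float& w : net.hiddenW) w = dist(rng);
    for (float& w : net.outW) w = dist(rng);
    return net;
}
```

A freshly randomized net will of course produce garbage predictions; the whole point of training is to push its outputs toward the dataset targets one iteration at a time.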

As the system reduces error iteratively, it will eventually reach a point where the expected result from the dataset and the real value are nearly indistinguishable; however, the network is not yet verified.

The next step is to plug the weight values into a verification network (i.e., the weights are fixed, and the input and output values iterate through the verification data). If the network was sufficiently trained, it will pass the verification stage, and the designer will be able to acknowledge the strengths and limitations of this particular net and use it to forecast with reasonable certainty.
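The verification stage above amounts to a read-only loop: frozen weights, held-out data, and an error measurement. A minimal sketch (the function name and the max-error criterion are my own choices; other error measures work just as well):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Verification pass: weights are frozen, so we only measure how far the
// net's predictions fall from the held-out targets. `Net` is any callable
// mapping an input vector to one output value.
template <typename Net>
float verify(const Net& net,
             const std::vector<std::vector<float>>& inputs,
             const std::vector<float>& targets) {
    float worst = 0.f;
    for (std::size_t i = 0; i < inputs.size(); ++i) {
        float err = std::fabs(net(inputs[i]) - targets[i]);
        if (err > worst) worst = err;
    }
    return worst;  // max absolute error over the verification set
}
```

If the worst-case error stays under whatever tolerance the designer sets, the net passes; if not, it was likely overfitted to the training data and needs more varied training examples.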

A network like I just described could be as varied as Google’s house number optical character recognition software (http://phys.org/news/2014-01-google-team-neural-network-approach.html), a “suspicious person” face recognition software for law enforcement and transit authorities, or a smart commodities and stock trader that is able to see past speculation and find the true value of a particular commodity, and communicate with other smart traders to determine the direction of the global economy.

Now there are drawbacks, of course: the computational power required to do any of these individual tasks is significant, and the training dataset must be sufficiently large, varied, and transparent for the system to learn the intricacies and fundamentals of how each system obeys underlying natural laws.

However, with the advent of “big data”, obtaining such information is becoming quite straightforward, albeit sometimes expensive, and with more powerful computing methods such as GPGPU (general-purpose graphics processing unit) computing via OpenCL, the power of neural networks and smart agents is just starting to be realized.

I’m a little new at this; however, I think it’s about time that I started to develop this page. I’ve completed a lot, and there is a significant amount of work left for me to do, but let’s start with introductions before we delve into the objective of this blog.

I’m a spry 23-year-old who has recently completed his materials engineering degree at Dalhousie University (whew!), and I’ve been working for my professor as a research assistant and loving it. Unfortunately, that will be coming to an end in April when the grant money runs out, and I’ll be left without any form of income or further academic job prospects. I also happen to be taking care of my 76-year-old father, whom I love dearly and who is relying on me for financial support; at the very least, I owe him that for all of the hardship we’ve fought through together over the years, including my mother passing away in 2001.

Which brings us to my job search. I’ve been applying for jobs locally and internationally, writing custom cover letters and resumes (like the one below) to find positions at companies that would be interested in materials engineers. However, I’ve essentially been screaming “hire me!” into the void and expecting a result; absolutely not a single person wants to hire me. I’ve applied for over 300 engineering positions on LinkedIn, Kijiji, and CareerBeacon, even in person; no one wants a materials engineer.

So as you can imagine, when you’re 23 and taking care of your ailing father, who has had prostate cancer for as long as you can remember, and both of you are rapidly running out of money, spending all of your waking hours job hunting suddenly doesn’t seem like a valuable use of your only valuable resource: your time.

At this point all I wanted was a semi-permanent job; I wasn’t looking for the holy grail. Even at the time of writing this, if someone offered me ~$40k a year to paint the Mona Lisa upside down with my toes, I would try my hardest to make sure he got his money’s worth. However, in my city it seemed as if all work suddenly dried up upon graduation, so I had to try something else, which brings me to neural networks.

On the Bitcoin forums (I’ve been an active Bitcoin advocate for the past couple of years, but sold too early, har har!) someone posted about Artificial Neural Networks (ANNs) and machine learning systems designed to act as smart commodity agents, and linked to Jeff Heaton’s videos (http://www.youtube.com/channel/UCR1-GEpyOPzT2AO4D_eifdw) on neural network design and fundamental structures. I immediately realized the value of such a program, went to my university library to find any books I could on the topic, and began studying.

Initially it was hard; I wasn’t used to working at such a pace. With time, however, I’ve become used to the frantic “burning the candle at both ends” level of work, and I’ve transformed from a somewhat lazy engineering graduate into one of the hardest workers I know.

That was on February 1st; it’s now March 6th, and a lot has changed. I’ve successfully completed a working prototype in OOP (Object-Oriented Programming) C++ on my CPU, and I’m very nearly finished with the OpenCL C version of my prototype, which I (creatively) call “Foundation”.

My ideas aren’t completely revolutionary, but with time I hope to completely retool myself into a very good engineer and machine learning scientist hybrid, and maybe land a job I could be proud of.

Thanks for sticking with me through this rant; I really do want to hear your thoughts on what I’m doing. I’ll be updating this blog hopefully bi-weekly, so there should be some interesting case studies next week on what I’ve learned about OpenCL and neural network design.