The personal view on the IT world of Johan Louwers, especially focusing on Oracle technology, Linux and UNIX technology, programming languages and all kinds of nice and cool things happening in the IT world.

Wednesday, July 31, 2013

Companies are trying to define big data in a way that is understandable for business users. We have seen some models in which a 4V model is used to explain this. Oracle has defined the 4 V's of big data as Volume, Velocity, Variety and Value. Recently IBM released its own version of a 4V model in which the 4 V's have a different meaning: Volume, Velocity, Variety and Veracity. It is interesting to see that both companies give the 4 V's a different meaning, yet both use them to give angles to the concept of big data. The downside is that this sends a rather confusing message into the market, as two players use the same wording but give it slightly different meanings.

Below we can see the Oracle representation of the 4V model for Big Data:

And here we have the IBM meaning of the 4V model for Big Data:

Both are true and carry a good message, however neither carries the full message, which makes it somewhat confusing. Since both hold a degree of truth, I would rather promote a 5V model, which could potentially be represented as shown below.

Monday, July 22, 2013

Kate Crawford is talking about the six myths of big data at the DataEDGE 2013 conference at Berkeley. Kate Crawford is a principal researcher at Microsoft Research, a visiting professor at the MIT Center for Civic Media, and a senior fellow at the Information Law Institute at NYU. Over the last ten years she has researched the social, political, and cultural contexts of networked technologies. Her current work covers a range of data practices, from the ethics of big data and crisis informatics to networked journalism and the everyday uses of mobile and social media. She has conducted large and small-scale ethnographic studies in Australia, India, and the US.

Sunday, July 21, 2013

Within the Oracle Fusion Applications stack we have an application called Oracle Fusion Expenses. Oracle Fusion Expenses, a component of Oracle Fusion Financials, is a robust travel and expense solution that automates the management of travel spend and establishes policy-driven controls for expense reimbursement. In general, most expenses are made when people are on the go: taking a coffee at an airport, taking a taxi to a meeting or booking a hotel room while traveling. Those are the moments you do not have the time to open your laptop and start typing your expenses into an application, which means that a mobile application is one of the ideal additions to an expenses application.

Ultan O'Broin has published a nice article on Oracle.com on the user experience of such an application. In the demo video below you can see how such a mobile application can be used. We have already discussed building mobile applications with Oracle ADF in combination with SOA on this weblog. A cool feature of this user experience design pattern is the use of voice. Especially for a mobile application, speech is often a great alternative to typing all the data into your application. At a later stage you can check and consolidate your expenses and prepare them to be paid out to you.

A few posts ago I mentioned the Lockitron project, which is building an add-on that can be placed on a door lock to enable you to control it via WiFi. I already stated that other vendors are developing solutions like this and that most of them require you to completely replace the lock. Even though that statement still holds, there are some projects that deserve attention because they are building great products that enable you to control your locks with a mobile device. Two that come to mind are the August Smart Lock and the Goji Smart Lock. Both are amazingly cool products and both are built by a startup.

The August Smart Lock is a bit more "complicated" than the Lockitron project, as August requires you to replace your lock whereas the Lockitron is something you place over your existing lock.

It seems to be a trend to create a cool video to promote a startup's products, and August and Goji are no exception to this.

Here you can see the video from August:

And here we have the video from Goji:

Goji is adding some nice features not seen with August or Lockitron, in the form of messages you can display on your lock. Again, with both products it is not clear if there will be an API available. In my opinion, an API should always be included with products like this. It gives developers options to build upon cool products and ensures the product can be integrated into other products.
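To make the point concrete, a minimal sketch of what such a lock API client could look like. None of these vendors has announced an API, so the base URL, endpoints and token scheme below are entirely invented for illustration:

```python
# Hypothetical REST client for a WiFi-enabled smart lock.
# The base URL, endpoint paths and bearer-token scheme are all
# invented for illustration; no such public API exists today.
class SmartLockClient:
    def __init__(self, base_url, api_token):
        self.base_url = base_url.rstrip("/")
        self.api_token = api_token

    def _request(self, method, path):
        # Describes the HTTP request a real client would send;
        # an actual implementation would perform it with urllib
        # or a similar HTTP library.
        return {
            "method": method,
            "url": self.base_url + path,
            "headers": {"Authorization": "Bearer " + self.api_token},
        }

    def lock(self, lock_id):
        return self._request("POST", "/locks/%s/lock" % lock_id)

    def unlock(self, lock_id):
        return self._request("POST", "/locks/%s/unlock" % lock_id)

    def status(self, lock_id):
        return self._request("GET", "/locks/%s" % lock_id)
```

With something this simple, a home automation hub could drive the lock the same way it drives lights or heating, which is exactly the kind of integration an API makes possible.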

As we see more and more connected devices coming to the market, each with its own mobile application, you will end up with a large number of applications on your mobile device: one to control your lock, one to control your music, one to control your shades, and so on. In a more ideal setup you have one application to control all the connected devices in your house, to ensure you are not overloaded with different applications on your mobile device.

Saturday, July 20, 2013

For some reason, curtains are something that troubles a lot of people. Some people are troubled because they have to pick them, others because they have to make sure they are installed in front of the windows. Even though I am not a fan of curtains, I can see the role they play in your house: they provide some form of privacy and shield the sun from time to time. However, curtains have been a part of houses for hundreds of years, and apart from some "mild" improvements nothing too fancy has happened to them. That is at least my opinion; a curtain specialist might disagree with me.

However, a startup company is now providing a new way of thinking about curtains that excites me. A company called Sonte has produced a film which can be applied to your windows to shield them and make them less transparent. The level of transparency can be adjusted to the situation and can even be controlled from a mobile device.

Sonte started a Kickstarter project to raise funding, however for some currently unknown reason it stopped the fundraising. Hopefully this is because they found a single company willing to invest in the Sonte window film.

What makes the idea from Sonte so exciting is that you will be able to control your windows via a mobile app. This should also give you the option to connect the Sonte window film to your home automation and add it to a more general way of controlling your house. Think about options where you can blind your house remotely, have it scheduled, or make it react to the heat inside your house in combination with the position of the sun and the amount of sunlight hitting your windows. We can only hope that Sonte will include an API with its product, so that people will be able to build hooks into home automation solutions and control this from a more central console than only the Sonte application.
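The scheduling idea above can be sketched as a small decision function. The thresholds and the 0-100 transparency scale are illustrative assumptions, not Sonte specifications:

```python
# Sketch of heat/sunlight-driven window control: pick a film
# transparency level (0 = opaque, 100 = fully transparent) from
# the indoor temperature and the measured sunlight on the window.
# All thresholds are illustrative assumptions.
def transparency_level(indoor_temp_c, sunlight_lux,
                       max_temp_c=24.0, bright_lux=10000.0):
    # House is warm and the sun is hitting the window:
    # darken the film fully to keep the heat out.
    if indoor_temp_c >= max_temp_c and sunlight_lux >= bright_lux:
        return 0
    # Bright sunlight but the house is still cool: dim partially.
    if sunlight_lux >= bright_lux:
        return 50
    # Otherwise keep the window fully transparent.
    return 100
```

A home automation hub could evaluate a rule like this every few minutes and push the result to the film, which is exactly the kind of integration an API would enable.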

Friday, July 19, 2013

Not a new project, or a new product for that matter: WiFi-enabled door locks have already been on the market for some time. However, Lockitron has something cool and nice to it. Lockitron is a crowdfunded project for the development of a WiFi-enabled lock, or rather, a WiFi-enabled part you can add to your existing lock which helps you open and close it via your mobile phone or via a web portal. Next to this, Lockitron uses Bluetooth as a "near field communication" method to open the door for you when you are close to it.

Other companies are already producing WiFi-enabled locks, take for example LockState. The difference with Lockitron, however, is that LockState sells a full lock whereas Lockitron provides a piece which you can add to your existing lock.

Now the main question is whether you want to put a WiFi-enabled lock on your front door. There might be some security issues with that, and I am not sure what insurance companies state about using such a lock. However, for a true tech lover I can imagine you want to play with such a solution, and as it does not require you to change your lock it is a good way to start experimenting with such a locking device.

The fun thing is that when you start working with network-enabled locks, you can potentially add them to your full home automation solution. There are a lot of solutions in place to automate parts of your house, and adding locks to this will in the future be something almost every house has. This will also make your phone more and more important. We are already starting to pay with our phones, start the music in the house, and control the lighting and heating; within some time, opening your doors with your phone will become very normal in my opinion.

Keyless security: will Lockitron and the likes be the companies at the forefront of this new wave of connected things? I think doors will get a very well respected place in the internet of things. What I am missing at Lockitron, or did not see, is an API to connect it to other services or to get reports of your lock usage out of it. This would enable developers to create nice and cool new features for Lockitron that the current Lockitron team might not even have thought about.

Wednesday, July 17, 2013

The internet has always focused on people; enabling people to find information was one of the first usages of the public internet, next to sharing information via mail and newsgroups. The second phase, also called Web 2.0, was the sharing culture, influenced by the rise of social networks. In my opinion Web 2.0 is not about new technologies; it is more about the way people think and interact with the internet and the social media available via it. Now we see a new trend coming, and this trend is almost completely technology driven: the internet of things. The internet of things is about machines using the internet, and especially using the network infrastructure of the internet.

In the internet of things phase of the internet you will see more and more equipment being connected to the internet. Your phone is quite obvious, however your music installation and your television are most likely already connected to the internet as well. Soon your fridge, your microwave, your door locks, your scale and more will be connected. Some of those connections will be used to help you organize your life and make things easier. Some of them will also help vendors create better products.

Take for example the washing machine: your washing machine can be connected to the internet and send a tweet or other form of message to inform you that it is done and you can get your fresh clothing out of it. However, it can also serve a purpose for the engineers building washing machines. For an engineering team it is very valuable to know how many times a week you use it, what the average load is, and which program you are using. Other things might also be interesting to know: what is the average load put on top of a washing machine, and what are the average temperature, air pressure and humidity in the room where the machine is standing? All this information can help an engineering team build a better machine and spot possible solutions for a specific market segment based upon a geographical or demographic profile.

Building a board and connecting all the sensors to measure those things is not the biggest hurdle; even if you are quite an amateur in electronics you will be able to build a device like that with ease using, for example, a Raspberry Pi and some out-of-the-box sensors from adafruit.com. Connecting it via the home WiFi to the internet and having it broadcast valuable information back is also not the big issue. The main issue revolves around two things.
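The reporting side of such a device can be sketched in a few lines: package each set of sensor readings into a single self-describing record the machine can broadcast. The field names below are illustrative assumptions:

```python
import json
import time

# Sketch of the reporting side of a connected appliance: package a
# set of sensor readings into one line-oriented JSON record that
# the device could broadcast over the home WiFi. Field names are
# illustrative, not any vendor's actual schema.
def build_reading(machine_id, readings, timestamp=None):
    record = {
        "machine_id": machine_id,
        "timestamp": timestamp if timestamp is not None else int(time.time()),
    }
    record.update(readings)
    # One JSON document per line keeps the stream trivial to append
    # to a file and trivial to split again on the receiving side.
    return json.dumps(record, sort_keys=True)
```

On the Raspberry Pi this function would be fed by the actual sensor drivers, and the resulting line sent over the network to the collection endpoint.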

Imagine you are a large enterprise thinking about shipping millions of your products and collecting data from them: your first issue will be storing this information in a certain way. The second issue is retrieving this information and making it usable for future analysis. Future analysis can in this case serve multiple departments within your company, for example the marketing department, the engineering department and your warranty and claims department.

What interests me especially is the gathering and storing part. To be able to handle a high flow of data, take correct actions on it and store it in a good way in your datastore, you will have to have some sort of mechanism in place. The first solution that comes to mind is writing the data to a large file in a line-by-line manner on an HDFS filesystem and then having it chunked into the correct format by a MapReduce algorithm. What can be a great alternative, however, is to use Oracle CEP to already look into the data and take action before you write it to HDFS. Oracle CEP, or Complex Event Processing, was formerly known as WebLogic Event Server.
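The MapReduce step over those line-by-line records can be sketched in plain Python. This is only the shape of the computation; on a real cluster the map and reduce phases would run distributed over HDFS blocks, and the field names are the same illustrative assumptions as before:

```python
import json
from collections import defaultdict

# MapReduce-style pass over line-by-line JSON sensor records,
# computing the average drum temperature per machine. The field
# names are illustrative; a real job would run distributed on
# Hadoop rather than in one process.
def map_phase(lines):
    # Emit one (machine_id, temperature) pair per record.
    for line in lines:
        record = json.loads(line)
        yield record["machine_id"], record["temp_c"]

def reduce_phase(pairs):
    # Group the emitted pairs by key and reduce each group
    # to an average.
    grouped = defaultdict(list)
    for machine_id, temp in pairs:
        grouped[machine_id].append(temp)
    return {m: sum(t) / len(t) for m, t in grouped.items()}
```

The same pattern, with different map and reduce functions, would feed each department's database with its own aggregation of the raw sensor stream.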

Oracle CEP is a Java server for the development and deployment of high-performance event driven applications. It is a lightweight Java application container based on Equinox OSGi, with shared services, including the Oracle CEP Service Engine, which provides a rich, declarative environment based on Oracle Continuous Query Language (Oracle CQL) - a query language based on SQL with added constructs that support streaming data - to improve the efficiency and effectiveness of managing business operations. Oracle CEP supports ultra-high throughput and microsecond latency using JRockit Real Time and provides Oracle CEP Visualizer and Oracle CEP IDE for Eclipse developer tooling for a complete real time end-to-end Java Event-Driven Architecture (EDA) development platform.

The image below shows a deployment which uses Oracle CEP along with some other parts.

As you can see in the above image, the washing machines report their sensor readings to the Oracle CEP Stream Adapter. The Stream Adapter in turn "streams" them into the CQL processor. What the CQL processor does in this setup is monitor all incoming sensor readings. Most of the results will be passed directly to HDFS storage, where they will later be used by Hadoop to chop them into usable parts and fill the databases of the different departments. The big differentiator in this setup, however, is that when the CQL processor detects a fault in the washing machine, for example a broken part, it can call a Java event bean, and in return the Java event bean can send a message to the ERP system that a repair is needed for a specific machine. If we take for example Oracle E-Business Suite, we could send a trigger to Oracle E-Business Suite to spawn a service task and assign a specific service engineer to it. The service department will then inform the customer that something is broken and will, by using the correct modules in Oracle E-Business Suite, plan the repair.

When the service engineer is done, the machine can report, via the same channel as all the sensor data is sent, that it is operating correctly again, and the service request can be closed in an automated way.

This example shows that by using machine-2-machine communication and different technologies, both a company building products and its customers can benefit from information being reported directly. By implementing a Hadoop way of handling the enormous amounts of sensor readings, we can ensure that every department gets the data it needs in its data warehouse and is not overloaded with information. The above example is very simple and very high level; building a solution like this in the real world is still a challenge and will need a large number of different skills, however it can help a company and its customers in many ways.

Saturday, July 06, 2013

Andy Cameron, the Capgemini head of Business Information Management, talks in this video about big data and also about the whole concept of data and data quality. Data is not always what people think it is: a bunch of numbers stored in a database. Data, when used in the correct way, can mean the difference between success for your business and failure when interpreted in the wrong way. Also, collecting data and storing it in other ways than companies are currently used to brings not only advantages; it also brings the responsibility to ensure this data is secured and handled in the correct way with respect to the privacy of employees and customers.

We currently see an explosion in big data and business intelligence requests and projects within companies. This is one of the reasons that companies will need more data scientists and mathematicians in the future than they need currently. A couple of years ago the term / role of data scientist was virtually nonexistent; today you see more and more demand for this role within enterprise-size companies. Below is an infographic made by EMC which shows the future of the data scientist role.

The explosion in digital data, bandwidth, and processing power – combined with new tools for analyzing the data – has sparked massive interest in the field of data science. Organizations of all sizes are turning to people who are capable of translating this trove of data – created by mobile sensors, social media, surveillance, medical imaging, smart grids, and the like – into predictive insights that lead to business value. Despite the growing opportunity, demand for data scientists is outpacing the supply of talent and will do so for the next five years.

Thursday, July 04, 2013

When working with an Oracle Exadata appliance you normally do not have to come as close to the machine itself as with most servers. However, in some cases a disk will be faulty and in need of replacement. The Oracle Exadata, depending on what "size" you have, will have a number of storage nodes available in the rack, and each node will have a number of disks in it. When you need to replace a broken disk with a new one, it can be quite handy to ensure you are removing the correct faulty disk and not a disk which is perfectly fine. To help the people who will do the physical swapping of the disk, there is a Service LED located on each disk which you can turn on or off. This helps the engineer locate the correct disk without having to count disks and nodes.

To turn the Service LED on a disk on or off, you have to make use of CellCLI. For example, if we need to turn the Service LED on for 3 disks, we can use the following command:
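From the CellCLI prompt on the storage cell, the ALTER PHYSICALDISK command with the SERVICELED attribute does this; the disk names below (20:3 and so on) are illustrative and will differ per system:

```
CellCLI> ALTER PHYSICALDISK 20:3,20:4,20:5 SERVICELED ON
```

Once the disk has been swapped, the same command with SERVICELED OFF turns the LED off again.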

Tuesday, July 02, 2013

Cloud computing comes in many forms; in some cases cloud computing is "just" another form of hosting and is considered IaaS (Infrastructure as a Service). In this cloud computing model some of your servers / systems will be located within the cloud of a cloud vendor; Amazon is a good example of this. By having your servers in one or more clouds and some (or none) located in your traditional datacenter, you start creating a hybrid cloud model.

Having a hybrid cloud model provides you the option to make use of best-of-breed hosting options. This can be a big advantage, however in some cases it also brings a challenge. Even though everything can be hosted somewhere in a cloud, you most likely would like to have unified monitoring in place which gives you a holistic view of all your servers and services.

Monitoring capabilities which provide you a holistic overview of your enterprise IT assets are provided for Oracle and non-Oracle products by Oracle Enterprise Manager. Oracle Enterprise Manager provides you with options to monitor hardware and software within your Oracle landscape and also to maintain it from the same console.

Oracle provides a great, out-of-the-box solution for monitoring your on-premises IT assets, and it can even monitor multiple datacenters when you have a dual or triple datacenter setup. However, in some cases you have servers running within the Amazon Web Services (AWS) cloud, a trend that is seen more and more within the corporate world. Even though some of your servers are running at AWS, you still want to include them in your default monitoring tool, Oracle Enterprise Manager, and be able to monitor the complete hybrid cloud setup.

In the above example you can see how we leverage an already available tunnel between the enterprise datacenter and AWS to ensure that the OEM connection is secured and encrypted on the network layer. Connecting the datacenter and AWS in such a manner is common practice, and when connecting Oracle Enterprise Manager to the servers in AWS you can leverage this tunnel to do so.

Oracle provides a plugin for this in the Oracle Enterprise Manager Extensibility Exchange. This plugin is developed by Oracle to monitor the AWS services and by doing so provide you a single monitoring console.

Monday, July 01, 2013

Hadoop, the Apache open source project, is getting support from an unexpected direction: Microsoft is starting to support the open source project. The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than rely on hardware to deliver high availability, the library itself is designed to detect and handle failures at the application layer, delivering a highly available service on top of a cluster of computers, each of which may be prone to failures.

Quentin Clark, corporate VP of data platforms at Microsoft, stated "We believe Hadoop is the cornerstone of a sea change coming to all businesses" during his keynote at the Hadoop Summit.

Microsoft's data platform includes everything from its SQL Server database products to business intelligence (BI) features in Excel. In addition, playing well with Hadoop's open source community and leading Hadoop proponents like Hortonworks is a key component of Redmond's big data strategy. You can read more on this subject at informationweek.com.