Application Services
Application Services provides a platform for IT Development and Maintenance professionals to discuss and gain insights into best practices, process innovations and emerging technologies that will shape the future of this profession.

VR and AR - Digitizing Presence to Make Up For Its Absence
Author: Shahnawaz Qureshi

Virtual Reality (VR) and Augmented Reality (AR) are two emerging technologies with huge potential to disrupt how we interact, communicate, learn and perceive things. A new breed of devices, starting with Google Glass, Oculus and HoloLens, is commoditizing this technology.

It all started with audio, the first human perception (of sound) that could be recorded, transmitted and reproduced, paving the way for the initial breakthrough of modern communication: interaction without being physically present. Over time, the human craving to capture the complete experience and the desire for remote interaction have driven technology to evolve from audio, to visual, to Virtual Reality.

Virtual Reality is a technology-enabled digital environment that is perceived as real. Virtual Reality brings a digital experience to life as though it physically exists. The fact that one can experience something without its presence is a game changer. Humans are better at learning, reacting and deciding about something when they experience it. Virtual Reality provides this opportunity by transforming the surroundings into whatever one wants to experience and be engaged in, without having to physically develop or create that environment. Imagine lying in the bathtub with a headset and swimming with sharks without being their lunch; yes, that's Virtual Reality.

A close cousin of Virtual Reality is Augmented Reality (with some of its flavors referred to as Mixed Reality). Augmented Reality is all about overlaying your actual surroundings with digital information that enriches them. Imagine viewing the Eiffel Tower in Paris through your headset and, along with the actual tower, seeing its height, the year it was erected and other information about it; yes, that's Augmented Reality.

Though they started as experiments and first made their way into entertainment, AR and VR are today making inroads into various industries. After the early-day simulators, the video game industry was among the first to leverage VR to provide immersive experiences for its gaming fans. A parallel set of experiences was created with AR games, which succeeded by adding a gaming layer to real-life environments. Pokemon Go is one of the pinnacle achievements in this area; in fact, it is one of the AR applications that has touched the most people.

In the retail industry, AR is the next big thing. Retail being all about buying, AR and VR are at their best when it comes to convincing customers and letting them experience how a specific product would suit their needs without having to purchase it in the first place. IKEA's Catalog app for smartphones is a good case in point: it helps customers choose furniture from the catalog, place it virtually in their living room using the smartphone camera and evaluate it before purchasing. On similar lines, IKEA has leveraged VR to create a Virtual Store where one can visit a life-size IKEA store and shop while sitting on the couch at home. The online retailing giant Amazon has recently offered its own take on AR: it has married AR with a dedicated hardware camera and added a touch of AI to create Echo Look, an Alexa-enabled camera that helps you try clothing virtually before you order it online. On a larger scale, there are AR-driven fitting rooms that let you try on clothes without actually putting them on; pretty cool, right? This trend of AR applications in retail is being embraced by many retailers such as Converse, Lego and more. To summarize, the mantra here is very simple: "Try it virtually before you buy it!"

This trend in retail is not only increasing online business and providing a better customer experience, it is also reducing the return rate of items, which usually stems from having bought the wrong product. The annual cost of retail returns is pegged somewhere around $260 billion; VR and AR have good potential to reduce these numbers.

The manufacturing industry has similarly been taking great strides in leveraging AR and VR for innovation and customer experience. Ford has been using the power of VR for collaborative design at its Product Development Center in Dearborn, Michigan. Ford's VR lab lets designers across the globe collaborate on a design and fully experience it before an actual physical prototype is created. The manufacturing industry has also been using VR to train staff on the maintenance of its products. On the consumer side of things, AR is helping replace service manuals and provide a unique, innovative customer experience. Hyundai's AR service manual is a case in point here. Similar innovative solutions can replace complicated paper-based manuals, which could soon be a thing of the past.

In the healthcare industry, VR is taking things to a different level. A VR game called SnowWorld is helping reduce burn-related pain. VR is also helping doctors in surgery, both in training and in performing robot-assisted surgery. VR presents a unique advantage in the rehabilitation of patients, as it helps simulate the actual environments in which patients train for recovery before actually attempting it, thereby reducing the chances of injury and further complications. VR is also proving to be an important tool for telemedicine, especially in providing healthcare for rural places and seniors at home.

For insurance, VR and AR present both opportunities and threats. VR helps underwriters avoid travelling to physically assess a location before providing coverage. With telematics and VR, underwriters can simulate scenarios based on data such as traffic, weather and location to better rate risks and avoid losses or over-pricing. AR could also help in claim filings and in better assessing losses, using AR tools and apps that overlay a real-time picture of the loss with the financial data affected by it. Similarly, there is a whole new paradigm that could be explored by leveraging AR and VR in the insurance industry. Along with these opportunities come new threats, those that emerge from AR and VR applications in other areas. A striking example is the more than 110,000 accidents in the USA claimed to be associated with the Pokemon Go AR game in its first 10 days. This demands a change in the whole rating model to accommodate the new risks that come with the application of these latest trends in areas that indirectly impact the insurance business.

These are some of the applications of AR and VR across industries; it is just the beginning, and human imagination is going to push this beyond boundaries. From a communication perspective, we might have discovered in AR and VR the right tools for the culmination of our desire for a complete experience, at least until we get an opportunity to turn teleportation into reality. So keep evolving!

Imagine walking up to an ATM kiosk, pressing a button and getting your cash (forget the cashless world for a moment): no cards, no PINs, nothing; it's just "you" that matters, without the need for any surrogate that you must carry. Yes, it sounds futuristic, but I believe we are at its doorstep. This and much more magic is possible with the advancement of computing that leverages computer vision, breaking the barrier of the traditional computer input mechanisms that have served as machines' senses for reacting to their users' needs till date.

Computing has evolved from its Stone Age reliance on punch cards for input, to keyboards, mouse and touch, and to recent innovations in voice-enabled input such as Siri and Cortana. Computers are fundamentally designed to understand binary; the various input mechanisms in use are approaches that are more convenient for us humans while still being feasible to translate into binary. For the major part of computing history, text-based input served as the best deal. With this approach there is always an explicit step to translate the subject, its desires, its credentials and its privileges into an alphanumeric representation that computers can work on; hence you have cards or usernames that identify the subject and PINs or passwords that authenticate the subject. The input mechanism is just a means to translate the user's desire into the binary format that computers can understand. With computer vision this whole model changes: the actual subjects are translated into binary that machines can understand and act upon. Chui, a doorbell with facial recognition capabilities, is one such application of this smart technology that is changing how we interact with machines and realize their value. Engagement with machines is becoming more direct and implicit.

Computer Vision is the convergence of image processing, artificial intelligence (machine learning, to be specific) and big data, which enables machines to perceive things just as we humans do, in turn bringing machines one step closer to replicating human senses in processing their surroundings. Technology today is quite mature in applying computer vision to specific domains like facial recognition or reading vehicle license plates, but it is challenging to apply it more generically in fields like self-driving cars, where there are too many variations that need to be processed to arrive at the optimal outcome. Additionally, the field of self-driving cars has too many legal, compliance and ethical issues beyond technology that need to be addressed before they become mainstream consumer technologies. Until then, it is prime time for computer vision to change how we interact with and extend the boundaries of computers.
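To make the domain-specific case concrete, here is a minimal sketch of the kind of building block a camera-based product such as a smart doorbell could start from: detecting faces in a single camera frame with OpenCV. This is an illustration only; the Haar-cascade model, the camera index and the surrounding flow are assumptions, and actual products use far more robust recognition pipelines.

```python
# Minimal face-detection sketch (illustrative only).
# Assumes the opencv-python package and a camera at index 0.
import cv2

def detect_faces(frame):
    """Return bounding boxes of faces found in a single BGR frame."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # scaleFactor / minNeighbors trade detection speed against false positives
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if __name__ == "__main__":
    camera = cv2.VideoCapture(0)  # hypothetical doorbell camera
    ok, frame = camera.read()
    if ok:
        for (x, y, w, h) in detect_faces(frame):
            print(f"Face at x={x}, y={y}, size={w}x{h}")
    camera.release()
```

Recognizing who the face belongs to, as a product like Chui would, needs an additional matching step against enrolled faces, typically using a trained embedding model.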

In the era of the smartphone, where every phone is equipped with a camera, the hardware is coincidentally ready for the game of computer vision. Computer vision in combination with augmented reality has immense potential: computing can enrich our physical world with the immense data available to us, but instead of us explicitly searching for this information, computer vision can overlay the information that matters to us the most in real time, taking the whole experience to a new level, one where we are presented with valuable information rather than having to request it. This is a typical example of technology enriching our senses as we develop senses for our technology. There are many apps that allow you to point your camera at your surroundings and get valuable information based on geolocation and the subject in focus. Hyundai's Augmented Reality owner's manual is another excellent example of this technology being applied innovatively.

With the capability there and maturing, there are immense possibilities conceivable with computer vision. Financial investors are using satellite imagery to get near real-time inputs into economic metrics like traffic at ports, oil reserves and crop yields. The same method can be used in healthcare to check trends in flu and other seasonal diseases by analyzing cars parked at hospitals and urgent care centers. With eyes in the sky, property and casualty insurers can gain better insight into their potential liabilities in certain areas by observing traffic and accidents in the areas they cover. These are just patterns that can be extended to different business areas to gain massively from computer vision. In today's world, where everything is captured in megapixel images, there is a trove of information waiting to be harvested, and the time is ripe for innovative use cases that can be realized using computer vision.

Every technology comes with its own concerns. With computer vision, where any visual is more than an image, there are huge privacy concerns, as a simple camera gains the capability to go beyond its realm and extract information that would otherwise only be possible through human intervention. All said, it is the beginning of a new technological journey that has the potential to turn the science fiction of our childhood days into reality. Computer vision is well positioned to lead this revolution until we reach the next era of computer telepathy, where our thoughts are translated into binary to serve as computer commands.

A Journey on the Line of Architecture Assessment
Author: Gayathri Rajamanickam

According to the research firm Gartner Group, enterprises generally plan to upgrade 38% of their application portfolios; 24% of applications will be replaced, 18% will be retired and 22% will remain unchanged. An application portfolio upgrade or replacement can be targeted at technology modernization, optimization of LOBs (lines of business), improving the sustainability of the system, and so on.

Application architecture assessment plays a vital role both in green-field application development and in re-engineering (upgrading) an existing application. The architecture of a system needs to be evaluated to rationalize the decisions behind the system design, to review whether the solution meets both functional and non-functional requirements, and to ensure the quality of the system.

Recently I was engaged in an architecture assessment for one of the leading insurance brokers in the USA, which specializes in developing, marketing and administering customized insurance programs and specialty market solutions. The customer uses a Policy Management System (PMS) that is evolving into a global product over time, which triggered the need for the architecture assessment.

In this blog, I am going to share my experience of how an architecture assessment is conducted and the processes involved in it.

Architecture Assessment

Architecture assessment is an activity to validate the decisions taken in the existing architecture of the system. It is generally conducted in enterprises where complex software systems are connected to each other to perform their day-to-day business operations.

Plan & Gather

The assessment starts with a discovery phase, where brainstorming sessions are conducted with the key stakeholders to explore and understand the existing system architecture, the non-functional requirements, and the current business and technical challenges in the system. Existing knowledge artifacts are used to help support some of the decisions taken throughout the chronicled changes to the system.

The knowledge captured during the brainstorming sessions is also documented in the form of graphs and tabular structures to provide better clarity on the system in place:

A historic graph is plotted using insights about past and future releases, with the essential technology and business parameters that drive changes to the system.

The existing system architecture model is revamped to incorporate the critical business themes, along with the appropriate mapping to the technology stack.

Assess

The technology assessment mainly includes activities like analyzing and reviewing the various layers of the system and their associations with each other. The size of each layer (in terms of the number of services, forms/pages, business rules, tables and so forth) is computed and assessed from the perspectives of minimizing redundancy across architecture tiers, separation of concerns, the single responsibility principle, and coupling and cohesion across the layers. The technology stack used in each layer is reviewed, and quick POCs are performed with the best alternatives available in the market. Static code analysis and detailed code reviews are performed manually, as well as using tools and frameworks wherever possible, to evaluate the design patterns used to meet the business needs of the system.

The best outcome of any technology improvement goes hand in hand with the underlying infrastructure in complex enterprise applications, so there is a need for an infrastructure assessment as well. Here the system is appraised for its readiness for the cloud, load balancing, optimization of the application environment for different LOBs, and an inspection of the underlying hardware to understand the performance factors behind the scenes. The size of the existing data is measured and the data growth rate is predicted from it to estimate the performance of the system against the future needs of the business.

Each module that constitutes a part of the complex enterprise application is mapped to an individual functional component. This activity is performed to isolate redundant implementations of similar business requirements across the enterprise.

A fitment-gap analysis is performed to evaluate the current strategies and processes followed in the deployment model against the latest process models like DevOps, in order to provide a reliable and incremental way to improve the software delivery process.

Recommend

The following are the typical outcomes of the assessment:

Assessment Report

PoA Architecture

RoadMap

Assessment Report

This report includes a summary and a detailed section for key observations and opportunities. Every key observation is mapped to its business impact, such as time-to-market, cost of maintenance, etc.

Point of Arrival (PoA) Architecture

The Point of Arrival (PoA) architecture defines the proposed architecture model based on the finalized assessment. The PoA addresses the existing challenges present in the enterprise application, along with a proposal to make the system a better fit for the future needs of the business. The proposal can include a technology upgrade, modernization of the legacy system, modularization of the existing system, re-engineering of certain critical components, etc.

RoadMap

A recommended roadmap is presented to the enterprise to leverage the best possible outcome of the assessment. As part of this roadmap, recommendations are categorized into short-term, medium-term and long-term business benefits to prioritize the opportunities for implementation.

Conclusion

In this blog, we have seen the methodologies and procedures involved in the life cycle of an architecture assessment. An assessment is not a strategy to solve a single problem; it is an approach towards the betterment of the system. I'm sure a well-run assessment will definitely help the client improve their business and accelerate their growth in a competitive market.

Things are Getting Chattier - Chatbots
Author: Shahnawaz Qureshi

In today's world, messaging is ubiquitous and is the default, preferred way of communication. From pigeons to Chatbots, this service has undergone a phenomenal change in terms of its usage and capability. Modern-era messaging was initially pioneered by the telecom industry in the form of paging services and later as SMS, which is now fondly called texting. With the advance of technology, the messaging service has broken that boundary; a service that was once the bread and butter of the telecom industry is now provided by various digital platforms.

Modern digital messaging platforms have extended the traditional vanilla messaging service to be more feature-rich and affordable. Gone are the days when you just messaged plain text in abbreviated form; today a message can contain any form of media. With an ever-growing user base engaged in this medium, messaging as a platform was bound to leap to the next level of innovation: Chatbots. Chatbots, or conversation interfaces, are computer programs capable of responding to human conversation. Chatbots are the outcome of the convergence between messaging platforms and AI (Artificial Intelligence); the advancement of both of these fields has made this happen. If you consider the initial SMS era as Messaging 1.0, the advent of digital messaging platforms could be considered Messaging 2.0, and the current rising era of messaging bots augmenting a well-established platform could be considered Messaging 3.0.

Chatbots are an obvious evolution in our quest for artificial intelligence and a simplified user experience. Instead of developing an application that needs to be navigated in a specific way to get the desired response, Chatbots give us a more natural approach: getting the desired results or responses through the natural conversation we are so used to. Unlike conventional interfaces, Chatbots make systems more intuitive and flatten the learning curve. One of the beauties of this user experience is the natural documentation that any chat trail or chat history provides for other users, helping them use the system efficiently.

The concept of a bot responding to your message is not totally new. The well-established text banking services, which allowed you to text specific commands to perform particular actions on your accounts, are the predecessors of today's Chatbots. The appeal of Chatbots comes from the fact that they morph into our most basic and convenient method of interaction: messaging. Consider what mobile apps have done to websites by porting their functionality to native apps; Chatbots today have a similar potential to port app functionality to messaging platforms. With mobile being core to our lives today and messaging being our primary interaction, Chatbots are evolving into a messaging-first or conversation-first approach for our application systems.

Facebook and Skype are going big in exploring and dominating this space; today they provide platforms that businesses can use to onboard their solutions onto the conversation platform. Skype already has many bots that help you with everything from finding cheap airline tickets to getting weather updates through mere conversation. Facebook, with its extensive reach, is well positioned to give enterprises the platform they need to keep their users engaged and translate that into business opportunities. There are also solutions where businesses have leveraged this avenue in a unique way: ICICI Bank has introduced 'iMobile SmartKeys', a keyboard that you can use in your chats to transfer funds without having to move away from your messaging app. Similarly, Google's GBoard is a custom keyboard with search integrated into it, ensuring that users do not have to move away from their messaging experience to get search results into their conversation.

Though revolutionary, Chatbots have their own challenges and strengths. With Chatbots the interface becomes simpler, but the design gets more complicated. Efficient Chatbots have to be fronted by robust Natural Language Processing (NLP) engines to handle the wide variations of human conversation and channel them so the bot can respond appropriately. Not everything needs to be built from scratch: the ecosystem for building these solutions is growing and becoming more efficient. Facebook, Skype and other prominent messaging platforms are providing the required platforms to develop bots; IBM Watson, Google and others are providing robust NLP capabilities; and new frameworks are springing up, making developers' tasks easier. So things are getting geared up for systems to get chattier.
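To give a feel for the shape of such a design, here is a deliberately naive sketch in which a keyword matcher stands in for the NLP engine and routes a message to a handler. A real Chatbot would delegate intent detection to a platform such as IBM Watson or a comparable NLP service; the intents, keywords and handlers below are purely hypothetical.

```python
# Toy intent router standing in for a real NLP engine (illustrative only).
INTENT_KEYWORDS = {
    "weather": ["weather", "temperature", "forecast"],
    "flights": ["flight", "ticket", "airline"],
}

def classify(message: str) -> str:
    """Very naive intent detection; a production bot would call an NLP service."""
    text = message.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(word in text for word in keywords):
            return intent
    return "fallback"

def respond(message: str) -> str:
    handlers = {
        "weather": lambda: "It looks sunny today.",        # would call a weather API
        "flights": lambda: "Cheapest fare found: $120.",   # would call a fare-search API
        "fallback": lambda: "Sorry, I did not understand that.",
    }
    return handlers[classify(message)]()

print(respond("What's the weather like in Bangalore?"))
print(respond("Find me a cheap flight ticket"))
```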

Like any other application model, certain use cases are better served through Chatbots than others, at least to start with. For example, Chatbots are well suited to answering questions like "How far is New Delhi from Bangalore?" rather than requests like "Show me the nearest route to New Delhi"; the latter use case is better triggered through conversation than fully served through it.

This trend is just the beginning of a new avenue of opportunities that is ripe for businesses to harvest in order to extend their reach and further enable their business. In today's new world of emerging AI, it's time for conventional systems to start conversing with humans and leave browsers behind.

The art of choosing the right product for business transformation
Author: Ravi Narayan Vishnubhotla, Senior Technology Architect

As part of the future-state architecture of IT transformation, certain business requirements need specialized IT applications. These could mandate newer technology systems or migrating from legacy technology platforms. As part of buy versus build evaluation, products that can address such requirements need to be identified.

However, a product cannot simply be purchased and implemented without determining the technology, cost and business impact. Hence, an evaluation process needs to be conducted to compare the various products, features, costs, technologies and business requirements, and also to understand how they compare against an in-house solution.

1. Why do I need a product for business transformation? Can I just develop it in-house?

These questions seem pretty straightforward, but the answers are not that simple. In short, the answer is 'yes', but it should be noted that any development requires expenses that must be approved. Once these estimates are submitted to the IT governance board and the in-house architecture group, the following questions will be asked by peers: What is the basis of estimation? What is the estimation model? Did you compare with other IT systems or products? How much configuration and customization is needed?

To address these questions, it is best to conduct a product / solution evaluation, especially for medium to large implementations, to gain answers to some of the above questions. The end result may be the selection of a product or a customized solution, but the approach taken will assure the IT governance board, business decision makers, and the vendor that the decision was right.

This approach, based on my experience, is most suited for medium to large sized implementations. However, it can vary depending on businesses, markets, or requirements.

2. What is the life cycle or the process?

The following diagrams illustrate the six-step life cycle to achieve comprehensive product evaluation:

Figure 1 - 6 steps to comprehensively evaluate products

Figure 2 - Details of steps to comprehensively evaluate products

3. Any general guidelines?

The following provides key guidelines to follow while evaluating products:

Business needs must be incorporated during analysis, as they incorporate end user requirements

It is mandatory to analyze product and implementation vendors to avoid buyer's remorse

Make sure to conduct this analysis only for medium to large technical implementations, as the effort and time are considerable and unnecessary for smaller ones

Avoid "Checklist Syndrome" (Definition - Process of determining best selection by an Y/N on features and not how well the feature is developed), and while comparing features with requirements, determine the level of the functionality and maturity

Conclusion

The steps defined here are based on my experience of working with various clients. The process or approach can vary and will be different depending on the organization and industry. This is not a one-size-fits-all methodology, but should give you a fair idea as to what it takes to achieve product evaluation from an IT perspective.

Indoor Localization
Author: Varun Singla, Technical Test Lead

Location carries a lot of importance these days, indoors as well as outdoors. Outdoor localization is widely used in many applications with the help of versatile sensors and technologies like GPS, accelerometers, gyroscopes, etc.

People have been exploiting these technologies in various applications like vehicle tracking, booking cabs, sharing locations through apps like WhatsApp, etc. Having realized the importance of outdoor localization, locating humans indoors is also very interesting and important. An indoor location is as important as an outdoor one, as it carries a lot of commercial value for various organizations like IT companies, shopping malls, etc. Timely advertisements and offers can be displayed on a user's smartphone based on his or her location in a shopping mall to increase sales. Various organizations are also interested in resource and infrastructure management for maximum utilization.

Indoor localization means locating the user precisely in an indoor environment: where exactly he or she is, on which floor and at which location. This can be achieved through a variety of methods.

Figure 1 illustrates device-free and device-based indoor localization systems. Device-based systems are those where the user has to carry a device, whereas in device-free systems users need not carry any device with them. In this blog I will discuss one Wi-Fi-based, device-based method.

Device-Based Method

The most common device-based methods these days exploit the users' smartphones, as almost every individual carries a smartphone and it does not add any extra cost. Figure 2 [1] below shows the sensors available in today's smartphones. There is a wide variety of sensors in these phones, capable of doing almost anything. All kinds of modalities (e.g., WiFi, cellular, FM radio, Bluetooth, microphone, inertial sensors, etc.) can be used separately or together for localization purposes.

Wi-Fi-Based Method

In Wi-Fi-based techniques the existing infrastructure is used, as can be seen in Figure 3. First of all, the entire building is divided into landmarks. For example, each cubicle can be a landmark on each floor, which means the number of landmarks can equal the number of cubicles. Similarly, each table can be a unique landmark in a food court.

After this, all the landmarks are fingerprinted with the RSSI values that the smartphone reports for the different access points. Taking an example from Figure 3, let's take the fingerprint at the landmark where the phone is. As can be seen, this phone is seen by three different APs (AP1, AP2 and AP3). Suppose the signal strength from AP1 to this phone is X, from AP2 it is X-1 and from AP3 it is X+1. Then the fingerprint at this landmark will be:

Location loc:

Access Point    RSSI Value
AP1             X
AP2             X-1
AP3             X+1

A database of fingerprints is then built up. As users move around the organization in real time, this database is consulted each time to figure out which stored fingerprint most closely matches the reported RSSI values. Accordingly, the location of a person is determined in real time.
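As a rough illustration of this lookup step, the sketch below matches a live RSSI reading against a small fingerprint database using Euclidean distance. The landmark names and RSSI values are invented, and production systems typically use more robust statistical or k-nearest-neighbour matching rather than a single nearest fingerprint.

```python
import math

# Hypothetical fingerprint database: landmark -> {access point: RSSI in dBm}
FINGERPRINTS = {
    "cubicle_101": {"AP1": -50, "AP2": -51, "AP3": -49},
    "cubicle_102": {"AP1": -63, "AP2": -58, "AP3": -70},
    "food_court_table_5": {"AP1": -80, "AP2": -72, "AP3": -65},
}

def locate(reading, default_rssi=-100):
    """Return the landmark whose fingerprint is closest to the live reading."""
    def distance(fingerprint):
        aps = set(fingerprint) | set(reading)
        return math.sqrt(sum(
            (fingerprint.get(ap, default_rssi) - reading.get(ap, default_rssi)) ** 2
            for ap in aps))
    return min(FINGERPRINTS, key=lambda landmark: distance(FINGERPRINTS[landmark]))

# A phone currently reports these signal strengths:
print(locate({"AP1": -52, "AP2": -50, "AP3": -48}))  # -> cubicle_101
```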

How it can be used in an organization like Infosys

This technique can be used to manage the resources of the company. Heat maps of an organization can be generated, from which it can be estimated which conference rooms, meeting rooms, etc. are occupied, and which cubicles are occupied and which are empty.

Fig. 3. An overview of WiFi-based indoor localization systems

One more interesting application of this technique is managing staff. For example, if access points are installed in washrooms or nearby areas, heat maps can reflect which washrooms are used more heavily. This information can be used by the cleaning staff, or their manager, to decide where more cleaning people are needed and where fewer. This can lead to more efficient use of resources and staff within the organization.
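Building on the localization output, an occupancy heat map is essentially an aggregation of location fixes over time. The short sketch below counts how often each landmark appears in a stream of (timestamp, landmark) records; the records are invented purely for illustration.

```python
from collections import Counter

# Hypothetical stream of location fixes produced by the Wi-Fi localization system
location_fixes = [
    ("2016-10-04 09:00", "conference_room_a"),
    ("2016-10-04 09:05", "conference_room_a"),
    ("2016-10-04 09:10", "washroom_3rd_floor"),
    ("2016-10-04 09:15", "conference_room_a"),
]

occupancy = Counter(landmark for _, landmark in location_fixes)
for landmark, hits in occupancy.most_common():
    print(f"{landmark}: {hits} fixes")  # feed these counts into a heat-map visualization
```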

Other Methods:

There are various other methods which can be used for indoor localization other than using Wi-Fi. These solutions include:

Bluetooth based solutions

Acoustics

Wi-Fi + Acoustics

Cellular

Inertial Sensors

Wi-Fi-based solutions are easy to deploy, as most leading organizations these days are equipped with Wi-Fi access points; to achieve these types of solutions, one does not need to spend on any additional infrastructure. The solution can be achieved with the existing infrastructure at almost no additional hardware cost.

No one would have ever thought that a 10-day project, created at Netscape by Brendan Eich in 1995, would turn out to be the frontrunner for building enterprise web applications after 20 years. Today, JavaScript leads the race for building isomorphic web apps. An isomorphic application is one whose code can run both on the server and on the client. This was primarily made possible by Node.js - an open source, cross-platform JavaScript runtime environment built on Chrome's V8 JavaScript engine, which opened the doors of JavaScript to server-side coding.

Furthermore, the proliferation of mobile applications as well as the open source movement is pushing enterprises to adopt JavaScript platforms in order to power increasingly modular applications using open source software. Enterprise development shops can no longer depend on Java or .Net to build a technology platform, as the increasing demands for speed and scale of development alongside quicker delivery are challenging the norms. A Forrester report states that, "Adoption of JavaScript -- and the Node.js runtime environment in particular -- sets the stage for the biggest shift in enterprise application development in more than a decade." However, this transformation can initially be challenging - which is where communities like EnterpriseJS come to the rescue.

With more and more enterprises eager to increase investments in DevOps -- in order to reap benefits like faster delivery, improved customer experience, and better service quality -- it becomes imperative for JavaScript applications to have first-class DevOps support. As frameworks like Grunt and Gulp are already in the limelight, companies like Microsoft have even enabled built-in support for the frameworks in Visual Studio 2015, for building JavaScript applications.

Adoption of JavaScript for enterprise development is further empowered by transpilers like TypeScript and Babel, which evangelize the idea of using object-oriented programming (OOP) in the scripting world. This idea was baptized by ECMA International when they made these features available as part of ES2015 (ES6). ECMA is also currently working on ES.Next, which is touted to be the next-generation JavaScript version.

JavaScript is also making inroads into the hardware world, a fact that is evident from various robotics and Internet of Things (IoT) platforms like Johnny-Five, NodeBots, and Cyclon.js, which use boards like Tessel, Raspberry Pi, and Arduino and have made the development of robotics and IoT easier and faster.

Any new technology that is disruptive in nature immediately receives support from the JavaScript community. A well-known example of this is 'BitAddress.org,' an open source, JavaScript-based, client-side bitcoin wallet generator. It is widely used because the bitcoin address and the private keys for the wallet are generated in the user's browser, thereby ensuring security. Another noteworthy mention is Lisk, a platform for building blockchain apps from the ground up, in pure JavaScript.

JavaScript has reached unparalleled heights and will likely continue being the dominant language on the web, for many years to come. Although concerns were raised after the announcement of WebAssembly -- a new, low-level binary format for the web -- Brendan Eich has clearly stated that it would be impossible to kill JavaScript.

JavaScript has also diminished the political divide between applications built using different technologies, like .Net and Java, and the need for specialized technology resources for development shops. A Stack Overflow developer survey from 2016 states that, "JavaScript is the most commonly used programming language on earth. Even back-end developers are more likely to use it than any other language." This was apparent from the fact that developers preferred JavaScript for full-stack (85.3%), front end (90.5%), and back end development (54.5%).

Conclusion

JavaScript will always be a great language when you want high flexibility and fast prototyping. Although legacy apps will continue to benefit from Java, .Net, and mainframe, enterprises need to adopt JavaScript quickly to be on top of application services and build applications faster and cheaper, using widely available developer skills, thereby enabling the transformation for their digital future. Though there may be reluctance from some enterprises to adopt JavaScript early, others are carried along by this new wave of adoption. Since JavaScript is one of the areas that undergoes phenomenal change in a short span of time, enterprises need to make astute and informed decisions to reap the benefits without slipping off the transformation bandwagon.

One of the main objectives of software is to automate work that would otherwise be done manually. This has multiple benefits including cost reduction over the long term, increase in productivity and profits, and the ability to channel human effort towards more important work.

Like every other industry, the insurance sector has benefited immensely from automation as well; and although a substantial number of processes are getting automated, there is always scope for more!

Consider the case of policy management systems. We recently came across an application that manages policies and the entire associated workflow - from submission and quote, to bind and issue, and so on. While a majority of the work gets automated, many processes still require human intervention. For example, when application forms are filled, scanned, and sent by an agent to the application intake team via e-mail, the team has to verify and enter the details in the policy management system manually, before an underwriter takes it forward. This kind of work seems fairly mundane, and hence, can be automated such that the application automatically extracts data from the document and saves it, thereby saving significant time and effort. Similarly, there are various other opportunities for automation in processes.

How much automation is desirable?

Having said that, how far can automation be taken? In underwriting, one of the most critical tasks is the evaluation and pricing of a risk. While it is possible to automate this task through business rules engines, is it really desirable? If yes, what level of automation is desirable?

The answer to that depends on several factors, such as the type and complexity of the insurance, and the availability of data to accurately evaluate the risk.

In certain types of insurance (standard personal lines, for instance), if the processing volume involved is large, and the risk and margin low, complete automation (straight-through processing) using rules engines might be desirable - an actuary may author the rules for evaluating and pricing the risk (for example, the evaluation may be done based on the answers provided by the customer for a series of questions). The key benefits of such automation would include consistent risk assessment and pricing along with quicker turnarounds, resulting in improved customer satisfaction. Additionally, it would also save the time and effort of underwriters and actuaries who can devote resources to evaluating and pricing more complex and higher-value risks. A survey conducted by LIMRA indicates that two in three life insurance companies, in the US and Canada, have already implemented automated underwriting for at least a part of their business, while another 32 percent are in the planning stages.
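As a hedged sketch of what such actuary-authored rules might look like in a straight-through processing flow, the snippet below scores a personal-lines risk from a customer's answers and either auto-issues or refers the case. The questions, thresholds and premium factors are entirely illustrative, not an actual rating model.

```python
# Illustrative rule set for a simple personal-lines risk (not a real rating model).
BASE_PREMIUM = 500.0
REFERRAL_THRESHOLD = 800.0  # above this, route the case to a human underwriter

RULES = [
    # (rule description, predicate over the application, premium multiplier)
    ("young driver",   lambda app: app["driver_age"] < 25,          1.40),
    ("prior claims",   lambda app: app["claims_last_3_years"] > 0,  1.25),
    ("garage parking", lambda app: app["garaged"],                  0.95),
]

def quote(application):
    """Apply every matching rule to the base premium; refer extreme risks."""
    premium = BASE_PREMIUM
    for description, predicate, factor in RULES:
        if predicate(application):
            premium *= factor
    decision = "refer to underwriter" if premium > REFERRAL_THRESHOLD else "auto-issue"
    return decision, round(premium, 2)

print(quote({"driver_age": 22, "claims_last_3_years": 1, "garaged": False}))
print(quote({"driver_age": 40, "claims_last_3_years": 0, "garaged": True}))
```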

In the case of more complex risks, while automation is possible, there could be hurdles -- like the unavailability of adequate data for accurate evaluation (for instance, a physical inspection may be required). Adding to the issues, an automated risk assessment is likely to be more conservative, resulting in missed business opportunities.

Not an all-or-nothing situation

Still, it doesn't have to be an all-or-nothing situation for automation in underwriting. Even for fairly complex risks, automation can still play a role: although the direct involvement of underwriters in evaluating the risk may be required, rules engines could provide additional insights and help underwriters take more informed decisions.

Conclusion

Advances in technology are having major impacts on the underwriting process. While underwriting is being fully automated in high-volume, homogenous lines of business (LOBs) such as standard personal lines, automation is also making rapid inroads into more complex LOBs, where underwriters increasingly rely on guidance provided by modern underwriting systems, models, and analytics to take decisions.

In order to stay ahead of the competition, insurers need to embrace technology in every possible way to improve efficiency and decrease the costs of underwriting. Given the fact that technology is rapidly evolving, insurers also need to keep a constant eye on the changes happening in the landscape.

My experience with Bare metal provisioning: OpenStack Ironic
Author: Irfan Sayed
Cloud! The name itself says a lot; no need to explain. But just think about what was there before cloud. Guess! Yes, it is virtualization. The entire community was amazed by the capabilities and features that virtualization technology provides. The ease of maintaining infrastructure and the reduced burden on cost were truly awesome. No doubt about it.

However, as technology evolved further and opened new edges of research, cloud came up. Surprisingly, it spread all over the IT sky in a very short time span; it grew like anything. Now everyone talks about the cloud: what, why, how and so on. Most organizations and products are now moving to clouds and reaping the benefits.

So, what next? When we talk about cloud, many people raise their eyebrows and ask: what about computing performance? For that I have an answer: bare metal provisioning in OpenStack, aka Ironic!

Ironic: the OpenStack bare metal hardware provisioning service

Today, I will shed light on the setup and the challenges faced while implementing it across projects.

As you might already be aware, the main purpose of the Ironic service is to provision hardware based on a configuration and let the guest operating system be installed on it remotely, so that end-to-end infrastructure provisioning is done.

Components:

Ironic has three major components:

- Ironic API, the service endpoint that the Nova compute service talks to
- Ironic Conductor, which does the actual provisioning work and talks to the other OpenStack services and the different hardware drivers
- Ironic DB, which stores node and deployment state

Configuration:

- Make sure the authentication setup is in place before executing any openstack command.
- You need to download the RC file from the Horizon dashboard and source it. The actual command: source server-openrc
- This file contains all the variables required to locate each service and its URL. It asks for a password once you enter the command; you need to enter the admin password if you are using the admin user's RC file.
- Every user has its own RC file, which contains information related to its tenant, projects, credentials, etc.
- You need to create the endpoint for the service. The service type is baremetal and the service name is ironic.
- The Ironic API and Ironic Conductor services can be on different machines. Ironic Conductor can be installed on many machines, but their versions should be the same for everything to function properly.

Database:

- MySQL is used to store all the data; the MariaDB prompt comes up for all the mysql commands.
- An ironic database and an ironic user have to be created.

RabbitMQ configuration:

- In the first attempt, the RabbitMQ portal was not working. To fix that, we had to install the management plugin, after which it started working.
- Get the RabbitMQ username and password from the Nova configuration file.

Key challenges:

- While creating the ironic database, we faced an issue with the SQL connection: the service was not able to access the MySQL connection. The reason was that, in the connection section of /etc/ironic/ironic.conf, the IP of the controller running the identity service had been provided. Instead, it should contain the entry that is in /etc/mysql/my.cnf.

Drivers:

- Ironic supports plenty of drivers to provision the hardware and install the OS. There are various third-party providers who have their own proprietary software and drivers that work with Ironic.
- The popular one is IPMI: we installed the IPMI utility, configured the service as-is and restarted it.
- It seems that the IPMI tool needs IPMI controller hardware to be present on the machine that is being provisioned.

Configuring the Compute service:

- The nova.conf file needs to be modified to add the parameters required for Ironic to work.
- Sometimes a nova.conf file is present on both boxes, the compute node and the controller node, which is a bit confusing. The file present on the node running the nova-scheduler service is the main one and is responsible for all the changes related to Ironic.
- Once all the configurations are in place, restart nova-scheduler on the controller node and nova-compute on the compute node.

Enrollment process (a minimal sketch of these steps follows below):

- While enrolling any node, we need to provide the Ironic API version. Set the environment variable: export IRONIC_API_VERSION=1.11
- The MAC address needs to be registered with the Ironic service. If there are multiple NICs, get the MAC address of the NIC that is connected to the LAN.
- A node should be in the available state so that the compute service can see it and provision the hardware. If the node is in any other state, the compute service won't see it and it cannot be provisioned.
- A node cannot be moved directly from the enroll state to the available state. It should first move to the manageable state and then to the available state.
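As a minimal sketch of the enrollment steps above, the snippet below drives the unified openstack baremetal CLI from Python. It assumes an admin RC file has already been sourced and the client with the baremetal plugin is installed; the node name and MAC address are hypothetical placeholders.

```python
# Illustrative node-enrollment flow using the "openstack baremetal" CLI (a sketch,
# not a production script): enroll the node, register its port, then move it
# enroll -> manageable -> available so the compute service can schedule to it.
import subprocess

def run(*args, capture=False):
    print("+", " ".join(args))
    result = subprocess.run(args, check=True, capture_output=capture, text=True)
    return result.stdout.strip() if capture else None

NODE_NAME = "bm-node-01"        # hypothetical node name
NIC_MAC = "52:54:00:12:34:56"   # MAC of the NIC connected to the LAN

# 1. Enroll the node with a driver; pxe_wol is the simplest one to start with.
node_uuid = run("openstack", "baremetal", "node", "create",
                "--driver", "pxe_wol", "--name", NODE_NAME,
                "-f", "value", "-c", "uuid", capture=True)

# 2. Register the MAC address of the provisioning NIC as a port on the node.
run("openstack", "baremetal", "port", "create", NIC_MAC, "--node", node_uuid)

# 3. Move the node from enroll to manageable, then to available.
run("openstack", "baremetal", "node", "manage", node_uuid)
run("openstack", "baremetal", "node", "provide", node_uuid)
```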

To summarize, bare metal provisioning is really cool stuff when you are designing a private cloud and planning to deploy an application that requires high-end computing and is very sensitive to computing performance. "pxe_wol" is the easiest driver for learning how the Ironic service works and getting acquainted enough to understand Ironic's capabilities. As I mentioned earlier, there are plenty of drivers; however, they need special hardware support and configuration to get working. Try "pxe_wol" first and move forward from there.

IT Transformation is Business Transformation! Why? How?
Author: Ravi Vishnubhotla, Senior Technical Architect (Insurance - FSI)

Today, IT transformation (IT strategy / application portfolio rationalization) has become synonymous with business transformation. This post discusses why this has happened and how it can be achieved.

Why is IT Transformation the same as Business Transformation today?

As part of IT transformation, businesses and clients go through a 3-5 year IT strategy to replace their existing legacy technology with newer or better technology. IT transformation is assessed in terms of people, process and technology. The sponsorship is mostly within the IT department, and implementation of the strategy is completely IT driven. However, in today's era, where growth is measured in terms of revenue, profit, customer service and SLAs, IT becomes an enabler for the business to achieve these goals. So when the business vision, mission and goals are considered, IT transformation automatically becomes the same as business transformation. Business users play an active role during this process and act as a key driver for its successful completion.

How can Business Transformation be achieved?

Business transformation can be achieved by using the following methodology. This methodology is one of the approaches based on my experience; it can be applied to small or medium-sized businesses and can vary depending on the business or industry. The key principle is to define steps from a people, process and technology perspective.

Business Vision
- Obtain the business stakeholders' vision of the future of their business: where do they expect the business to be a few years from now (generally 3-5 years depending on the size of the business or industry)
- Understand the overall organization and the business
- Understand the core services and business processes
- Identify the key concerns / challenges being faced in the business
- Define the key driving factors of the business
- Create a vision document and a core stakeholder group to oversee the transformation process

Current State Assessment (CSA)
- Understand the as-is business processes and business applications
- Conduct discussion sessions with business stakeholders
- Document all issues, manual processes and areas of pain points

Future State
- Define the 'to-be state' for IT systems, infrastructure and business processes
- Apply solution(s) to the business vision, manual processes and pain points
- Consider modern business and IT trends, industry standards and guidelines

Gap Analysis
- Defines what it will take to go from the current state to the defined future state
- Should consider new business processes and new IT applications
- Apply disruptive IT solutions, e.g. mobility, automation

Develop CBA and Roadmap
- Estimate the timelines and effort for the various solutions defined
- Perform a cost analysis by considering the price of infrastructure, IT systems (product / in-house development), hiring new people, and introducing new processes
- Break down solutions into various projects and assign stakeholders from either IT or the business
- Propose a roadmap to roll out the solutions

Review and finalize strategy
- Review the proposed transformation process as a draft via a presentation or a document
- Conduct sessions with the various business unit stakeholders and IT stakeholders
- Agree on the proposed solution and roadmap
- Refine and resolve any open issues or questions
- Baseline the strategy for CTO and CEO approval

To summarize, this is how the business transformation process will look:

The steps defined here are based on my experience working with various customers and clients. The process or approach can vary and will be different depending on the business and industry. This is not a one-size-fits-all methodology, but it should give you a fair idea as to what it takes to achieve business transformation from an IT perspective.

Macro to Microservice Architecture - Shrink or Sink? Part-2
Author: Archana Kamat, Product Technical Architect

In my previous blog, "Macro to Microservice Architecture -
Shrink or Sink? Part 1", we explored the basic characteristics of MSA and how
it differs from Service-oriented Architecture (SOA). While MSA enables higher
service independency, it cannot be applied to all business scenarios.

To better understand MSA's benefits and challenges, let us consider
the real-world use case of an online shopper's basket. Typically, the shopping
process involves the buyer browsing through different product catalogs after
which he or she will interact with the shopping basket to add or remove items. Finally,
the buyer checks out the shopping basket by providing required information.

The illustrative class diagram shows SOA- and MSA-style designs for this use case (see the sketch after this paragraph). In SOA, related functionalities are encapsulated in one service with modular operations/methods. Cross-cutting logic that is common at the service level is modeled as reusable operations, while cross-cutting logic common across all services/components is modeled as separate foundational services and consumed by other services. In MSA-style design, on the other hand, every function is modeled as an independent service with its own data access logic, audit logic, etc.
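Since the class diagram itself is not reproduced here, the following hedged sketch suggests the same contrast in code: one coarse-grained basket service with a shared foundational audit service in the SOA style, versus fine-grained, independent services that each carry their own audit logic in the MSA style. All class and method names are invented for illustration.

```python
# SOA style: related basket functionality encapsulated in one service, with a
# cross-cutting concern (auditing) factored out into a shared foundational service.
class AuditService:
    def record(self, event: str) -> None:
        print(f"[audit] {event}")

class ShoppingBasketService:
    def __init__(self, audit: AuditService):
        self._audit = audit
        self._items = []

    def add_item(self, item: str) -> None:
        self._items.append(item)
        self._audit.record(f"added {item}")

    def remove_item(self, item: str) -> None:
        self._items.remove(item)
        self._audit.record(f"removed {item}")

    def checkout(self, payment_details: dict) -> str:
        self._audit.record("checkout")
        return f"order placed for {len(self._items)} item(s)"

# MSA style: every function is its own independently deployable service, each
# owning its data access and audit logic (the duplication is deliberate).
class AddItemService:
    def handle(self, basket_id: str, item: str) -> None:
        print(f"[audit] basket {basket_id}: added {item}")

class RemoveItemService:
    def handle(self, basket_id: str, item: str) -> None:
        print(f"[audit] basket {basket_id}: removed {item}")

class CheckoutService:
    def handle(self, basket_id: str, payment_details: dict) -> str:
        print(f"[audit] basket {basket_id}: checkout")
        return "order placed"

# The consumer of the MSA variant orchestrates several fine-grained calls
# (the "chatty interfaces" challenge discussed below).
AddItemService().handle("basket-42", "book")
CheckoutService().handle("basket-42", {"method": "card"})
```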

Using the above example, we can analyze the benefits and
challenges of MSA.

Benefits of MSA:

Simplicity - Since each service performs only one function,
it is simpler to understand and enhance

Flexibility - As each service is independent and loosely-coupled
in all aspects, there is greater flexibility for technology selection, design
pattern, modification, replacement, enhancement of functionality, and
deployment

Parallel development and DevOps model - Multiple developers
with different skill-sets can be engaged in development and software
development lifecycle (SDLC) activities. With its high granularity and
isolation, MSA can support the DevOps model, thereby enabling continuous
delivery and frequent releases without impacting availability and stability of
the remaining systems.

Scalability - Individual services can be easily scaled up without
impacting other services

Agility - With better simplicity, flexibility, scalability,
etc., it is easier to incorporate modifications to existing services as well as
on-board new services.

Challenges of MSA:

Chatty interfaces - As seen in the illustration, MSA requires
more interfaces when compared with SOA. To execute a simple workflow of a shopping
basket, the service consumer needs to instantiate and invoke a number of services.

Higher cost - The decentralization or isolation of team, technology
stack, process, and governance can create chaos and increase costs. Further,
fine-grained services can result in overheads in collaboration,
synchronization, etc.

High resource provisioning - Owing to higher number of
services in MSA, there is a greater need for resource usage such as CPU, memory
and network for interaction, data marshalling, containerization, and other
processing needs of chatty interfaces.

Security issues - Security risk increases as more and more
services become disjointed.

Production and operational costs - The cost and effort to deploy,
manage and monitor services is higher for MSA than SOA. For instance, one log
analysis in SOA style may require N log analyses in MSA style for the same use
case.

The good news is that, to handle the challenges involved in
MSA, there are a number of new tools and technology products coming up such as those
for automated deployment, monitoring, aggregation of logs, and other
maintenance activities. However, one still has to consider all existing challenges
before transitioning to MSA style.

Deciding what's best for your business

Enterprises with heavy core lines of business (LOB) systems may
not benefit a lot from a pure MSA approach. Such businesses can adopt a hybrid
approach that incorporates the best of both styles such as moderate granularity
and isolation from MSA along with the cohesiveness, uniformity and governance of
SOA.

MSA may be the best-fit for use cases that do not depend on core
business logic and are peripheral or accelerators to the business. For example,
consider an e-Commerce application that needs to launch a quick promotion
program for special discounts, auctions, etc. Such requirements have a short lifespan
and may last only for a specific season. Here, agility becomes more important
than long-term durability or maintenance. In such a situation, MSA is a better
fit since multiple, disciplined and cross-functional agile teams can execute a "quick
'n dirty" (Q'n'D) but robust implementation and get these promotions on-board
within the desired timeframe.

Conclusion

The success of any architecture paradigm, whether new or
old, depends on understanding the key characteristics and patterns of the architecture
style. This is an important step since most architecture styles may not directly
fit the requirements of an enterprise without some degree of customization. For instance, to get the most from a style, architects have to tweak it by customizing and simplifying it, and sometimes by complicating it too.

The same was true of SOA. When SOA was introduced as a new style, many small, medium and large enterprises transitioned to SOA from the traditional layered architecture style. Many success stories reaped benefits, but there were some failures too.

Similarly, with the microservices bandwagon, the industry is stirred up by its adoption wave. Before embarking on the transition journey at large scale, it is essential to analyze the challenges and benefits relevant to the enterprise's requirements. In some cases the granularity and isolation of MSA may pay off, but in other cases the cohesiveness, redundancy and centralization of SOA may be worth considering.

After all, no single architecture is a silver-bullet for all
requirements.

The world of software service architecture is witnessing rapid
change owing to a new paradigm named Microservice Architecture (MSA). There are
several debates and questions about this newcomer. Sample these:


Is MSA over-hyped?

Is MSA opponent to service-oriented architecture (SOA) or
merely a specialized form of SOA?

What are the benefits and challenges of MSA?

To answer these and other questions, let us first explore
the value of MSA versus SOA by understanding their definitions and
characteristics.

Service-oriented Architecture (SOA) is a matured software
architectural style that has evolved over the last decade and comprises of autonomous,
stateless and loosely-coupled services.

Microservice Architecture (MSA) is a budding software
architecture style with a suite of small, granular and independent services.

While both these architecture styles have services at their
core, the differentiating factor is the degree of granularity and independency
(isolation) of their services.

Independency in MSA: With MSA, the services provided should
be independent of each other across the technology stack, standards,
development teams, deployment, infrastructure, governance, etc. Owing to such isolation,
it becomes acceptable to enable service duplication for functional logic,
cross-cutting aspects logic, etc.

Independency in SOA: In SOA, services should be autonomous
and loosely-coupled, which means one service cannot depend on the state of
other services. Though loose-coupling is the objective of SOA too, it
emphasizes cohesiveness across
services in terms of reusable functionality, unified technology stack, unified
infrastructure, centralized governance, and coherent teams.

Granularity in MSA: The service granularity needs to be very
small or fine. Although there are no formal benchmarks for the lines of code (LoC)
required for each service, the optimum number to consider is less than or equal
to 100 LoCs per service. Here, each granular functionality is modeled as a
service. Further, service consumers can integrate multiple services to achieve
a business function.

Granularity in SOA: While SOA can also model the
fine-grained services as delivered in MSA, the recommended granularity for SOA is
larger. Granularity is at a discrete business function level instead of a minute
function level. Each service can be further modularized with independent
methods or operations. Common cross-cutting aspects such as data access, logging,
auditing, exception handlers, etc., are isolated from individual services and
modeled as common horizontal services. Thus, SOA recommends loose-coupling and
high-cohesion with balanced granularity.

To meet business objectives of early and frequent business value delivery, faster time to market and better productivity, and to find better alternatives to the waterfall way of working, companies are increasingly adopting distributed, enterprise-wide global agile.

Simultaneously, companies have started looking to renegotiate waterfall contracts into agile ones to keep the legal contract and the delivery model in sync, working with their own business, legal, IT and finance units to devise appropriate agile contracts in collaboration with vendor partners like Infosys.

Agile contracts are different and challenging, as they require a change in mindset among all stakeholders.

Agile contracts treat the work under consideration as a knowledge-creation service; they are more productive as they are based on small chunks of work/scope and rapid feedback, and they can be executed through incremental/iterative delivery.