Is your organization a startup or in its early stages? In its new member blog, Alvarez & Marsal Taxand shares opportunities around the 2018 Qualified Small Business (QSB) R&D Tax Credit for your small business. Read on to learn more!

Last year, many early-stage companies, including many of our tech industry startups, significantly reduced payroll taxes thanks to a new federal tax credit. The Qualified Small Business (QSB) R&D Tax Credit, passed into law in 2015, allows qualified small businesses to offset OASDI (i.e., Social Security) taxes with R&D tax credits originally claimed on federal income tax returns. The 2018 calendar year presents yet another opportunity for companies to realize these savings. How are these savings achieved?

1. Assess Eligibility – Companies must have less than $5 million in revenue and no revenue prior to 2013. The credit is most applicable to companies with early, heavy R&D expenditures before they start generating revenue. Common examples include companies in deep technology, biotech and life sciences, and manufacturing. Further, companies with at least $300K in annual payroll are most likely to realize a material benefit. A quick screening sketch follows this list.

2. Plan Ahead – Companies seeking QSB R&D Credits this year must first claim the credit on their 2017 federal income tax returns. Since a company may not amend an income tax return to claim the credit, it is important that companies interested in this credit NOT file federal income tax returns without first assessing whether they qualify. Remember, savings cannot be realized until one quarter AFTER the income tax return is filed. So if you’re planning to take advantage of the QSB Credit opportunity, DO NOT WAIT! 2017 returns must be filed by March 31, 2018 if you want to bank these savings in the second quarter of 2018.

3. Seek Assistance – While QSB Credits often produce significant savings, they are not a “free lunch.” Not only do companies without experience in this area run the risk of leaving savings on the table, but they could also incur interest and/or penalties related to payroll tax mistakes. Reach out to an advisor for assistance in assessing if QSB Credits make sense for your business.
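
As a rough screen of the thresholds in step 1, the criteria translate into a simple check. This is an illustrative sketch, not tax advice; note that the $300K payroll figure is the materiality rule of thumb from above, not a legal requirement.

```python
# Illustrative only: a quick screen against the step 1 thresholds.
def qsb_screen(revenue, first_revenue_year, annual_payroll):
    return {
        # Under $5 million in revenue and no revenue prior to 2013
        "may_qualify": revenue < 5_000_000 and first_revenue_year >= 2013,
        # $300K+ annual payroll is where the benefit tends to become material
        "likely_material": annual_payroll >= 300_000,
    }

print(qsb_screen(revenue=2_000_000, first_revenue_year=2015, annual_payroll=450_000))
# {'may_qualify': True, 'likely_material': True}
```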

Our team at Alvarez & Marsal Taxand is ready to assist any company interested in these savings. For more information, please click here.


By Nathan Self, Research Associate, Department of Computer Science, and faculty at Discovery Analytics Center, Virginia Tech. Self will be participating on the Opportunities, Challenges and Future Trends in Advanced Analytics panel at the second annual Capital Data Summit on Feb. 28, 2018.

Imagine your biggest spreadsheet. Too many rows and columns to take in at once. At the Discovery Analytics Center (DAC) at Virginia Tech we are interested in how humans and machines can work together to make sense out of all that data. Andromeda is an example of how analysts can combine sophisticated machine learning algorithms with interactive visualization to get insights from their data.

Let’s assume that your enormous spreadsheet has a row for every customer and a column for every statistic you keep about each customer. Andromeda draws a scatterplot in which each point represents a customer. Points that are close to each other represent rows that are similar to each other. Likewise, distant points represent dissimilar rows. To begin with, Andromeda assumes that each column is equally important.

This is where the human aspect comes in. You know what the data actually represents, and you can interact with Andromeda in two ways: (1) change the importance of a column, and the points of the scatterplot will regroup to preserve the “near is similar” constraint; or (2) reach right into the scatterplot and move points closer to or farther from each other, and Andromeda will compute which columns must be important for those points to be considered similar.
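
For the technically curious, here is a minimal sketch of interaction (1). It assumes a weighted-distance layout in the spirit of Andromeda’s weighted multidimensional scaling; the data, column count and weights are stand-ins, and the real system is far more sophisticated.

```python
# Minimal sketch: lay rows out as 2-D points so that "near means similar,"
# with per-column importance weights the analyst can change.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

data = np.random.rand(50, 4)   # 50 customers x 4 statistics (stand-in data)
weights = np.ones(4)           # to begin with, every column is equally important

def layout(data, weights):
    # Scaling each column by sqrt(weight) makes differences in heavily
    # weighted columns count for more in the pairwise distances.
    dists = squareform(pdist(data * np.sqrt(weights)))
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    return mds.fit_transform(dists)   # one (x, y) scatterplot point per row

points = layout(data, weights)   # initial layout, all columns equal

weights[2] = 5.0                 # interaction (1): make column 2 more important
points = layout(data, weights)   # the points regroup under the new weights
```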

Andromeda, with new algorithms and new paradigms of user-algorithm interaction, serves as a good example that complex statistical methods do not necessitate complex user interfaces or expert users. The selling point of Andromeda is that you don’t have to know that Andromeda uses an algorithm called weighted multidimensional scaling to lay out the scatterplot. You don’t have to know that data scientists at DAC developed inverse multidimensional scaling to handle interacting with the points. In fact, users have effectively generated insights despite having no experience with these, or similar, algorithms. And, their insights are more complex than when they use spreadsheets alone.

Countless other statistical methods are similarly well suited to current data analysis needs. Any of them would be well served by intuitive interfaces for people who are not experts in statistics.

There is an adage that the same statistics can be used to justify either side of an argument. Machine learning has the same malleability. Understanding exactly what a machine learning tool like Andromeda can do for you — as well as its limitations — is important for deciding what to do with its outputs.


The chip vulnerabilities announced two weeks ago, known as Meltdown and Spectre, affect almost every PC, Mac, laptop, tablet, and smartphone created in the last 20 years. Passwords, personal information and any secure information on a device are at risk.

How It Works
These vulnerabilities affect the “kernel” or core of the operating system, which acts as a bridge between the hardware and software of a machine. The core handles everything from typing and clicking to opening and running applications like web browsers and Microsoft Outlook. The core provides each process with the resources it needs to function and keeps the processes isolated.

When exploited, this security flaw allows an attacker to subvert this isolation and read all of the protected data on a computer. These data could include anything from passwords to personally identifiable information from your tax program. These attacks can be especially damaging for cloud services because they run shared setups where users share hardware but are isolated by software. By hacking one user’s cloud instance, hackers can use these security flaws to see all the data on the shared hardware. Fortunately, the security flaw is difficult to exploit, and attackers must compromise a machine before they can exploit these chip vulnerabilities.

What’s Being Done?
Companies like Intel, Apple, Google, and Microsoft have released patches to defend against these security flaws. The patches further isolate the core’s memory but may degrade performance by as much as 30 percent. Most vendors report that general computer users will not see such a large decrease in performance.

LMI is implementing a remediation plan to patch all vulnerable systems. We will test the patches against a pilot group of machines before releasing them to the rest of the organizational ecosystem. This will allow our team to identify and address potential issues with the patches to maintain LMI operations as the patches are implemented.

To protect your personal systems, install patches and operating system updates as soon as they are released, and make sure your web browsers are up to date. Most browsers automatically update, but it is beneficial to verify that they have been patched.
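
On a personal Linux machine, one quick way to check is to read the vulnerability status files that patched kernels (4.15 and later) expose; a minimal sketch, with the caveat that an absent directory means the kernel is too old to report status, not that it is safe:

```python
# Minimal sketch: print the Linux kernel's own report on these chip flaws.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")

if vuln_dir.exists():
    for entry in sorted(vuln_dir.iterdir()):
        # Each file is named for a flaw (e.g. meltdown, spectre_v2) and
        # contains the mitigation status, such as "Mitigation: PTI".
        print(f"{entry.name}: {entry.read_text().strip()}")
else:
    print("This kernel predates vulnerability reporting; update the OS first.")
```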

Looking Ahead
As an organization directly involved in the cyber space, LMI is aware that security flaws and exploits will continue to be a concern. This specific scare is far less concerning than the sheer number of security vulnerabilities we saw in 2017, and we expect even more in 2018, given the evolving nature of attackers and the spread of far-reaching security flaws. Our team will continue to adapt to the ever-changing cyber threat landscape as threat actors change their tools and techniques.

Jonathan Stammler is the Information Security Manager for the Enterprise Technology Services group at LMI. He received an MS in information security from Georgia Institute of Technology and a BS in information technology from George Mason University. If you’d like more information on how LMI can assist your organization with its cybersecurity needs, please email Jonathan.

Gartner estimates there were 8.4 billion connected IoT devices in use in 2017, a number forecast to reach 20.4 billion by 2020. Tenable President, Chief Operating Officer and Co-Founder Jack Huffard discusses how the proliferation of digital assets and connected devices is creating an exposure gap in cyber defense — and shares how organizations can fight back against cyber-attacks. Huffard participated on the Successful Cybersecurity Growth Companies In the Region panel at the Capital Cybersecurity Summit on Nov. 15, 2017.

It’s been more than two years since the Office of Personnel Management (OPM) disclosed one of the largest data breaches in history, but just last week, the agency’s inspector general gave it a failing grade in critical areas like risk management and contingency planning.

In addition, the data breaches and attacks we’ve recently seen across a variety of industries, including entertainment, critical infrastructure, retail and finance, make it clear that organizations across the board are still failing at basic cyber hygiene.

Today, a company’s assets are no longer just laptops and servers; they include mobile devices, internet-connected appliances and the cloud. The latest research shows the number of these assets is only going to increase. For example, Gartner estimates there were 8.4 billion connected IoT devices in use in 2017, on the way to 20.4 billion by 2020. And according to a 2016 IDG Enterprise Cloud Computing Survey, 70 percent of organizations already have apps in the cloud and another 16 percent will within 12 months. This modern, elastic attack surface, where the assets themselves and their associated vulnerabilities are constantly expanding, contracting and evolving, has created a massive gap in organizations’ ability to truly understand their cyber exposure at any given time.

Another major component of today’s elastic attack surface is operational technology (OT), particularly given the growing risk of cyber-attacks against critical infrastructure sectors. A recent Ponemon Institute study on the state of cybersecurity in the U.S. oil and gas industry found, for example, that OT targets now account for 30 percent of all cyber-attacks. As with cloud and IoT assets, the cyber exposure gap is exacerbated by the mismatch between the cyber measures critical infrastructure companies deploy and the rapid pace of digitization in their operations. Operational technologies present an additional challenge: they often can’t be assessed with the same approaches as IT assets, creating blind spots for security operations and compliance teams.

We recently announced a partnership with global engineering and technology leader Siemens that aims to address those unique risks. The product, Industrial Security from Tenable, was designed specifically for industrial control systems and will be delivered through Siemens to give energy and utilities companies full visibility into production networks to reduce compliance risk and their cyber exposure.

Both public and private organizations in every sector need to change their approach to cyber risk to effectively manage their cyber exposure. That starts with understanding and protecting what matters most across their entire attack surface. And it means looking at server and endpoint hardening, IoT discovery and hardening, container and web app vulnerability identification and OT asset and vulnerability detection.

Understanding risk and cyber exposure is also an awareness issue that should start at the top. If the C-suite and board of directors know which areas of their business are secure or exposed, that knowledge can drive strategic business decisions, including where and how much to invest to reduce risk. Attackers will always find the weak link, and right now there are too many weak links – even more than companies are aware of.

This year alone, there were several high-profile, large-scale cyber-attacks, including the NotPetya destructionware, the CrashOverride/Industroyer threats to critical infrastructure, and the Reaper IoT botnet. These attacks claimed millions of dollars in company damage and compromised sensitive customer data; no organization wants to experience one of these security headlines firsthand. Only with a holistic approach that starts with basic cyber hygiene – visibility to identify all assets and their vulnerabilities – can companies secure today’s complex attack surface.


NVTC’s newest guest blog post from Exostar explains why new government regulations are giving organizations a fresh concern when it comes to cybersecurity. Exostar’s Senior Vice President of Product Development Vijay Takanti will be part of the panel discussion, NIST 800-171: Is the Government Paving the Way for Commercial Security? at the 2017 Capital Cybersecurity Summit November 14-15.

Cybercrime is on the rise, and could cost businesses over $2 trillion by 2019. These losses could be the result of outright theft, lost productivity, impact to customer confidence or costs associated with repairing breaches. But a new, equally ominous risk associated with cybersecurity is emerging for both government contractors and downstream commercial businesses—the risk of losing current and future contracts due to non-compliance with new government standards.

Department of Defense contracts now include a clause, DFARS 252.204-7012, “Safeguarding Covered Defense Information and Cyber Incident Reporting.” The new clause requires contractors (and their extended supply chains) to implement NIST SP 800-171 cyber safeguards by December 31, 2017 – or at least have a coherent plan for doing so.

NIST SP 800-171 is a set of 110 security controls regulating the handling of sensitive (but not classified) data. Most organizations in the aerospace and defense industry are well aware of these standards and their application to the DFARS mandate by now. However, organizations that don’t work directly with the government may still get pulled into NIST 800-171 compliance because of the global, multi-tiered nature of prime contractors’ supply chains.
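
As a rough illustration of what “having a coherent plan” can look like in practice, a contractor might keep a structured inventory of control status and derive its plan-of-action items from the gaps. This is a hypothetical sketch, not an official tool; the control IDs and families below are real 800-171 entries, but the statuses are invented.

```python
# Hypothetical sketch: track NIST SP 800-171 control status and list the gaps.
from collections import Counter

controls = {
    "3.1.1":   {"family": "Access Control",                       "status": "implemented"},
    "3.5.3":   {"family": "Identification and Authentication",    "status": "planned"},      # multifactor auth
    "3.13.11": {"family": "System and Communications Protection", "status": "in_progress"},
}

print(Counter(c["status"] for c in controls.values()))

gaps = [cid for cid, c in controls.items() if c["status"] != "implemented"]
print("Plan-of-action items:", gaps)   # ['3.5.3', '3.13.11']
```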

Keep in mind that the supply chain on any given project can include hundreds or even thousands of suppliers who are privy to covered defense information (CDI). The greater the volume of suppliers and the information they exchange, the more vulnerable they are to cyber-attack and CDI compromise. Even small pieces of information need to be protected at all times.

The NIST 800-171 rules are designed to protect this sensitive information as it moves across every level of the supply chain. If even one link in the chain is insecure, it could spell trouble for all parties participating in a government program. Officially, the government can include NIST 800-171 compliance as a requirement for contracts once the rules are in effect. Organizations that are not compliant will be unable to bid on those contracts, and their existing contracts could be in jeopardy.

Organizations that are not compliant with these new cybersecurity controls run the risk of losing out on business, as primes and larger suppliers select preferred vendors who can demonstrate proper cybersecurity hygiene.

The deadline is looming. Mitigate the latest cybersecurity risk by understanding and implementing the NIST 800-171 security controls now, or find a qualified partner to help you do so.


Telos Corporation CEO and Chairman of the Board John Wood addresses cloud security in his new guest blog. Wood will be moderating the State of Cloud Security and Compliance panel at the Capital Cybersecurity Summit on Nov. 14-15 at The Ritz-Carlton, Tysons Corner.

It’s not exactly clear when the term “cloud” was first used to describe shared pools of configurable IT resources. However, it’s safe to say that it started creeping into our lexicon less than ten years ago.

Back then, the official definition of cloud was even less clear than it is today. Regardless of what the cloud actually was, this mysterious cloud entity was widely assumed to be unsafe.

That said, even from the beginning, I saw that the cloud offered many security advantages, especially to smaller companies that couldn’t afford the infrastructure investments and the highly skilled staff needed to manage complex IT systems in their own on-premises data centers. Still, doubts about cloud security swirled.

But in 2014, a crazy thing happened. Defying conventional wisdom, the CIA, arguably the most security-conscious organization in the world, announced its plan to work with Amazon Web Services (AWS) to adopt commercial cloud services. Shortly thereafter, C2S – the agency’s Commercial Cloud Services environment – was born.

Even though countless other agencies had already adopted the cloud by 2014, the CIA and C2S gave the cloud instant credibility. They made federal agencies and highly regulated commercial organizations realize that if cloud technology was good enough, and secure enough, for the CIA, it must be secure enough for them. Granted, C2S is an isolated environment, but it was noteworthy that the CIA made the often-trumpeted “cloud first” policy a reality.

AWS recognized early on that security was important to ensure continued, widespread adoption of cloud services. To that end, it introduced the shared responsibility model to help explain the security benefits you derive simply by hosting your workloads within AWS. Under this model, the customer is responsible for security in the cloud, and AWS is responsible for security of the cloud.
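
As a simplified illustration of that split (the exact boundary shifts with the service type, e.g. raw EC2 instances versus managed databases):

```python
# Simplified, illustrative split under the AWS shared responsibility model.
shared_responsibility = {
    "AWS - security OF the cloud": [
        "physical facilities and hardware",
        "hypervisor and host operating systems",
        "global network infrastructure",
    ],
    "Customer - security IN the cloud": [
        "data classification and encryption",
        "identity and access management (IAM) policies",
        "guest operating system patching",
        "security group and firewall configuration",
    ],
}

for party, duties in shared_responsibility.items():
    print(party, "->", ", ".join(duties))
```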

Not only does this shared responsibility model help address a number of security questions, especially in the areas of infrastructure and physical security, it also helps clients demonstrate compliance more quickly and efficiently, because they can inherit control results directly from AWS.

AWS certainly isn’t the only cloud service provider (CSP) in the game – Azure and Google also understand how important the message of cloud security and compliance is to drive further cloud adoption.

Despite all of this, it is essential for organizations to understand the potential security pitfalls of cloud adoption, starting with knowing where the cloud service provider’s responsibility stops and the customer’s starts. There have been a number of recent breaches resulting from unsecured cloud-based database deployments. Customers need to understand, and take seriously, their responsibility for protecting their systems, their applications and their data.

The cloud has come a long way over the last ten years. Much progress has been made to enhance security and promote these security and compliance benefits. However, there is still work to be done to address lingering security concerns, questions and perceptions to help drive even broader adoption of cloud services.

If you’d like to hear what CSPs have to say about the myth of cloud insecurity, join me on Wednesday, November 15 at NVTC’s Capital Cybersecurity Summit. I will be moderating a panel that will discuss the current state of cloud security and compliance, featuring prominent voices from the big three cloud providers: Google, Microsoft and AWS. I hope to see you there!


The world’s total digital data volume is doubling in size every two years, prompting organizations to find new ways to secure their complex data. In their new NVTC member blog, LMI provides tactics for determining cybersecurity threats in your organization’s digital supply chain and securing critical data.

The world’s total digital data volume is doubling in size every two years, and by 2020 will contain nearly as many digital bits as there are stars in the universe. Most of this data is created and communicated over the Internet, whose “population grew by more than 750 percent in the past 15 years to over 3 billion. This population shares more than 2.5 million pieces of content on Facebook, tweets more than 300,000 times, and sends more than 204 million text messages—every minute.”

With the advent of the Internet of Things and other innovative technology platforms, organizations must continuously analyze and secure their complex data. For supply chain operations, digitalization has enabled leaders to access data faster and build stronger connections within a given supply chain. While there are clear benefits of the digital supply chain, there are challenges that need to be overcome in order to realize its full potential.

The volume of data is skyrocketing as diverse data sources, processes and systems show unprecedented growth. Companies are trying to capture and store everything, without first establishing the data’s business utility.

The fact is, technology is enabling this proliferating data complexity. Continuing to ignore the need for an enterprise data strategy and information management approach will not only increase “time to insight,” but may actually lead to incorrect insights.

Perhaps none of these challenges is as critical as an organization’s ability to successfully secure its supply chain data, given the IT security risks posed by the Internet. In fact, 30 percent of supply chain professionals are “very concerned” about a data breach. These concerns are well founded: the number of cybersecurity breaches is growing by 64 percent every year, with 60 percent of cyber breaches linked to insiders—current and former employees, contractors, service providers, suppliers and business partners.

Unfortunately, many organizations are unaware of the security vulnerabilities within their supply chain or how to identify them. To start determining your vulnerabilities, answer the following three questions (a scoring sketch follows the list):

1. How will the product be used and managed in the system? While any system breach is bad, the compromise of a system managing classified data is much worse than that of a system managing publicly available data. Understanding the use of the Information and Communication Technology (ICT) equipment will help determine the resources appropriate to secure the system. In reviewing the product use, consider what other systems are connected to the focus system. A less secure system can serve as a pathway to attack a more highly secured connected system; this was the method used to steal credit card numbers from Target in 2013.

2. How is the system connected to the rest of the world? A system that is connected to the public Internet will need more reliable security, since it would be easy to find and attack. On the other hand, a system that is isolated from any other network would have a much lower risk of attack or data breach, since the attacker would need to be in physical proximity of the system.

3. Who are the system users? Are the users internal employees who are trained on security procedures, or is the system accessed by a public user base that may not recognize risky security behaviors? Simple security procedures, such as keeping passwords secret and maintaining current anti-virus software, cannot be counted on if you do not directly control users’ environments.
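
Taken together, the three questions can be folded into a rough triage score for each system in the supply chain. The categories and weights below are hypothetical placeholders, a minimal sketch rather than LMI’s methodology:

```python
# Illustrative only: score a system's exposure from the three questions above.
DATA_SENSITIVITY = {"public": 1, "internal": 2, "regulated": 3, "classified": 4}
CONNECTIVITY = {"isolated": 1, "private_network": 2, "public_internet": 3}
USER_BASE = {"trained_staff": 1, "mixed": 2, "general_public": 3}

def risk_score(sensitivity, connectivity, users):
    """Higher scores suggest the system deserves more security resources."""
    return (DATA_SENSITIVITY[sensitivity]
            * CONNECTIVITY[connectivity]
            * USER_BASE[users])

# Example: an internet-facing system holding regulated data, used by the public.
print(risk_score("regulated", "public_internet", "general_public"))  # 27
```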

By answering these questions, organizations can quickly and effectively determine the security vulnerabilities within their digital supply chain. Organizations can also contact our cybersecurity experts, who can help monitor, prioritize, and effectively manage risks to create an optimal level of security based on mission priorities and resource constraints.


What do cloud and AI mean for human resources? Will automation replace human resource functions and associates? Read on to find out in Insperity’s newest NVTC guest blog.

Cloud-based tech solutions for human resources offer the promise of easy installation and implementation, but does such software really eliminate the need for HR staff?

The short answer is no.

While new technical offerings can improve the efficiency and speed of many HR processes, the human touch is still needed to get the most out of the software.

For example, you’ll still need someone to “operate the machinery,” so to speak, or administer the software. In a smaller company, that may be one combination payroll and HR person. In a larger company, you may need one employee to do nothing but maintain, update and run the software so that your company gets the most from its capabilities.

When HR software works best

Technology is your friend when it comes to the tactical aspects of human resources. For instance, an online time-tracking system that ties into your payroll and government reporting systems can save significant time and improve accuracy over manual tracking and handwritten reports. HR software can also help with:

Tracking critical HR data, such as hours worked by project or department, turnover and more

Managing garnishments, reporting and mandatory requirements that vary by state

For example, a company operating in a big state like Texas may not be accustomed to the HR complexities of hiring across state lines. But open an office in New York, and you could have employees who work in that state but live in Connecticut or New Jersey.

HR software can help ensure your compliance with multiple states’ payroll tax requirements, and prevent you from having to learn and implement such widely disparate laws on the fly. The best-case scenario is when you have the right software in place to facilitate efficiency and compliance, with access to experienced HR professionals to guide you.
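
As a toy illustration of the kind of lookup such software automates (the rules here are drastically simplified and hypothetical; real payroll tax law is far more nuanced and changes often):

```python
# Toy sketch: which states' payroll rules come into play for one employee?
NO_WAGE_INCOME_TAX = {"TX", "FL", "WA", "NV", "SD", "WY", "AK"}

def states_to_evaluate(work_state, home_state):
    """Both the work state and the home state can impose requirements."""
    return {s for s in (work_state, home_state) if s not in NO_WAGE_INCOME_TAX}

print(states_to_evaluate("NY", "CT"))  # {'NY', 'CT'} - evaluate both states
print(states_to_evaluate("TX", "TX"))  # set() - no state income tax withholding
```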

What to look for in HR software

Once you’ve decided whether an HR software package delivers the basic functions your business needs and will help drive company goals, it’s time to take a deeper dive into its functionality.

Some questions to consider:

What purpose will this software serve? Will it eliminate, add to or integrate with your existing systems?

Who will administer the software? Will they require extra training? If yes, how much? How much training is included in the price?

Is this software backed by HR on demand? For example, even with the best software, you’ll still have the occasional compliance question. Look for a software solution with human support.

Will this software integrate with other existing software for payroll, time and attendance, or enterprise resource planning (ERP)?

As you talk to software vendors, it’s vital you involve frontline workers who operate existing systems to help you evaluate any new HR software and its integration requirements. Depending on your current set-up, this may mean you bring in the payroll administrator, ERP data manager, compliance officer or the HR specialist managing the current performance system.

These are the people who can help you avoid the costly mistake of buying software that ultimately will not “play nice” with your other systems, since they know the intricate details of how your existing systems really work.

Why leadership is still needed

Yes, software can help a company align its objectives and drive engagement through performance management, employee feedback mechanisms, people analytics, training, and compensation and rewards systems. But no software will ever replace a leader who communicates, inspires and motivates employees to achieve the organization’s goals.

As a business grows, it becomes harder to keep employees aligned with the company’s goals and strategies. Software can help keep your ship on the right course, but at the end of the day, any technology solution is only as good as the people behind it.


NVTC’s newest blog is by Dovel Technologies Vice President Mike Atassi. Atassi recently moderated the healthcare data analytics panel at the inaugural Capital Health Tech Summit on June 15, 2017.

Data is being generated at unprecedented levels – with more than 2.5 quintillion bytes being created every day. Unlocking the potential value of this data will help accelerate research, develop targeted therapeutics and improve the delivery of healthcare. Today’s information and computational sciences and technologies are playing a critical role in delivering better healthcare to everyone.

Accelerating the path to discovery and finding targeted therapeutics for some of the most chronic diseases is a promise that can largely be fulfilled by exploiting available data. Whereas primary investigation has historically been the most important source of new data and discoveries, today data scientists are curating existing data to make it searchable, accessible, interoperable and reusable.

A panel of experts discussed the role of data analytics in the continuum of health at the recent NVTC Capital Health Tech Summit and provided valuable lessons on how to protect, govern, and transform data into valuable information and health insights. The panelists discussed different ways to enable health data to be searchable, accessible, interoperable and reusable. Key themes from the discussion included:

Building a data-rich infrastructure: Incorporating genomic and proteomic data into clinical delivery is a challenge that is being met with innovation in technology and information architecture, transforming large, disparate data sets into consumable, actionable packages.

Utilizing advancing technologies: Deploying machine learning and predictive analytics alongside the data, processes, and workflows that already exist within hospitals can help predict and prescribe new protocols. For example, the use of predictive analytics and machine learning resulted in a 39 percent reduction in patient falls in just six months at a local hospital (a toy sketch of this kind of model follows the list).

Improving wellness: Enabling the delivery of integrated wellness, disease management, and healthcare services to the community based on insights from data. For example, data analytics is playing a key role in improving the effectiveness and global efficiency of transfusion medicine and cellular therapeutics.

Reducing risks: Helping to prevent the spread of major diseases, such as the Zika virus, by integrating datasets from multiple sources to identify geographic risk patterns. Data also allows for the benchmarking of activities to guide decisions that will make sure that the right person gets the right treatment at the right time.

Preparing a new generation of data scientists: Bringing together interdisciplinary individuals with domain and technology expertise to develop leading public health and precision medicine professionals. Today, many institutions of higher education offer advanced degrees in data science – combining knowledge of the biological sciences with the computational and mathematical sciences to produce a generation of data scientists capable of unlocking the value hidden in large and complex data systems. Data scientists are already showing tremendous progress in biomedical computing, developing meaningful solutions for analytics, visualization, and data management and governance.
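
As a toy sketch of the predictive-analytics theme above (synthetic data and hypothetical features, not the hospital’s actual model), a fall-risk score can come from a simple classifier over routinely collected clinical variables:

```python
# Toy sketch: score patient fall risk from synthetic stand-in features.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Columns stand in for fields like age, medication count and prior falls.
X = rng.normal(size=(200, 3))
y = (X @ np.array([0.8, 0.5, 1.2]) + rng.normal(size=200) > 0).astype(int)

model = LogisticRegression().fit(X, y)

new_patient = np.array([[1.2, 0.3, 2.0]])
print("Estimated fall risk:", round(model.predict_proba(new_patient)[0, 1], 2))
```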

With these advances, real challenges remain, including limitations raised by ethical, legal, procedural and even technological constraints. To meet these challenges successfully, the industry must build on a sound foundation of proven techniques and processes to ensure predictable results. However, the continued convergence and collaboration of biomedical sciences and technologies – along with increased demand for precision healthcare – will provide the impetus to meet these challenges and deliver real breakthroughs for better health.


In Asurion’s new member blog post, Senior Vice President of Retail Application Delivery and Voice Services Sean Nass discusses embracing new product development models rooted in collaboration and focused on outcomes. A global leader in connected life services, Asurion partners with leading wireless carriers, retailers and pay-TV providers to provide consumers with protection and premium tech help for mobile phones, consumer electronics and home appliances.

There is a big shift in how technology companies are going about business.

Many are pivoting from a process- and project-based strategy to one that is more forward-thinking. Team members are crossing boundaries, blurring the lines of previous structures and coming together in the spirit of collaboration to deliver outcomes that exceed customer expectations.

In the old model of business, the technology focus was on delivering large projects using project managers and a pool of resources, defining and limiting capacity. Instead of focusing on an outcome, teams would get together and create big requirement documents whose minute details would bog down capacity, forcing a project through months of work and still frequently producing a result that differed from what was initially envisioned. Opportunities were often lost under this old model of product development.

Now businesses are pivoting, with a more forward-thinking attitude in mind. At Asurion, we recently built what we call journey teams – groups of individuals from across key sectors of our business who come together to optimize the speed with which a project achieves its desired outcome and experience for consumers. As part of this shift, we merged product, design, technology and customer experience teams to optimize the process with the project’s outcome in mind rather than focus on the process itself. The days of separate “product” and “IT” silos are behind us. We’ve combined product, design and technology teams and have empowered them to ask the question “How do we focus on what’s best for the consumer experience?”

Take the claim process as an example. Under previous models, a customer’s claim would pass through various workflows, often with redundant or unnecessary steps that may not have been a great experience for the customer. Under our journey team model, we dedicated product, technology and design leads to focus on an outcome that equates to a positive customer experience. This mentality leads to faster time to market and less waste in resource capacity, and allows our team members the ability to innovate in a rapid fashion. More importantly, the customer has a really positive experience.

We don’t tell our journey teams what to do or how to do it – instead, they innovate and test ideas and are empowered to make decisions on their own, all with this singular goal of improving consumer experience. The journey teams put together a vision based on a desired outcome, a vision that nails down what is going to work and what isn’t to drive improvements in speed, reliability and efficiency of a product’s delivery.

The shift has opened up new channels of communication and new ways of interacting across teams, even to the point of how we collocate in our workspace. We have seen a radical change in the quality of our intercommunications because people are developing prototypes, conducting tests and not working off huge requirement documents.

Our goal is to create a seamless integration of product, technology and design that optimizes the experience for our customers.

It hasn’t all been easy, but progress doesn’t occur without change. We certainly can’t transform all teams into this model at once. However, with patience and by modeling teams’ successes, we are seeing increases in quality and speed, and the enthusiasm of the teams is amazing. They are so engaged because they see a direct correlation between their work and how it dramatically improves a customer interaction. There’s more alignment among product, technology and client services than we’ve ever seen before. If you are thinking of trying something similar in your offices, I recommend forming a shared goal and shared alignment across all teams, and placing the focus on future growth. Your efficiency and your product development and delivery will improve, and that’s what everyone is looking for, after all.