Although I’ve been a huge fan of PlanningPoker.com since 2011, my Scrum product team had more than five members, and the free membership allows up to 5 users. The team I was working with had just started their agile transformation and was trying out the aspects of Agile / Scrum they wanted to adopt. They weren’t about to invest in Planning Poker for estimation quite yet, so I went looking and stumbled across an estimation tool offered as a free add-on to Azure DevOps.

Microsoft’s Azure DevOps solution is both a code and requirements repository in one. Requirements are managed from an Agile perspective, through a Product Backlog of user stories. The user story backlog item type contains a field called “Story Points”, sometimes configured as “Effort”.

Ground Rules – 50k Overview

All team members select from a predetermined relative effort scale, such as tee-shirt sizes (XS, S, M, L, XL) or the Fibonacci sequence (0, 1/2, 1, 2, 3, 5, 8, 13, 21, 34…). All team members’ selections stay hidden until the facilitator decides to expose/flip them all at once. Flipping at once helps remove natural biases, such as selecting the same value as the team tech lead. After that, a team discussion normalizes the values into an agreed selection, such as the average.
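The flow above can be sketched in a few lines of Python. The scale, the vote values, and the snap-back-to-scale normalization are illustrative assumptions here, not anything Planning Poker prescribes; teams often negotiate to a value rather than compute one.

```python
from statistics import mean

# Relative effort scale (Fibonacci, as in the ground rules above)
FIB_SCALE = [0, 0.5, 1, 2, 3, 5, 8, 13, 21, 34]

def nearest_scale_value(value, scale=FIB_SCALE):
    """Snap an average back onto the relative-effort scale."""
    return min(scale, key=lambda s: abs(s - value))

def reveal_and_normalize(hidden_votes):
    """All votes stay hidden until the facilitator flips them at once;
    the team then normalizes toward an agreed value (here, the average)."""
    avg = mean(hidden_votes.values())
    return nearest_scale_value(avg)

votes = {"dev_a": 3, "dev_b": 5, "tester": 8}
print(reveal_and_normalize(votes))  # average 5.33 snaps to 5
```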

Integration with Azure DevOps

The interesting thing about this estimation tool is that you can select stories right from the backlog to run the effort estimation process, and once the team agrees upon a value, it can be committed back to the user story in the backlog. No jumping between user stories, updating and saving field values; everything is performed from the effort estimation tool.
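Under the hood, committing an agreed value comes down to updating one field on a work item. Here is a minimal stdlib-only sketch of doing that directly against the Azure DevOps REST API; the field reference name `Microsoft.VSTS.Scheduling.StoryPoints` and the `api-version` shown are typical for the default Agile process, but verify both against your organization’s process configuration.

```python
import base64
import json
import urllib.request

def story_points_patch(points):
    """JSON Patch document that sets the Story Points field on a work item."""
    return [{
        "op": "add",
        "path": "/fields/Microsoft.VSTS.Scheduling.StoryPoints",
        "value": points,
    }]

def commit_story_points(org, project, work_item_id, points, pat):
    """PATCH the team's agreed estimate onto the work item.

    `pat` is an Azure DevOps Personal Access Token, sent via basic auth
    with an empty username, as the service expects.
    """
    url = (f"https://dev.azure.com/{org}/{project}/_apis/wit/"
           f"workitems/{work_item_id}?api-version=6.0")
    token = base64.b64encode(f":{pat}".encode()).decode()
    req = urllib.request.Request(
        url,
        data=json.dumps(story_points_patch(points)).encode(),
        method="PATCH",
        headers={
            "Content-Type": "application/json-patch+json",
            "Authorization": f"Basic {token}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```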

Team Effort Estimations Are Critical to Accurate Velocity, Maximum Productivity, and Team Building.

The team tech lead may provide an effort estimation with little or no input from the developers and/or testers doing the work.

If the tech lead vocalizes his/her effort estimation…

BEFORE the developer who will be doing the work, the developer may feel pressured to agree with the tech lead’s estimate.

lower than the guesstimate of the developer who will be doing the work, this might create social friction and an inaccurate velocity.

WITHOUT a collaborative approach, a comprehensive estimation may be ruled out, such as consideration not only for development and testing, but for infrastructure (configuration management, i.e. build & deploy) and other effort costs.

Using tools like Planning Poker, where all estimations are revealed at once, helps teammates avoid the pressure of contradicting one another. The negotiation process occurs after all teammates flip their cards at once, deriving better estimates from many perspectives rather than from a single tech lead providing the estimation.

Transparency and Scrutiny

Many “hands-on” project/product stakeholders want maximum transparency into the current state of the product regardless of the duration of the sprint (e.g. 2-week sprints). Typically, a pulse on the product at two-week increments satisfies most.

Some agile change-management tools, such as Microsoft Azure DevOps, offer dynamic graphing and reporting. Product stakeholders may be provided dynamic dashboards that include burn-down and burn-up charts based on the sum of effort from user stories (i.e. product backlog items). At any given time, the charts can project velocity and, based upon the outstanding total effort estimation, chart a course to the next release.
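The arithmetic behind those charts is simple, which is worth seeing because it makes plain how fragile the projection is. This sketch assumes story points per day and a fixed velocity, both simplifications; real dashboards derive velocity from completed sprints.

```python
def burn_down(total_points, completed_per_day):
    """Remaining story points after each day: the series a burn-down chart plots."""
    remaining, series = total_points, []
    for points_done in completed_per_day:
        remaining -= points_done
        series.append(remaining)
    return series

def days_to_release(outstanding_points, velocity_per_sprint, days_per_sprint=10):
    """Naive projection of working days to the next release, assuming velocity holds."""
    return outstanding_points / velocity_per_sprint * days_per_sprint

# A 20-point sprint burning down over three days
print(burn_down(20, [3, 5, 2]))        # [17, 12, 10]
print(days_to_release(40, 20))         # 20.0 working days
```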

Meaningful burn-up and burn-down charts rely not just on accurate effort estimations, but on the people assigned these user stories consistently updating their status, e.g. New, In Progress, In Review, Done. Countless times I’ve seen team members update a user story’s status the day before the sprint close/demo, straight from New to Done. This habit gives product stakeholders a false view of the product within a sprint.

Another challenge and opportunity with transparency and scrutiny within a given sprint is making sure each user story has one or more child tasks. Defining tasks provides a wealth of opportunity, such as naming all of the tasks to complete for the story, e.g. database tasks, UI tasks, etc. If the tasks are itemized, they may also be assigned to multiple team members, showing a delineation of labor.

Sticking with the Azure DevOps tool, tasks have a default field, “Remaining Work”. This field may express task work in hours or days as the unit of measure. In the beginning, tasks are populated with the total guesstimate of hours. Each day, the person assigned the task may draw it down to incrementally show progress within the task and its correlating story.
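The draw-down itself is just a running subtraction, floored at zero; a small sketch makes the daily discipline concrete. The hour values are illustrative.

```python
def remaining_work(initial_hours, daily_draw_downs):
    """Track a task's Remaining Work after each daily update (floored at zero)."""
    remaining, history = initial_hours, []
    for hours_burned in daily_draw_downs:
        remaining = max(0, remaining - hours_burned)
        history.append(remaining)
    return history

# A 16-hour task drawn down over three days
print(remaining_work(16, [4, 6, 8]))  # [12, 6, 0]
```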

The task’s Remaining Work field must be relentlessly updated across the backlog in play, or it will create more harm than good. Without that discipline, tasks become amorphous at this level of scrutiny, and it will be challenging to garner any projected value.

The Abominable Blocker

What, you can’t figure it out on your own?

The dreaded blocker has the ability to stop a Scrum team in its tracks. The term Impediment, used synonymously with Blocker, has a more innocuous-sounding sentiment. Your Scrum team may use either; perhaps a less severe issue merits an Impediment?

The Kanban / Scrum board may have a column in the workflow called Blocker, which should fixate your team on helping to remediate that blocker. Our 15-minute Daily Scrum may focus on blockers, as they have been isolated in our workflows.

Conquer the blocker before it conquers you!

Applause, Applause

Sprint close and demo should be followed by healthy applause from the team, including stakeholders and the Product Owner: positive reinforcement of a job well done. “We’ve completed what we committed to complete” should be followed by applause. We should take a moment to soak in the feedback.

Pass the Mic

For those of us on the Scrum team who are introverts and actively look for ways to dodge opportunities to speak, this one is for you. During Daily Scrum, pass the facilitation mic around so everyone gets an opportunity to facilitate a stand-up.

Allow everyone within the team an opportunity to demo the “Done” user stories at sprint close. It’s not just to break folks out of their shells; it’s to impart a sense of pride in the work accomplished and truly reinforce the one-team mentality.

Disclosure: the opinions provided are my own and do not reflect those of my clients or anyone I represent.

Business or Personal?

Why not both? There are use cases which highlight the value of a Digital Assistant answering your phone calls when you’re unavailable.

Trusted Friends and Business Pins

Level of available services may change based upon the level of trusted access, such as:

Friends Seeking Your Availability for a Hockey Game Next Week

Business Partners Sharing Information access such as invoices

Untrusted Caller Access

The Vetting of Unsolicited Calls, such as robocalls

Defining and Default Dialogs

Users can define dialogs through drag-and-drop workflow diagram tools, making it easy to “build” conversation/dialog flows. In addition, out-of-the-box flows can provide administrators with opportunities to discover the ways in which an AI digital assistant may be leveraged.

Canned / default dialog templates that handle the most common dialogs / workflows will empower users to implement rapidly.

Any Acquisitions in the Pipeline?

Are the big names in the digital assistant space looking to partner with or acquire tools that can easily transform workflows to be leveraged by digital assistants?

IBM’s Conversations – chatbot dialog definition tool

Interactive Voice Response (IVR) solutions

APIs available on Mobile OS SDKs?

Are the components available now for third-party product companies to extend mobile OS capabilities? Or are the mobile OS companies the only ones in a position to perform these upgrades?

Over the last several months I’ve been researching Quantum Computing (QC) and trying to determine how far we’ve come from the theoretical to the practical implementation. It seems we are in the early commercial prototypical phase.

Practical Application of QC

The most discussed application of Quantum Computing has been to crack encryption. Encrypted data that may take months or years to decipher given our current supercomputing capabilities, may take hours or minutes when the full potential of Quantum Computing has been realized.

Bitcoin and Ethereum Go Boom

One source paraphrased: once quantum computing is actualized, encryption will progress in lockstep, and a new cryptology paradigm will be implemented to secure our data. This kind of optimism has no place in the “real world”, and most certainly not in the world financial markets. Are there hedge funds which rightfully hedge against the cryptocurrency / QC risk paradigm?

Where is the Skepticism?

Is anyone researching the next steps in the evolution of cryptography/encryption, hedging against the risk that marketplace encryption won’t be ready? The lack of fervor in the development of “quantum-computing-ready” encryption has me speechless. Government organizations like DARPA / SBIR should already be at a conceptual level, if not at the prototype phase, with next-generation cryptology.

Too Many Secrets

“Sneakers”, a classic fictional action movie with a fantastic cast, has a plot in which a mathematician secretly develops the ultimate code-breaking device, and everyone is out to possess it. An excellent movie soon to be non-fictional?

The years seem to have flown by, and it’s that time again to complete my Continuing Certification Requirements for my PMP cert.

I randomly searched the web for PMP courses, then found myself back at PMI.org “Searching Activities”. It seems like the easiest way to look up activities because they define the activities and the correlated list of Professional Development Units (PDUs), categorized by:

Technical

Leadership

Strategic & Business

Based on the activities I’ve already completed, the majority of my work has been accomplished in the Technical category. I need to focus on attaining PDUs in the Leadership and Strategic & Business categories.

PMP 2019 Continuing Certification Requirements

Here are a few activities I thought were interesting; I took each of these online or digital media courses. Pluralsight provides an excellent set of courses at a relatively low price, and I highly recommend it for your learning needs. I also took a few of the LinkedIn courses and found it to be an excellent learning platform with a wide array of courses that can be applied as PDU credits.

If you have doubts about choosing which methodology to use, this course will give you a comparison of Kanban and Scrum, making your choice easier. By watching this course, you will learn how to take the best of both Scrum and Kanban and make a winning combination for your team and project.

Crisis communication is one of the most challenging communication types an organization or individual can face, bringing together emotional vulnerability, ethical challenges, and high-stakes decisions amplified by informational and persuasive goals. When managed well, this communication can neutralize and calm an evolving crisis. When managed poorly, though, crisis communication makes a situation worse. This course takes viewers through the most important parts of preparing for crisis communication, including understanding crisis types and strategies, preparing foundational documents, and how to create communication in the moment. By the end of the course, viewers will have a concrete understanding of how to manage crisis communication for their own organizations, providing invaluable insight and immediate benefit.

Are you a Scrum Master ready to advance your craft? This course will teach you specific strategies for coaching each member of your team and show you how to build on your experience as a Scrum Master to advance your own career to the next level.

Did you know that one of the most common reasons Scrum Teams fail is the lack of a skilled Product Owner? If you’ve suddenly found yourself in this role, this course will teach you how you can use the role to help your team deliver a great product.

This course will provide an in-depth understanding of Agile adaptive planning and value-driven delivery practices, requirements definition practices, as well as principles and practices related to stakeholder management. This course is part of the PMI-ACP Agile Project Management series.

Design thinking is a user-centered way of solving problems. It involves extensive collaboration, using strategies such as mapping customer journeys, concept creation, and prototyping. This course teaches leaders how to help their teams adopt a design thinking mindset, and provides examples from author Turi McKinley’s work at frog, a global design and strategy firm that transforms businesses at scale by creating systems of brand, product, and service.

If you’re anti “Big Brother”, this may not be the article for you; in fact, skip it. 🙂

The Pendulum Swings Away from GDPR

In the not-so-distant future, “Data Bank” companies, consisting of Subject Matter Experts (SMEs) across all verticals, may process data feeds collected from your purchase and user behavior profiles. Consumers will be encouraged to submit their data profiles to a Data Bank, which will offer incentives ranging from reduced insurance premiums to cash-back rewards.

Everything from activity trackers and home automation to vehicular automation data may be captured and aggregated. The data collected can then be sliced and diced to provide macro and micro views of the information. On the abstract, macro level, the information may allow for demographic, statistical correlations, which may contribute to corporate strategy. On a granular view, the data will give “data banks” the opportunity to sift through data to perform analysis and correlations that lead to actionable information.

Is it secure? Do you care if a hacker steals your weight loss information? It may not be an issue if collected Purchase and Use Behavior Profiles aggregate into a Blockchain general ledger. Data Curators and Aggregators work with SMEs to correlate the data into:

Canned, ‘intelligent’ reports targeted for a specific subject matter, or across silos of data types

‘Universes’ (i.e. Business Objects) of data that may be ‘mined’ by consumer-approved, ‘trusted’ third-party companies, e.g. your insurance company.

Actionable information, based on AI subject-matter rules engines and consumer rule transparency.

“Data Banks” may be required to report to customers who agreed to sell their data, showing examples of the specific rows of data sold on a “Data Market”.

Consumers may have the option of sharing their personal data with specific companies by proxy, through a ‘data bank’, granular to the data point collected. Sharing of Purchase and User Behavior Profiles may enable:

Targeted, affordable medicine that may redirect the doctor’s choice to an alternate; the MD would be contacted to validate the alternate.

The curated data collected may be harnessed by thousands of affinity groups to offer very discrete products and services. Correlated Purchase and User Behavior Profile information stretches beyond any consumer relationship experienced today.

At some point, health insurance companies may require you to wear a tracker to increase or slash premiums. Auto Insurance companies may offer discounts for access to car smart data to make sure suggested maintenance guidelines for service are met.

You may approve your “data bank” to give access to specific soliciting government agencies or private firms looking to analyze data for their studies. You may qualify based on the demographic, abstracted data points collected; incentives provided may be tax credits or paid studies.

Purchase and User Behavior Profiles: Adoption and Affordability

If ‘Data Banks’ are allowed to collect Internet of Things (IoT) device profiles but the devices themselves are cost-prohibitive, here are a few ways to increase their adoption:

[US] tax coupons that enable the buyer to save money at the time of purchase. For example, a 100 USD discount applied at the time of purchase of an activity tracker, with the stipulation that you may agree, at some point, to participate in a study.

Government subsidies: offsetting the cost of aggregating and archiving Purchase and Behavioral profiles through annual tax deductions. Today, tax incentives may allow you to purchase an IoT device if the cost is an itemized medical tax deduction, such as an activity tracker that monitors your heart rate, if your medical condition requires it.

Privacy and Data Protection Creates Data Markets

Initiatives such as the General Data Protection Regulation (GDPR) and other privacy initiatives, which seek to restrict access to your data to you as the “owner”, create, as a byproduct, opportunities for you to sell your data.

Blockchain: Purchase, and User Behavior Profiles

As your “vault”, “Data Banks” will collect and maintain your two primary datasets:

As a consumer of goods and services, a Purchase Profile is established and evolves over time. Online purchases are automatically collected, curated, appended with metadata, and stored in a data vault [Blockchain]. “Offline” purchases may, at some point, become hybrid [on/off]line purchases with advances in traditional monetary exchanges, and would follow the online transaction model.

User Behavior (UB) profiles, both on and offline, will be collected and stored for analytical purposes. A user behavior “session” is a use case of activity where YOU are the prime actor. Each session would create a single UB transaction, also stored in a “data vault”. UB use cases may not lead to any purchases.

Not all Purchase and User Behavior profiles are created equal. For example, one person’s profile may show a higher monthly spend than another’s. The consumer who purchases more may be entitled to more benefits.
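The tamper-evidence property the ledger idea relies on can be shown with a toy hash chain. This is a minimal sketch, not how a production blockchain works (no consensus, no distribution): each record embeds the hash of its predecessor, so altering any earlier purchase breaks verification.

```python
import hashlib
import json

def _entry_hash(body):
    """Stable SHA-256 over a JSON-serializable record."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def chain_profile(transactions):
    """Append purchase/behavior records to a hash chain; every entry
    embeds the previous entry's hash, making history tamper-evident."""
    chain, prev_hash = [], "0" * 64
    for tx in transactions:
        body = {"tx": tx, "prev": prev_hash}
        prev_hash = _entry_hash(body)
        chain.append({**body, "hash": prev_hash})
    return chain

def verify_chain(chain):
    """True only if every link's hash matches its contents and predecessor."""
    prev_hash = "0" * 64
    for entry in chain:
        body = {"tx": entry["tx"], "prev": entry["prev"]}
        if entry["prev"] != prev_hash or _entry_hash(body) != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True
```

Tampering with a recorded purchase (say, changing its amount) invalidates every later link, which is what would let a consumer or auditor detect a rewritten profile.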

To train AI to think like a dog, the researchers first needed data. They collected this in the form of videos and motion information captured from a single dog, a Malamute named Kelp. A total of 380 short videos were taken from a GoPro camera mounted to the dog’s head, along with movement data from sensors on its legs and body.

They captured a dog going about its daily life — walking, playing fetch, and going to the park.

Researchers analyzed Kelp’s behavior using deep learning, an AI technique that can be used to sift patterns from data, matching the motion data of Kelp’s limbs and the visual data from the GoPro with various doggy activities.

The resulting neural network trained on this information could predict what a dog would do in certain situations. If it saw someone throwing a ball, for example, it would know that the reaction of a dog would be to turn and chase it.

The predictive capacity of their AI system was very accurate, but only in short bursts. In other words, if the video shows a set of stairs, then you can guess the dog is going to climb them. But beyond that, life is simply too varied to predict.

Dogs “clearly demonstrate visual intelligence, recognizing food, obstacles, other humans, and animals,” so does a neural network trained to act like a dog show the same cleverness?

It turns out yes.

Researchers applied two tests to the neural network, asking it to identify different scenes (e.g., indoors, outdoors, on stairs, on a balcony) and “walkable surfaces” (which are exactly what they sound like: places you can walk). In both cases, the neural network was able to complete these tasks with decent accuracy using just the basic data it had of a dog’s movements and whereabouts.

Relational Database Solutions “In a Box”

Several of the relational database software vendors, such as IBM, Oracle, and Teradata have developed proprietary data warehouse software to be tightly coupled with server hardware to maximize performance. These solutions have been developed and refined as “on-prem” solutions for many years.

We’ve seen the rise of “Data Warehouse (DW) as a Service” from companies like Amazon, who sell Redshift services.

Amazon Redshift is a fast, fully managed data warehouse that makes it simple and cost-effective to analyze all your data using standard SQL and your existing Business Intelligence (BI) tools. It allows you to run complex analytic queries against petabytes of structured data, using sophisticated query optimization, columnar storage on high-performance local disks, and massively parallel query execution. Most results come back in seconds.

RDB Complex Software/Hardware Maintenance

In recent times, the traditional relational database software vendors have shifted gears to become service providers, offering maximum performance from a solution hosted by them, the vendor, in the cloud. On the positive side, the added complexity of configuring and tuning a blended software/hardware data warehouse shifts from the client’s team resources, such as Database Administrators (DBAs), Network Administrators, and Unix/Windows Server Admins, to the database software service provider. The complexity of tuning for scalability and other maintenance challenges shifts to the software vendor’s expertise, if that’s the abstraction you select. There is some ambiguity in the delineation of responsibilities within the RDBMS vendors’ cloud offerings.

Total Cost of Ownership

Quantifying the total cost of ownership of a solution may be a bit tricky, especially if you’re trying to compare the RDBMS hybrid software/hardware “on-prem” solution against the same or similar capabilities brought to the client via “Data Warehouse (DW) as a Service”.

“On-Prem”, RDB Client Hosted Solution

Several factors need to be considered when selecting ANY software and/or hardware to be hosted at the client site.

Infrastructure “when in Rome”

Organizations have a quantifiable cost related to hosting physical or virtual servers in the client’s data center, which may be boiled down to a number that includes things like HVAC or new rack space.

Resources used to maintain/monitor data center usage; there may be an abstracted/blended figure.

Database Administrators maintain and monitor RDB solutions.

Activities may range from RDB patches/upgrades to resizing/scaling the DB storage “containers”.

Application Database Admins/Developers may be required to maintain the data warehouse architecture to accommodate new requirements, e.g. creating aggregate tables for BI analysis.

Network Administrators

Firewalls, VPN

Port Scanning

Windows/Unix Server Administrators

Antivirus

OS Patches

Trying to correlate these costs in some type of “apples to apples” comparison with “Data Warehouse as a Service” may require accountants and technical folks to do extensive financial modeling. Vendors such as Oracle offer everything from fully managed services to the opposite end of the spectrum, “bare metal”, essentially “Infrastructure as a Service.” The Oracle Exadata solution can be a significant investment, depending on the investment in redundancy and scalability leveraging Oracle Real Application Clusters (RAC).

Support and Staffing Models for DW Cloud Vendors

In order for the traditional RDB software vendors to accommodate a “Data Warehouse as a Service” model, they may need to significantly increase staff across a variety of technical disciplines, as outlined above in the client “on-prem” model. A significant ramp-up of staff and the organizational challenges of developing and implementing a support model may have relational database vendors asking: should they leverage a top-tier consulting agency, such as Accenture or Deloitte, to define, implement, and refine a managed service? It’s certainly a tall order to go from software vendor to large-scale service provider. With global corporate footprints and positive track records implementing managed services of all types, it’s an attractive proposition for both the RDB vendor and the consulting agency that wins the bid.

Looking at the DW service billing models, the economics aren’t obvious. Any consulting agency that implements a DW managed service would be responsible for ensuring ROI both for the RDB vendor and their clients. It may be opaque to the end client leveraging the Data Warehouse as a Service, but certainly, the quality of service provided should be nothing less than if implemented by the RDB vendor itself. If the end game for the RDB vendor is for the consulting agency to implement and mature the service, then at some point bring it in-house, that could help keep costs down while the managed service matures.

Oracle Exadata

Here are URLs for reference to understand the capabilities that are realized through Oracle’s managed services.

I’ve been enamored with Bose products for well over a decade. However, we’ve seen quality brands enter the hi-fidelity audio market over that time. Beyond quality design in their classic audio products, can Bose Augmented Reality (Bose AR) be the market differentiator?

Bose: Using a Bose-AR-equipped wearable, a smartphone, and an app enabled with Bose AR, the new platform lets you hear what you see.

It sounds like Bose may come up with an initial design, sunglasses, but turn to third-party hardware manufacturers of all sorts to integrate Bose AR into other wearable products.

Bose Augmented Reality isn’t just about audio. The devices will use sensors to track head motions for gesture controls and work with GPS from a paired smartphone to track location. The company also aspires to combine visual information with the Bose AR platform.

Bose AR Use Cases

The Bose and NFL partnership could be leveraged to get these AR units into football players’ helmets. Audio cues from the on-field leader, the quarterback, could be dynamically replayed/relayed to the receiver at the appropriate time for the required action.

Audio directions to your gate when your GPS detects that you’ve arrived at the airport, or at any other destination from your calendar. Audio cues would be richer the more access you grant to calendars, to-do lists, etc.

Combine visual information with the Bose AR platform, too, so you could hear a translation of a sign you’re looking at.

Hear the history of a painting in a museum.

Time until it’s in consumers’ hands? TBD. Bose’s objective is to have the developer kit, including a pair of glasses, available later this year.

When I was on vacation in Athens, Greece, I created a post about Greek actors running tours in their ancient, native garb. Bose AR could be a complementary offering to that tour, which includes live, local Greek actors playing out scenes in ancient ruins. Record the scenes, and interact with them while walking through the Greek ruins in your Bose AR (Augmented Reality) glasses.

Takeaway

I’m a cheerleader for Bose, among several others in this space, but I question a Bose AR headset that produces high-fidelity sound. Most of the use cases listed should be able to “get along OK” with average-quality sound. Maybe high-definition AR games with a high level of realism might benefit from high-quality sound. However, their site reads as if Bose is positioning itself as a component to be integrated into other AR headsets, i.e. a “Bose-AR-equipped wearable”.