Tag Archives: Look

Nvidia Research today unveiled GauGAN, a generative adversarial AI system that lets you create lifelike landscape images that never existed.

The research and a demo of the new system were shown off today at the GPU Tech Conference (GTC) in San Jose, California.

GauGAN builds upon learnings from the Pix2Pix system introduced last year that can render virtual worlds, said Nvidia VP of applied deep learning research Bryan Catanzaro, but Pix2Pix can’t paint landscapes because doing so leaves artifacts in the resulting image.

GauGAN’s neural network is trained with one million open source Flickr images and imbued with an understanding of the relationship between more than 180 objects like snow, trees, water, flowers, bushes, hills, or mountains.


That understanding of how objects relate to each other means a tree next to water will show a reflection, or when the season changes and there’s snow on the ground, trees will be depicted without leaves.

Style transfer is also possible so an image can adopt a warm sunset glow or display the cooler lights of a city skyline.

The GauGAN app uses a segmentation map, which acts sort of like a coloring book: it describes where objects are but provides no detail.
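To make the idea concrete, a segmentation map can be thought of as a small grid of class labels. The sketch below (class IDs, colors, and grid size are all made up for illustration, not GauGAN's actual format) builds one and renders it with a palette:

```python
import numpy as np

# A segmentation map is just a 2D grid of class labels -- no texture or
# detail, like an uncolored coloring book. Class IDs here are illustrative.
SKY, WATER, TREE = 0, 1, 2

seg_map = np.zeros((4, 4), dtype=np.uint8)   # start with all sky
seg_map[2:, :] = WATER                        # bottom half is water
seg_map[1, 1] = TREE                          # a single tree "stroke"

# A palette maps each label to a display color so humans can read the map;
# the generator itself consumes the raw labels, not the colors.
palette = {SKY: (135, 206, 235), WATER: (0, 105, 148), TREE: (34, 139, 34)}
colored = np.array([[palette[c] for c in row] for row in seg_map])

print(colored.shape)  # (4, 4, 3)
```

The generator's job is then to hallucinate plausible texture for every labeled region, which is where the trained GAN comes in.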

A paper by Nvidia principal research scientist Ming-Yu Liu and others that details the creation of GauGAN and its spatially adaptive denormalization method for photo manipulation was released today. The paper was also accepted for oral presentation at the Computer Vision and Pattern Recognition (CVPR) 2019 conference to be held in June in Long Beach, California.

Also making its debut today is the Nvidia AI Playground, a website where people can tinker with a variety of trained neural nets like GauGAN that use powerful AI to distort visuals or create lifelike images.

GauGAN is the latest reality-bending AI system from Nvidia, creator of StyleGAN, a deepfake system that can generate lifelike images of people who never existed and that Nvidia open-sourced last month.

Responding to a question about Nvidia releasing another system that can make people question what’s real, Catanzaro said that, as a researcher, “this has been my dream” because the tech necessary to achieve such results represents progress — but on the other hand, it is tech that some people may misuse.

“Personally, I believe it’s a trust issue and not a tech issue,” Catanzaro said.

Catanzaro said he’s most excited about GauGAN being adopted by video game designers creating the landscape of virtual worlds and envisions applications of the technology for creatives who use storyboards to lay out their creations.

Pinterest is bringing full automation to Shop the Look, a feature that helps users buy products from companies that work with Pinterest, so you can, for example, buy a pair of jeans you see in a picture.

Previously, Shop the Look, which made its debut in 2017, was powered by artificial intelligence to find objects in Pinned photos that resemble products in stock from vendors, but included a human in the loop for curation. The automated Shop the Look will begin with the home decor category.

Altogether, Pinterest users have made more than 175 billion Pins. Full automation for Shop the Look has led to a 22.5x growth in Shop the Look coverage across Pins and products, and in early testing has led to a 7 percent lift in engagement with Shop the Look, a company spokesperson told VentureBeat in an email.

The process necessary to automate Shop the Look was explained in a Medium post by Pinterest engineer Kunlong Gu. It started with a dataset of 270,000 scene-product matching pairs and the Google Product Taxonomy (GPT).

“Initially we strictly followed GPT, but the model didn’t perform well in some coarse-grained categories (i.e. beddings, tables). We found that products in these coarse-grained categories are of vastly different shapes and functions,” Gu wrote. “For example, the category of ‘bedding’ include[s] bed canopies, bed sheets, pillows. We then manually cleaned up the existing datasets (fine-grained labeling) and improved the model significantly.”

Each scene is then processed using a recurrent convolutional neural network to parse each image into individual objects.
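Once objects are isolated, matching them against in-stock products typically reduces to nearest-neighbor search in an embedding space. A minimal sketch of that ranking step (the embedding dimension, product names, and random vectors are illustrative stand-ins, not Pinterest's actual pipeline):

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings: one detected object crop and a small product catalog.
rng = np.random.default_rng(0)
object_vec = rng.normal(size=64)
catalog = {f"product_{i}": rng.normal(size=64) for i in range(5)}

# Rank catalog items by visual similarity to the detected object; the top
# hits become the shoppable dots shown on the Pin.
ranked = sorted(catalog, key=lambda k: cosine(object_vec, catalog[k]), reverse=True)
print(ranked[0])
```

In production such a search runs over millions of product embeddings with an approximate nearest-neighbor index rather than a brute-force sort.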

Automation frees up Pinterest staff from the need to curate Shop the Look and will inform future computer vision and ecommerce efforts at the company.

“In the long term, the scene images are great resources to learn the relationship between objects, i.e. what objects complement each other or go well together in a certain style. We hope to leverage this rich data of object occurrence and build a sophisticated object graph for every object in the world, making Pinterest a personalized stylist for home, fashion and more,” Gu said.

Will Smith has shared the first official look at Bad Boys 3 – officially titled Bad Boys for Life. Smith and Martin Lawrence team up once again as narcotics officers in Miami for the first time since 2003’s Bad Boys II.

In the Bad Boys franchise, Smith and Lawrence star as Detectives Mike Lowrey and Marcus Burnett, respectively. In the first film, the two are forced to singlehandedly investigate a heist they believe is an inside job; in the sequel, they’re tasked with stopping a Cuban drug lord who is fueling an MDMA epidemic, while Mike conceals the fact that he’s dating Marcus’ sister – an undercover DEA agent named Syd (Gabrielle Union). Both films were directed by Michael Bay, and with Bad Boys for Life, Bay is passing the torch to directors Adil El Arbi and Bilall Fallah, who are also attached to Beverly Hills Cop 4.

Bad Boys for Life will follow the two leads as they reconnect after a falling out, in the midst of a hit put out on Mike and Marcus’ heads. Vanessa Hudgens, Alexander Ludwig, Charles Melton, DJ Khaled, and Joe Pantoliano also star.

In anticipation of Bad Boys for Life, Smith posted a Boomerang photo on his Instagram featuring Lawrence and himself standing side-by-side with badges draped around their necks in a rundown apartment. Smith is holding a handgun, and he captioned the photo with the text: “FIRST LOOK! Theeeeey’re BAAAAaaaack! 🙂 @badboys.” The song playing over the post is P. Diddy’s aptly titled “Bad Boy for Life.” Though the photo doesn’t give away any specifics regarding the plot, it’s the first official shot of Mike and Marcus back on the job on screen for the first time in sixteen years.

Aside from their own personal struggles, Mike and Marcus will also be contending with a rival police unit who reportedly attempts to one-up the Bad Boys duo. The team is made up of Hudgens, Ludwig, and Melton’s characters, though it’s unclear how big of a role they’ll play in the overall story, considering Mike and Marcus already have plenty on their plate given the personal vendetta they’re dealing with.

While it’s no doubt exciting for fans of the Bad Boys franchise to revisit these characters, it’ll be interesting to see how audiences respond to Mike and Marcus’ relationship so long after the first film’s release back in 1995. That said, as far as success goes, Bumblebee recently proved that trading in Bay as director for someone new isn’t necessarily bad luck for a franchise. Bumblebee became the most critically lauded film in the Transformers franchise, and considering neither Bad Boys nor Bad Boys II was exactly a hit with critics, changing the formula a bit for Bad Boys for Life could prove successful.

Source: Screen Rant


Although we’re a few weeks into 2019, many of us are probably still scrambling to figure out how to make our marketing efforts stand out and be effective as we enter a new year of doing business. Thankfully, we won’t be thrown many curveballs in the upcoming year. But while we can expect many of the marketing trends from years past to make an appearance this year, they will appear in ways that demand we better address the needs of our target customers.

In short, marketing is no longer a one-size-fits-all solution, and marketers will need to get more creative and personal with their approach in order to both attract and retain customers. These are a few of the ways we can expect that to manifest in 2019:

1. Customers will Expect Personalized Marketing

With data becoming more of a staple, customers are expecting you to know what they want more than ever before. And they expect you, the marketer, to deliver personalized marketing and solutions to meet those wants and needs. In plainer terms, the days of generic AdWords campaigns, social media posts and emails are over.

Take this for example: 52% of customers have said that they are likely to go to a competitor if they receive an email that is not personalized. That demonstrates that engagement, and therefore your ROI, can vary widely based on your ability to tailor your marketing efforts.

So how can marketers address this growing need for personalization? Businesses will need to focus on segmenting their audiences and make their outreach efforts more targeted than ever before. That means providing customers with relevant information based on their pain points and where they are in the sales funnel. While that can seem overwhelming for those of us dealing with a multitude of customer personas, the right CRMs and marketing automation platforms can help you do it easily.

2. Your Content and How You Distribute it Will Need to Get Personal Too

Stats show that content marketing brings in 6x more conversions than other forms of marketing. But, creating content just for the sake of it is no longer enough. A study by Demand Metric found that personalized content is 80% more effective than that which is not.

That means that in 2019, marketers should let data and insights inform their content marketing plan, just like all their other marketing efforts. If a company is trying to reach distinct segments, its marketing team should aim to create at least some content pieces uniquely tailored to the needs and pain points of that target customer group.

Your distribution efforts will need to get more personal too. Marketers shouldn’t just depend on customers stumbling upon content pieces while browsing their website. You can get creative with your brand awareness efforts by posting on relevant blogs, channels, etc., to get the word out there where your target audience will see it.

3. Privacy and Security will Be A Priority

GDPR took many by surprise last year, and it seems the change in how we do marketing is not over, since privacy and security laws are beginning to pop up in many countries all over the world. So, despite the need for personalization, consumers are very wary of who has their information and how it is used. Whether companies can meet that demand affects both trust and their ability to comply with laws and regulations, so it should be a priority in the new year.

You can check out our post on privacy and security trends for 2019 for more details.

4. The Need for Closed-Loop Marketing Solutions Will Increase

We talked to Brian Aldrich, the vice president of paid search at Logical Position, about his predictions for digital marketing trends on a recent Rethink Marketing Podcast, and closed-loop solutions were at the top of his list. Collecting data in 2019 is simply not enough; marketers need to prioritize seeing how every step a customer takes impacts the overall customer journey, and use this information to analyze which marketing efforts are falling flat during the sales cycle. Take a listen to our podcast to learn more.

5. Businesses Will Need to Listen and Quickly Respond on Social

With today’s tracking tools, and features such as hashtags, it’s easier than ever to track mentions of your brand or company. So, in the upcoming year, don’t just think of how you can use social media to push out content. You should also consider how you can use social media to track unique insights and better understand the sentiment of your target customers.

Aside from using the information you gather to inform your future marketing efforts, you should aim to use social media to engage with your customers in a timely manner. Make it a priority to quickly respond to any complaints or questions that need immediate attention. And take advantage of opportunities to reach a wider audience, such as commenting on an influencer’s social post relevant to your business in order to bring visibility to your brand.

Privacy and security are two hot topics for the upcoming year, especially for marketers who are planning to have a wider reach, whether on a national or global level. Last year, we saw GDPR come into effect, which impacted the way businesses market to audiences in the EU, and we can expect more changes ahead.

But, if we’re to be honest, GDPR did so much more than change how we market and conduct business. It also taught many of us that we need to look and plan ahead for upcoming privacy and security issues. That way, we can be prepared and have time to focus on the work ahead instead of putting out fires.

So, for those of us who want to be prepared, what can we expect to see in 2019? To start, GDPR will continue to be a trending topic and growing concern for marketers in the upcoming year. We are also likely to see similar regulations pop up worldwide, and as a result privacy and security will become a priority for many of our organizations.

While dealing with the changes ahead may seem like an overwhelming task, today we’re going to help you take out some of the guesswork so you can hit the ground running in the new year. Here are a few ways we can expect privacy and security to affect the way we do business in 2019.

GDPR Will Become Even More Relevant

Think the EU’s General Data Protection Regulation (better known as GDPR) is no longer relevant? Then think again! Many of us were caught off guard when GDPR went into effect on May 25, 2018, and 2019 may bring just as many changes. With the Data Protection Authorities (DPAs) in every EU member state getting fully staffed, businesses marketing in the EU can expect enforcement to pick up and the rules to become clearer and more defined in the upcoming year.

Get Ready for CCPA and Other Privacy Laws Around the U.S.

The California Consumer Privacy Act, which requires companies doing business in California to improve transparency in how they collect, use and share data, will go into effect January 2020. However, companies should be prepared to comply by June 1, 2019. Many of us who market to a US-based audience have been aware of and are preparing for CCPA, but this regulation may just be the beginning of how we can expect to market to customers around the United States.

Similar laws seem to be popping up around the United States, with Vermont expected to put a data vendor law in place and New Jersey considering legislation that will change data privacy as well. Although all these various privacy regulations may seem like a hassle to deal with, there may be a push to create a federal data bill that would preempt state bills and result in more general guidelines across the US.

Look Out For Privacy Laws to Pop Up Worldwide

Marketers can expect to see more regulations inspired by GDPR to be implemented in countries all over the world. Currently 12 countries are considering similar regulations to those of GDPR, including Brazil, Chile, Uruguay, Argentina, Japan, Thailand, India, Australia, China and South Korea. Although the proposed changes are very similar to GDPR, they have slight nuances, which means marketers who do business worldwide will have to be knowledgeable and careful about how they approach each audience.

All this Means An Increased Need for Security

On top of improving transparency around the way we collect and use data, companies have to guard against data breaches to keep our customers’ information secure.

In the upcoming year, it will become more crucial for vendors that use CRM data integration to not only help companies comply with regulations such as GDPR and the CCPA, but also equip businesses to deal with any security issues that may arise. For example, we’ve designed our platform to ensure that our customers’ data is fully protected. In addition, we make it easy to build compliance forms that help you ensure you’re meeting regulations.

Although complying with these various regulations regarding privacy and security may be difficult at first, we can only expect the outcomes to benefit our business. Improving transparency about the way we collect and use data, and making an effort to protect our customers overall, only helps to enhance their trust and loyalty in our business.

After all, our job as marketers is to ensure that customers feel like their best interest is always at the top of our mind, and reassure them that choosing us is better than doing business with our competitors. And if investing time and effort in privacy and security ensures that is the case, why not do it? Especially if it leads to more opportunities and closed deals in the future.

For decades, games have served as benchmarks for artificial intelligence (AI).

In 1996, IBM famously set loose Deep Blue on chess, and it became the first program to defeat a reigning world champion (Garry Kasparov) in a game under regular time controls; it went on to win a full match against Kasparov the following year. But things really kicked into gear in 2013 — the year Google subsidiary DeepMind demonstrated an AI system that could play Pong, Breakout, Space Invaders, Seaquest, Beamrider, Enduro, and Q*bert at superhuman levels. In March 2016, DeepMind’s AlphaGo won four of five games of Go against Lee Sedol, one of the highest-ranked players in the world. And in late 2017, a more general successor system (AlphaZero) handily defeated champions at chess, a Japanese variant of chess called shogi, and Go.

These advancements aren’t merely improving game design, according to folks like DeepMind cofounder Demis Hassabis. Rather, they’re informing the development of systems that might one day diagnose illnesses, predict complicated protein structures, and segment CT scans. “AlphaZero is a stepping stone for us all the way to general AI,” Hassabis told VentureBeat in a recent interview. “The reason we test ourselves and all these games is … that [they’re] a very convenient proving ground for us to develop our algorithms. … Ultimately, [we’re developing algorithms that can be] translate[ed] into the real world to work on really challenging problems … and help experts in those areas.”

With that in mind, and with 2019 fast approaching, we’ve taken a look back at some of 2018’s AI in games highlights. Here they are for your reading pleasure, in no particular order.

Montezuma’s Revenge

In Montezuma’s Revenge, a 1984 platformer from publisher Parker Brothers for the Atari 2600, Apple II, Commodore 64, and a host of other platforms, players assume the role of intrepid explorer Panama Joe as he spelunks across Aztec emperor Montezuma II’s labyrinthine temple. The stages, of which there are 99 across three levels, are filled with obstacles like laser gates, conveyor belts, ropes, ladders, disappearing floors, and fire pits — not to mention skulls, snakes, spiders, torches, and swords. The goal is to reach the Treasure Chamber and rack up points along the way by finding jewels, killing enemies, and revealing keys that open doors to hidden stages.

Montezuma’s Revenge has a reputation for being difficult (the first level alone consists of 24 rooms), but AI systems have long had a particularly tough go of it. DeepMind’s groundbreaking deep Q-learning network in 2015 — one that surpassed human experts on Breakout, Enduro, and Pong — scored 0 percent of the average human score of 4,700 in Montezuma’s Revenge.

Researchers peg the blame on the game’s sparse rewards. Completing a stage requires learning complex tasks with infrequent feedback. As a result, even the best-trained AI agents tend to maximize rewards in the short term rather than work toward a big-picture goal — for example, hitting an enemy repeatedly instead of climbing a rope close to the exit. But some AI systems this year managed to avoid that trap.

DeepMind

In a paper published on the preprint server arXiv.org in May (“Playing hard exploration games by watching YouTube”), DeepMind described a machine learning model that could, in effect, learn to master Montezuma’s Revenge from YouTube videos. After “watching” clips of expert players and by using a method that embedded game state observations into a common embedding space, it completed the first level with a score of 41,000.

In a second paper published online the same month (“Observe and Look Further: Achieving Consistent Performance on Atari“), DeepMind scientists proposed improvements to the aforementioned Deep-Q model that increased its stability and capability. Most importantly, they enabled the algorithm to account for reward signals of “varying densities and scales,” extending its agents’ effective planning horizon. Additionally, they used human demonstrations to augment agents’ exploration process.
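One way the paper handles reward signals of varying densities and scales is to learn on a squashed version of the reward using an invertible transform. A sketch of that transform (the epsilon constant follows the paper's convention; treat exact values as assumptions):

```python
import math

EPS = 1e-2  # small linear term that keeps the transform invertible

def h(z):
    """Compress large-magnitude rewards while preserving sign and ordering."""
    return math.copysign(math.sqrt(abs(z) + 1) - 1, z) + EPS * z

# Small rewards pass through nearly unchanged; huge ones are tamed, so a
# single network can plan across games with wildly different score scales.
print(h(3))     # ≈ 1.03
print(h(1000))  # ≈ 40.6, instead of a raw 1000
```

Because the transform is monotonic and invertible, values can be mapped back to the original reward scale when the agent needs true returns.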

In the end, it achieved a score of 38,000 on the game’s first level.

OpenAI

Above: An agent controlling the player character.

Image Credit: OpenAI

In June, OpenAI — a nonprofit, San Francisco-based AI research company backed by Elon Musk, Reid Hoffman, and Peter Thiel — shared in a blog post a method for training a Montezuma’s Revenge-beating AI system. Novelly, it tapped human demonstrations to “restart” agents: AI player characters began near the end of the game and moved backward through human players’ trajectories on every restart. This exposed them to parts of the game which humans had already cleared, and helped them to achieve a score of 74,500.
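The restart scheme can be pictured as a start pointer that walks backward along a recorded human trajectory as the agent succeeds. Everything below is an illustrative stand-in for that idea, not OpenAI's code:

```python
# A recorded human playthrough, abstracted as a list of restorable states.
demo = list(range(100))

start = len(demo) - 5  # begin close to the end of the demonstration

def episode_succeeded():
    """Stand-in for 'the agent reached the end from `start`'."""
    return True

# Each time the agent reliably finishes from the current start point,
# move the start earlier so it must master a longer stretch of the game.
for _ in range(10):
    if episode_succeeded():
        start = max(0, start - 10)

print(start)  # 0: the start point has walked all the way back to the beginning
```

The payoff is that the agent always trains within reach of reward, sidestepping the sparse-reward problem that defeats naive exploration.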

In August, building on its previous work, OpenAI described in a paper (“Large-Scale Study of Curiosity-Driven Learning“) a model that could best most human players. The top-performing version found 22 of the 24 rooms in the first level, and occasionally discovered all 24.

What set it apart was a reinforcement learning technique called Random Network Distillation (RND), which used a bonus reward that incentivized agents to explore areas of the game map they normally wouldn’t have. RND also addressed another common issue in reinforcement learning schemes — the so-called noisy TV problem — in which an AI agent becomes stuck looking for patterns in random data.

“Curiosity drives the agent to discover new rooms and find ways of increasing the in-game score, and this extrinsic reward drives it to revisit those rooms later in the training,” OpenAI explained in a blog post. “Curiosity gives us an easier way to teach agents to interact with any environment, rather than via an extensively engineered task-specific reward function that we hope corresponds to solving a task.”
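The core of RND is simple: a fixed, randomly initialized target network and a predictor trained to imitate it; the prediction error is the exploration bonus, large on novel states and shrinking on familiar ones. A toy sketch (the linear "networks," dimensions, and learning rate are simplifying assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
OBS, FEAT = 8, 4

# Fixed, randomly initialized target network -- never trained.
W_target = rng.normal(size=(OBS, FEAT))
# Predictor network, trained to imitate the target on visited states.
W_pred = np.zeros((OBS, FEAT))

def intrinsic_reward(obs):
    """Prediction error: high on novel states, low on familiar ones."""
    err = obs @ W_pred - obs @ W_target
    return float((err ** 2).mean())

def train_step(obs, lr=0.05):
    """One gradient step pulling the predictor toward the target."""
    global W_pred
    err = obs @ W_pred - obs @ W_target
    W_pred -= lr * (2 / FEAT) * np.outer(obs, err)

obs = rng.normal(size=OBS)          # one repeatedly visited state
before = intrinsic_reward(obs)
for _ in range(200):
    train_step(obs)
after = intrinsic_reward(obs)
print(after < before)  # the exploration bonus decays on familiar states
```

Because the target is random rather than learned from the environment, the bonus cannot be gamed by unpredictable noise the way a forward-dynamics bonus can, which is how RND sidesteps the noisy TV problem.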

On average, OpenAI’s agents scored 10,000 over nine runs with a best mean return of 14,500. A longer-running test yielded a run that hit 17,500.

Uber

OpenAI and DeepMind aren’t the only ones that managed to craft skilled Montezuma’s Revenge-playing AI this year. In a paper and accompanying blog post published in late November, researchers at San Francisco ride-sharing company Uber unveiled Go-Explore, a family of so-called quality diversity AI models capable of posting scores of over 2,000,000 and average scores over 400,000. In testing, the models were able to “reliably” solve the entire game up to level 159 and reach an average of 37 rooms.

To reach those sky-high numbers, the researchers implemented an innovative training method consisting of two parts: exploration and robustification. In the exploration phase, Go-Explore built an archive of different game states — cells — and the trajectories (and scores) that led to them. It chose a cell, returned to that cell, explored from it, and, for every cell it visited, swapped in the new trajectory if it was better (i.e., the score was higher).

This “exploration” stage conferred several advantages. Thanks to the aforementioned archive, Go-Explore was able to remember and return to “promising” areas for exploration. By first returning to cells (by loading the game state) before exploring from them, it avoided over-exploring easily reached places. And because Go-Explore was able to visit all reachable states, it was less susceptible to deceptive reward functions.
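The archive bookkeeping itself is tiny. A sketch of the best-trajectory-per-cell logic (cell names, scores, and actions are made up; Uber's real implementation also stores game states for restoring):

```python
# Map each visited cell to the best trajectory found so far.
archive = {}  # cell -> {"score": best score, "trajectory": actions to reach it}

def update_archive(cell, score, trajectory):
    """Swap in a new trajectory only if it beats the stored one."""
    best = archive.get(cell)
    if best is None or score > best["score"]:
        archive[cell] = {"score": score, "trajectory": list(trajectory)}

update_archive("room_1", 100, ["right", "jump"])
update_archive("room_1", 80, ["right", "right", "jump"])   # worse: ignored
update_archive("room_1", 400, ["jump", "rope"])            # better: replaces
print(archive["room_1"]["score"])  # 400
```

Exploration then repeatedly samples a cell from this archive, restores the game to it, and explores outward, so progress is never forgotten.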

The robustification step, meanwhile, acted as a shield against noise. If Go-Explore’s solutions were not robust to noise, it robustified them into a deep neural network with an imitation learning algorithm.

“Go-Explore’s max score is substantially higher than the human world record of 1,219,200, achieving even the strictest definition of ‘superhuman performance,’” the team said. “This shatters the state of the art on Montezuma’s Revenge both for traditional RL algorithms and imitation learning algorithms that were given the solution in the form of a human demonstration.”



This series of eight blogs is intended for those considering a transition to “intelligent financial planning and analysis” – essentially, modernizing solutions and systems to support a dynamic approach to analytics. Here, we’ll examine the key elements of an intelligent FP&A solution.

Traditional FP&A solutions are typically focused on one aspect of the management process – for example, setting a budget, collecting a forecast, or delivering results in the form of a report pack. Over the past 20 years, there have been concerted efforts to combine these processes into a single system. After all, what’s the point of a budget if you can’t report against it or collect a forecast to see if year-end goals will be met?

The trouble is that these systems often have a single view of the business, which is financially oriented and structured according to the organizational hierarchy. While this does have some value, it typically does not include the real-world view, which is far more complex.

To get around this, “satellite” systems are developed for analyses such as sales, market share, and even the impact of social media. These results are then brought together in the form of yet another system, such as a dashboard that is tenuously linked to the organization’s adopted strategy methodology. (Whatever happened to the Balanced Scorecard, Hoshin Planning …?) If all else fails, there is always Excel, which gives end users free rein to produce their own analyses, but at a price that usually forsakes data integrity.

For the record, this is not an integrated solution and does not meet the definition of iFP&A.

iFP&A solution components

The following components are essential in any modern iFP&A system:

Common platform: iFP&A solutions recognize that there is no single “off-the-shelf” application. Instead, they encompass a mixture of technologies built on a common platform that enables the sharing of data and metadata. This is a key point.

Model builder: “Models” are typically a collection of mathematically based variables/structures that reflect the way an organization operates. They can include “adding-up” models of the kind you typically find in a budget solution, or “driver-based” models, where entering a number for one variable (e.g., sales volume) generates a range of associated variables (revenue, cost of sales, delivery, etc.) through preset formulae. To represent the complete organization, multiple models will be required, typically including:

Past sales by product, customer, region, size, etc.

Financial resource allocation and funding

Long-range plan and associated targets

Detailed forecasts

Strategic initiatives and projects

These models are designed to answer specific questions, have different content and structures, and are used by different people at different times. They can include both structured and unstructured data, including text.
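As a toy illustration of a driver-based model, entering one driver (sales volume) can generate the dependent financial lines through preset formulae. All prices and rates below are invented for the example:

```python
# Illustrative driver-based model: one driver generates associated variables.
PRICE_PER_UNIT = 20.0
COST_RATE = 0.6          # cost of sales as a fraction of revenue
DELIVERY_PER_UNIT = 1.5

def drive(sales_volume):
    """Apply preset formulae to a single driver to produce the P&L lines."""
    revenue = sales_volume * PRICE_PER_UNIT
    cost_of_sales = revenue * COST_RATE
    delivery = sales_volume * DELIVERY_PER_UNIT
    return {
        "revenue": revenue,
        "cost_of_sales": cost_of_sales,
        "delivery": delivery,
        "margin": revenue - cost_of_sales - delivery,
    }

print(drive(1000)["revenue"])  # 20000.0
```

The appeal of this style is what-if analysis: changing the one driver instantly reprices every dependent line, which an “adding-up” budget model cannot do.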

Common data and metadata: An iFP&A solution will allow these models to be built from common metadata (business structures, variable definitions) and allow common data to be entered once and automatically shared between models. Any changes to common items are automatically reflected wherever they are used.

Data acquisition tools: All data within an iFP&A solution is common, but these tools allow external data and metadata to be accessed, summarized, and transformed into the established models.

Data analysis and optimization tools: These tools can be applied to any model or data set to find trends and correlations, and used to generate data without having to embed rules within the models.

Collaboration and workflow: These capabilities are role-specific, allowing users to see what actions they need to take and by when; automation triggers activities based on events and exceptions. Typically, users collaborate using tools such as comment functions.

Reporting: Users can “grab” data from any model and any time period and display it in any format, whether traditional report, dashboard, or email alert.

End-user analysis: These tools allow unfettered access for analysis, provided that the user has the right security clearance. These analyses can be saved, shared, and accessed by a variety of hardware devices.

Cloud enablement: Intelligent cloud-based solutions enable user access from any location and any device, security permitting. This frees users from having to be at a certain location or use specific devices.

In many ways, there is nothing new in these capabilities individually. What makes iFP&A different is that these capabilities are all tightly integrated without the need for users to know and learn different technologies.

Together, these components greatly speed up the collection and analysis of data and pave the way to continuous planning driven by exceptions and alerts.

Stay tuned for the next blog in this series, which examines emerging technologies for solutions that enable FP&A to transform business performance.

The trend of DevOps has really picked up speed. Today, we’ll answer the question “What is DevOps?” and take a look at how businesses are using DevOps methodology to stay ahead of the competition.

What is DevOps?

DevOps is a combination of the terms development and operations. DevOps is an extension of the agile development philosophy to promote faster and more efficient development and deployment of software.

Many businesses are adopting this approach because it helps them stay competitive by bringing new product features (or bug fixes) to market much faster. DevOps methodology advocates automation and monitoring across every step of software production — from integration and QA testing to deployment and infrastructure management. The result is shorter development cycles, more dependable software releases, and increased deployment frequency.

Agile, DevOps & Continuous Delivery

Agile software development follows a philosophy combining the ideas of collaboration, adaptive planning and continual improvement to rapidly respond to feedback and evolving requirements. Seeing success using the Agile approach, organizations wanted to release the software faster and more frequently, giving birth to continuous delivery and the DevOps culture.

DevOps and continuous delivery are often used interchangeably because of their shared goal to speed up software deployment, but there is a subtle difference between the two. Continuous delivery is focused on automating software delivery processes. DevOps takes this one step further to also break down organizational silos for greater collaboration between the many functional areas that have a hand in this process.

DevOps and the Mainframe

Many business-critical applications rely on mainframes. If you want to efficiently deploy and maintain those applications, you need to make mainframes part of your DevOps workflow.

In our recent interview with Trevor Eddolls, he asserted that DevOps offers the greatest opportunity for organizations to get the most out of their mainframe investment. “For organizations that haven’t looked at DevOps and Agile computing, or are unaware of the fact that RESTful protocols work with IMS and CICS, this kind of modernization will bring them the greatest advantages in terms of growth and improved service.”

Organizations that do DevOps most effectively understand that, while technologies like Docker are one important component, a DevOps-optimized workflow involves all parts of an organization’s infrastructure.

Breaking down the mainframe silo

If you have a mainframe, there’s a good chance that your mainframe is one of the biggest silos inside your organization. That’s because, by default, mainframe data is very disconnected from the rest of your infrastructure. It exists in formats that are difficult to convert and use with modern analytics tools, takes a long time to offload, and is expensive to store for long periods.

Fortunately, things don’t have to be this way. You can de-silo your mainframe data by bridging the gap between your mainframe and the rest of your infrastructure.

How DevOps has changed mainframe careers

The DevOps revolution is meaningfully changing the job descriptions of the mainframe experts that organizations hire. The idea that constant collaboration should be a central feature of IT workflows, and that team members should be prepared to coordinate with one another as much as possible, is now deeply embedded in the way most businesses organize their approach to software delivery.

Mainframe programmers are now required to branch out from strictly programming roles to also participate in administrative tasks. It also means that programmers and system administrators who specialize in mainframes need to collaborate more closely with the rest of the IT organization.

This greater collaboration requires a working understanding of other systems – as well as a preparedness to integrate information and resources quickly between mainframes and other types of environments. (In other words, you can’t pitch yourself as solely a mainframe engineer anymore.)

DevOps… and Big Data?

You’ll notice that the descriptions of DevOps and continuous delivery above didn’t mention data. And it’s true that, by most conventional definitions, DevOps is not closely linked to Big Data. But new ideas have recently emerged to bring the two together.

Integrating Big Data into DevOps

If the goal of DevOps is to make software production and delivery more efficient, then including data specialists within the continuous delivery process can be a big win for organizations working to embrace DevOps. By integrating Big Data, organizations can achieve more effective software update planning, lower error rates, closer alignment between development and production environments and more accurate feedback from production.

Applying these principles to data management

DevOps focuses on software production, and at first glance might not seem to offer much to people working with data. But data specialists have much to learn from the movement.

Upon closer inspection, however, the data management process is similar to software production in that both involve multiple teams: one group to set up data storage, another to run the database, a third to work on analytics, plus a security team to keep the data safe and enforce compliance policies.

Traditionally, these different teams have not always collaborated closely. The folks who set up MySQL databases usually don’t know much about using Hadoop, for example. By embracing the core ideas of DevOps, however, organizations can achieve DataOps to make these different teams collaborate more effectively.

Has your organization adopted DevOps methodologies that integrate your mainframe? Have you applied these principles to your data management strategy? See how Syncsort Integrate products can help you break down the silos.

If you’re like most Dynamics CRM users in B2B sales, you lean on your CRM for just about everything: tracking leads and opportunities, consolidating collateral, contracts, and other documents, tracking events and tasks, storing contacts, etc.

Here are the steps and requirements we pay the most attention to in our sales process:

Sales reporting and analytics: if it’s not in a report, did it even happen…?

Flexible product and pricing configuration: an expectation, but we’re calling it out just in case.

Support: focus on your line of business, and choose a vendor that delivers CPQ from soup to nuts.

Scalability and value: ok, so that makes it seven things to look for: ensure flexible pricing (no annual licenses) and know the deeper meaning of value.

Why single sign-on matters

Yes, security. Yes, simplicity. But in our experience, it’s accessibility as much as anything else. The easier a sales tool is to find and use, the more likely your reps will use it. Single sign-on integration means your proposal tool is in the same “toolbox” (your CRM) as the rest of your sales tools.

Why templates matter

Repeatable processes, cleaner looking proposals, faster creation of proposals. Unless each one of your reps is both a gifted designer AND a world-class copywriter, try to have the majority of your sales quotes “templated.”

Why reporting — on quotes — matters

Sales is half selling, and half post-game reporting. But if your analytics extend into real-time activities around your quotes (this one needs a reply, that one’s moving along nicely), you can spot bottlenecks and ensure “post-game” is “post closed-won.”

Why product pricing configuration matters

We called out the benefit of flexibility (being able to quickly create and disseminate winning bundles, for example), but it’s also about consolidation. CPQ ensures your reps are all working from the same playbook, no matter where they are.

Why support matters

As with templates, if you have the expertise in-house (in this case, engineers versed in implementing and managing CPQ tools), use it! But as most salespeople have expertise in sales rather than software development, ensure your CPQ partner offers every bit of support you need.

Why scalability and value matter

Seasonal sales, rapid growth, mergers/acquisitions — the business cases for ensuring your business software is scalable are known to all. Re: value — the only thing to remember is that it’s not just a number. A good price is great, but a great partner is better.

Want to get started in automating your sales proposal process? Take a free tour.