
The difference between CX and DX and why they matter in ecommerce



This post was written by David Fletcher, a senior vice-president at ClearSale.

We talk a lot about CX (customer experience) and DX (digital transformation) in ecommerce, especially since the pandemic pushed retailers to focus more on digital channels and customers encountered some rocky experiences during the abrupt shift. Both CX and DX are crucial to merchant success, but digital transformation and customer experience aren’t synonymous. Here’s a look at the current state of CX and DX, why they matter so much right now, and how your store can tailor its DX to support better CX.

CX and DX, defined

Customer experience, or CX, is the way customers feel about a brand, based on all the interactions they’ve had with the brand’s marketing, salespeople, products and services, and support. For example, your social media posts, how your sales associates greet customers, how your products hold up over time, and how long customers have to wait for a response from your support team are all part of the customer experience.

CX has become much more important to brands over the past decade. Slightly more than a third of businesses said they competed on CX in 2010. Now, 79% of consumers say in a Salesforce report that the experience they have with a company matters as much as its products and services, which makes customer experience one of the most important competitive differentiators today.

Digital transformation, or DX, meanwhile, is the use of new technologies to update and optimize your business processes and the customer experience you provide. Many retailers had been taking a slow approach to DX before the pandemic and had to catch up quickly when lockdowns and public health concerns shut down or reduced in-person shopping. For example, many grocery stores added or scaled up online ordering for delivery or curbside pickup, while many ecommerce retailers added real-time delivery tracking tools to help customers avoid losing their purchases to the rising tide of “porch pirate” thefts.

There’s definitely a person-to-person element to great customer experience, like when an airline gate agent bumps you up to business class or a clothing boutique offers you personal shopping services. But as more of us shop online, the digital aspects of CX are overwhelmingly important. Salesforce surveyed thousands of customers in mid-2020 and found that 88% expect companies to speed up their CX because of the pandemic.

Key DX elements for better CX

Many of the digital upgrades that merchants adopted in 2020 were in direct response to changing customer needs—being able to shop online and get the items they purchased. But digital tools can also help with less urgent but still important elements of the customer experience, including these areas:

Personalization at all touchpoints, regardless of channel.

More than half (52%) of customers expect offers from brands to “always be personalized,” according to Salesforce data. This level of personalization requires collecting and analyzing customers’ data across all interactions to show them the products they’re interested in, when they want to see them—and to avoid recommendation missteps, such as showing them items they’ve already bought.
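To make the "avoid recommending what they already bought" point concrete, here is a minimal sketch in Python; the product IDs, scores, and data structures are hypothetical stand-ins rather than any particular personalization platform's API:

```python
# Minimal sketch: rank candidate recommendations for a customer while
# excluding items they have already purchased.

def recommend(candidate_scores: dict[str, float],
              purchase_history: set[str],
              top_n: int = 5) -> list[str]:
    """Return the top-scoring candidate SKUs the customer hasn't bought yet."""
    unseen = {sku: score for sku, score in candidate_scores.items()
              if sku not in purchase_history}
    return [sku for sku, _ in sorted(unseen.items(),
                                     key=lambda kv: kv[1],
                                     reverse=True)[:top_n]]

# The scores could come from any recommendation model built on cross-channel data.
scores = {"sku-101": 0.92, "sku-202": 0.88, "sku-303": 0.75}
already_bought = {"sku-101"}
print(recommend(scores, already_bought))  # ['sku-202', 'sku-303']
```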

Real-time information about stock status.

Stock-outs were a huge problem for much of 2020 as supply chains faltered while consumers stocked up first on essential goods and then seemed to shop in waves for items like wading pools, bicycles, and laptops. Now, 80% of customers say they’re already using or would like to use tools to pre-order items that aren’t currently in stock, per Salesforce. Showing customers correct stock levels and giving them ways to pre-order require unified stock data and pre-order options during checkout.

Easy-to-reach customer service.

Customers overwhelmingly expect to engage with someone right away when they reach out to a company for help. Slow or unhelpful responses can drive them away for good. The DX solution includes a single view of the customer that service representatives can see while they’re on a call or chat, so they don’t have to ask the customer to re-explain their problem or question. Another digital solution for better service is a customer service chatbot with natural language processing capabilities that can understand conversations and generate personalized responses.

Secure, low-friction payment options.

Digital technologies make it possible to streamline the checkout process for customers, so they don’t have to key in all their personal data for every order. For example, adding digital wallet capabilities to your website—either your own digital wallet or a third-party option like Apple Pay or PayPal—allows your shoppers to buy without having to take out their credit card and key in the data. That matters, since 44% of consumers have abandoned online purchases because the checkout process was too long or too complicated, according to a March 2020 Sapio Research survey conducted for ClearSale.

Accurate, low-friction customer authentication.

The Sapio survey also found that nearly 3 times as many customers would abandon a merchant for good after a false decline (39%) as after a fraud experience (13.6%) with that merchant. This matters for the many merchants who rely on automated fraud screening, because those systems often reject good orders that merely resemble attempted fraud in some way. The solution is digital: AI- and ML-driven automated order screening coupled with manual review of flagged orders. Reviews can reduce the number of false declines and feed their findings back into the algorithm, so it gets better at telling fraud from good orders and poses less risk of alienating good customers.
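As a loose sketch of that screen-plus-review feedback loop (not ClearSale's actual system; the features, thresholds, and labels below are invented for illustration), an automated score can route borderline orders to human reviewers, whose decisions then become new training labels:

```python
# Sketch: automated fraud scoring with a manual-review band, plus retraining
# on reviewer decisions. Features and thresholds are illustrative only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 4))                          # e.g., order value, velocity, mismatch flags
y_train = (X_train[:, 0] + X_train[:, 1] > 1.5).astype(int)  # toy fraud labels

model = LogisticRegression().fit(X_train, y_train)

def screen(order_features, low=0.2, high=0.8):
    """Approve, decline, or send an order to manual review based on fraud probability."""
    p_fraud = model.predict_proba(order_features.reshape(1, -1))[0, 1]
    if p_fraud < low:
        return "approve"
    if p_fraud > high:
        return "decline"
    return "manual_review"  # a human verdict here becomes a new training label

print(screen(rng.normal(size=4)))

# Reviewed orders are appended to the training set so the model keeps improving.
reviewed_X = rng.normal(size=(20, 4))
reviewed_y = rng.integers(0, 2, size=20)                     # stand-in for reviewer decisions
model = LogisticRegression().fit(np.vstack([X_train, reviewed_X]),
                                 np.concatenate([y_train, reviewed_y]))
```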

It’s time to transform for better CX

Adopting new technology now can help retailers catch up to competitors who were able to pivot faster during the pandemic and are now setting the standard for customer experience. Merchants that get up to speed with their digital transformation now will also be in a better position to handle whatever else comes along that requires brands to adjust while still delivering a great customer experience.

David Fletcher serves as Senior Vice President at ClearSale, a card-not-present fraud prevention operation that helps retailers increase sales and eliminate chargebacks before they happen. As a serial entrepreneur, he understands the particular pain points that affect business owners today, and how fraud management can provide real-world solutions to those problems. At ClearSale, he spearheads business development, sales, partnerships and alliances with top e-commerce organizations.



When AI meets BI: 5 red flags to watch for



If there is a Holy Grail to business success in 2021, it most certainly has something to do with the power of data. And if business intelligence wasn’t already the answer to C-suite prayers for insight and vision derived from massive amounts of data, then artificial intelligence would be.

Today, AI is widely seen as the enabler that BI has always needed to take it to next-level business value. But incorporating AI into your existing BI environment is not so simple.

And it can be precarious, too: AI can amplify an almost unnoticeable issue into a significantly larger — and negative — impact on downstream processes. If you can’t sing in tune but only play around with a small karaoke machine at home, it’s not a big problem. But imagine yourself in a huge stadium with a multi-megawatt PA system.

With that amplification potential firmly in mind, organizations hoping to integrate AI into their BI solutions must be keenly aware of the red flags that can scuttle even the best-intentioned projects.

Major pitfalls to watch out for when integrating AI into BI include the following:

1. Misalignment with (or absence of) business use case

This should be the easiest pitfall to recognize — and it is the most common one among current AI implementations. As tempting as it may be to layer AI into your BI solution simply because your peers or competitors are doing so, the consequences can be dire. If you spend millions of dollars to automate a piece of work done by a single employee earning $60,000 per year, a positive ROI will be hard to justify.

When seeking input from business leaders about the potential viability of AI-enabled BI, start the conversation with specific scenarios where AI’s scale and scope could address well-defined gaps and yield business value exceeding the estimated expense. If those gaps aren’t well understood or sufficient new value isn’t assured, it’s difficult to justify proceeding further.

2. Insufficient training data

Let’s assume that you have a feasible business case — what should you look out for next? Now you need to make sure that you have enough data to train the AI via a machine learning (ML) process. You may have tons of data, but is it enough for AI training? That depends on the specific use case. For example, when Thomson Reuters built its Text Research Collection in 2009 for news classification, clustering, and summarization, it required a huge amount of data — close to two million news articles.

If at this point you’re still wondering who can determine what the right training data is, and how much of it will be enough for the intended use case, then you’re facing your next red flag.

3. Missing AI teacher

Having an outstanding data scientist on staff does not guarantee that you already have an AI teacher. It’s one thing to code in R or Python and build sophisticated analytical solutions, and quite another to identify the right data for AI training, package it properly, continuously validate the output, and guide the AI along its learning pathway.

An AI teacher is not just a data scientist – it’s a data scientist with a lot of patience to go through the incremental machine learning process, with a thorough understanding of the business context and the problem you’re trying to solve, and an acute awareness of the risk of introducing bias via the teaching process.

AI teachers are a special breed, and AI teaching is increasingly considered to be at the intersection of artificial intelligence, neuroscience, and psychology — and they may be hard to find at the moment. But AI does need a teacher: Like a big service dog, such as a Rottweiler, with the proper training it can be your best friend and helper, but without one it could become dangerous, even for the owner.

If you are lucky enough to have an AI teacher, you still have a couple of other concerns to consider.

4. Immature master data

Master data (MD), the core data that underpins the successful operation of the business, is critically important not only for AI but for traditional BI as well. The more mature, or well-defined, the MD is, the better. And while a BI solution can compensate for MD’s immaturity via additional data engineering, the same cannot be done inside AI.

Of course, you can use AI to master your data, but that is a different use case, known as data preparation for BI and AI.

How can we tell so-called mature MD from immature MD? Consider the following (a minimal sketch of automating two of these checks follows the list):

  1. The level of certainty in deduplication of MD Entities — it should be close to 100%
  2. The level of relationship management:
    • Inside each MD entity class — for example, “Company_A-is-a-parent-of-Company_B”
    • Across MD entity classes — for example, “Company_A-supplies-Part_XYZ”
  3. The level of consistency of categorizations, classifications, and taxonomies. If the marketing department uses a product classification that is different from the one used in finance, then these two must be properly — and explicitly — mapped to one another.
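That sketch, in Python, assuming hypothetical record and mapping structures rather than any specific MDM product:

```python
# Sketch: estimate the duplicate rate among MD entities (check 1) and find
# marketing categories with no explicit counterpart in finance (check 3).

def duplicate_rate(records: list[dict], key_fields=("name", "country")) -> float:
    """Share of records whose normalized key fields collide with an earlier record."""
    seen, dupes = set(), 0
    for rec in records:
        key = tuple(str(rec.get(f, "")).strip().lower() for f in key_fields)
        dupes += key in seen
        seen.add(key)
    return dupes / max(len(records), 1)

def unmapped_categories(marketing_cats: set[str],
                        finance_cats: set[str],
                        mapping: dict[str, str]) -> set[str]:
    """Marketing categories that are not explicitly mapped to a finance category."""
    return {c for c in marketing_cats
            if c not in mapping or mapping[c] not in finance_cats}

companies = [{"name": "Acme Corp", "country": "US"},
             {"name": "acme corp ", "country": "us"}]   # near-duplicate entity
print(duplicate_rate(companies))                         # 0.5 -> far from mature
print(unmapped_categories({"Wearables"}, {"Consumer Electronics"}, {}))  # {'Wearables'}
```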

If you have mastered the above A-B-C of your MD and have successfully moved through the preceding three “red flag” checkpoints, then you can attempt relatively simple use cases of enhancing BI with AI — the ones that use structured data.

If unstructured data, such as free-form text or any information without a pre-defined data model, must be involved in your AI implementation, then watch out for red flag #5.

5. Absence of a well-developed knowledge graph

What is a knowledge graph? Imagine all your MD implemented in a machine-readable format with all the definitions, classes, instances, relationships, and classifications, all interconnected and queryable. That would be a basic knowledge graph. Formally speaking, a knowledge graph includes an information model (at the class level) along with corresponding instances, qualified relationships (defined at the class level and implemented at the instance level), logical constraints, and behavioral rules.
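As a toy illustration of "interconnected and queryable" (using the open source rdflib library and a made-up example.org namespace; the classes and relationships are illustrative, not a production ontology):

```python
# Tiny knowledge graph: classes, instances, a qualified relationship, and a query.
from rdflib import Graph, Namespace, RDF, RDFS, Literal

EX = Namespace("http://example.org/")   # hypothetical namespace
g = Graph()

# Class-level information model
g.add((EX.Company, RDF.type, RDFS.Class))
g.add((EX.Part, RDF.type, RDFS.Class))

# Instances plus a qualified relationship: "Company_A supplies Part_XYZ"
g.add((EX.Company_A, RDF.type, EX.Company))
g.add((EX.Part_XYZ, RDF.type, EX.Part))
g.add((EX.Company_A, EX.supplies, EX.Part_XYZ))
g.add((EX.Company_A, RDFS.label, Literal("Company A")))

# Query: which parts does Company_A supply?
results = g.query("""
    SELECT ?part WHERE {
        <http://example.org/Company_A> <http://example.org/supplies> ?part .
    }
""")
for row in results:
    print(row.part)   # http://example.org/Part_XYZ
```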

If a knowledge graph is implemented using Semantic Web standards, you can load it straight into the AI, significantly reducing the AI teaching process described earlier. Another great feature of a knowledge graph is that it is limitlessly extensible in terms of the information model, relationships, constraints, and so on. It is also easily mergeable with other knowledge graphs. While mature MD may be sufficient for AI implementations that use only structured data, a knowledge graph is a must for:

  • AI solutions processing unstructured data — where the AI uses the knowledge graph to analyze the unstructured data in much the same way as structured data;
  • AI storytelling solutions — where the analytical results are presented as a story or narrative, not just tables or charts, thereby shifting BI from an on-screen visualization supporting the discussion at the table to a participant in that discussion; a cognitive support service, if you will.

While these potential pitfalls seem daunting at first, they are certainly less worrisome than the alternative. And they serve as a reminder of the best practice common to all successful AI implementations: up-front preparation.

AI can change BI from an insights-invoking tool to a respected participant in the actual decision-making process. It is a qualitative change — and it may be a highly worthwhile investment, as long as you know what to watch out for.

Igor Ikonnikov is a Research & Advisory Director in the Data & Analytics practice at Info-Tech Research Group. Igor has extensive experience in strategy formation and execution in the information management domain, including master data management, data governance, knowledge management, enterprise content management, big data, and analytics.



How Intel is using Tiger Lake-H to help make gaming laptops people love | How Games Make Money



On Tuesday, Intel revealed its new Tiger Lake-H CPUs for gaming laptops. These chips include hyperthreaded 6- and 8-core processors that are ideal for high-end gaming performance in a thin and light notebook. And this was always the goal, according to Intel gaming-notebook product manager Joakim Algstam. I had a conversation with Algstam that you can listen to on the player below or on Apple Podcasts, Google Podcasts, or Spotify.

“It’s what we have been working toward since the beginning of the segment,” Algstam told GamesBeat. “As we balance platforms together with our OEM partners, we want to make sure that the CPU and the GPU can coexist inside these thin form factors with the best thermal solutions possible.”

To that end, Intel recently highlighted Alienware’s upcoming X17 gaming notebook. The X17 uses new thermal interface material and a quad-fan setup to avoid temperature throttling.

And extreme cooling is necessary, even as Tiger Lake-H improves efficiency. That’s because Intel and its partners continue to push the limits on how much hardware they can cram into slim devices.

Algstam points to examples where devices can now run high-end CPUs and GPUs in a sub-25-millimeter laptop with only a single power supply. A similar gaming notebook would have required two power supplies just a few years ago.

“I remember back in 2011, we had a project that focused on the thin enthusiast form factor, and we wanted to get below 20 millimeters, and at that time most laptops were muscle books and they were 50 millimeters thick and people looked at us like we were crazy,” said Algstam. “It was like a barrier that we wanted to reach and two years later [we announced] that we had a teraflop compute inside a 20 millimeter form factor and from there we were off to the races.”

Now, the sub-20mm enthusiast gaming laptop is Intel’s fastest-growing segment, at 45% growth year-over-year. That’s precisely because a powerful 20mm laptop fits into people’s lives so well.

“What we have found is that people want portability in their own house,” said Algstam. “It’s not so much about taking the laptop out of the house. It’s wanting to sit in the den and then on the couch in front of my TV with my laptop and these platforms allow you to bring that capability.”



The DeanBeat: The FOMO over the decline of triple-A games is unwarranted



“I must not fear.
Fear is the mind-killer.
Fear is the little-death that brings total obliteration.
I will face my fear.
I will permit it to pass over me and through me.
And when it has gone past I will turn the inner eye to see its path.
Where the fear has gone there will be nothing. Only I will remain.”

— Frank Herbert, the litany against fear in Dune.

We had another panic this week about the decline of triple-A video games, and it showed that we have a lot of fear of missing out as fans. But I think some of this fear is based on a misunderstanding about the industry’s unique status as both a business and an art form. Hardcore gamers like the art form, while business people want to get rich from it. They don’t always trust each other’s motivations.

Ubisoft’s chief financial officer Frederick Duguet set off the panic among hardcore gamers when he said in an earnings call that putting out three or four triple-A games is not “a proper indication of [Ubisoft’s] value-creation dynamics.” Instead, Ubisoft expects to generate more revenue from free-to-play live-service games, and so it had announced The Division: Heartland, a free-to-play shooter. Many fans took Duguet’s comments to mean that Ubisoft is going to make fewer triple-A games, so Ubisoft’s PR department had to intercede with a clarification the next day.

“Our intention is to deliver a diverse line-up of games that players will love – across all platforms. We are excited to be investing more in free-to-play experiences, however we want to clarify that this does not mean reducing our AAA offering,” a spokesperson said in a statement. “Our aim is to continue to deliver premium experiences to players such as Far Cry 6, Rainbow Six Quarantine, Riders Republic and Skull & Bones to name a few while also expanding our free-to-play portfolio and strengthening our brands to reach even more players.”

In other words, Ubisoft reassured fans that it’s not taking away your triple-A games. By extension, I will argue that all of the fads of the moment — nonfungible tokens (NFTs), blockchain, augmented reality, free-to-play mobile games, live services on FIFA Soccer, esports, user-generated content, remakes and retro games — are not taking away from your triple-A games. As Jeff Grubb pointed out, they’re additive. The game industry is expected to hit $175.8 billion in 2021, according to game and entertainment data firm Newzoo. As an industry, it is taking away time from sports, movies, music, TV, and other hobbies.


Above: Public offerings of game companies took off in Q1 2021.

Image Credit: InvestGame

The industry has enough money to go around. Everything in games is getting funded. Investors are pouring money into public offerings, acquisitions, and game startup investments. Even indie game makers are benefiting from this, and they continue to be the creative heartbeat of the industry, supplying the innovative games, like Hades, that triple-A game companies aren’t making. The first quarter saw $39 billion invested in the game industry across 280 announced transactions, according to InvestGame. That quarterly amount was higher than the $33 billion reported for all of 2020.

Will mobile games get more of the budgeted money? Yes. Mobile games are 51% of the market and are growing. PC and console games could actually shrink in 2021 because of delays shipping big games during the pandemic. That’s going to happen, as it’s easier to invest in mobile games and increasingly harder to invest in PC and console games, which are often delayed.

“That’s kind of the dirty little secret of the video game business: it is a business, after all, and we need to create an audience, we need to create a revenue stream, the cash flow, in order to continue to create new and exciting games for people to play,” Shawn Layden, former chairman of Sony Worldwide Studios, said at our recent GamesBeat Summit 2021 event.

You may not trust my answer here, but this is a good thing. The strategy that I see everybody pursuing right now makes perfect sense, and it will be good for all of games.

Why this is good news

First, mobile and free-to-play triple-A games are expanding the market. They are the tip of the spear when it comes to penetrating new markets and convincing people that games are a good use of their time. We’re at 3 billion gamers and growing, but not everybody on the planet is a gamer yet. By making the price of games more accessible, we enable games to reach more people. Those people will pick up the habit. They will find the new point of entry, and they will become gamers, hopefully for life. They will also keep playing these accessible and less time-consuming games even in periods of life when they’re busier, like when they have kids or have to study a lot or have to pour a lot of energy into work.

The key is that they are the point of entry into the vastness of games. Consider Call of Duty. Bobby Kotick, CEO of Activision Blizzard, had some foresight in getting three major game studios to make Call of Duty games in parallel, so that a new one could be launched every year without a sacrifice in the quality of the triple-A game. That wasn’t an easy process, and many accused Kotick of wrecking the franchise by making it too frequent. But the developers didn’t run into creative exhaustion. They converted players into wanting to play Call of Duty every year.

Now nine studios or so are working on Call of Duty. That allowed Activision Blizzard to add the free-to-play games Call of Duty: Mobile and Call of Duty: Warzone. These became the new points of entry for Call of Duty. Call of Duty also went cross-platform so you could play with friends wherever they were. You could start at the top of the funnel, playing for free. Within Warzone, all you had to do to upgrade to the $60 premium game was click a few buttons. Analyst Michael Pachter of Wedbush Securities estimates that Call of Duty premium game sales went up from around 25 million a year to 35 million a year. The result was record performance for Activision Blizzard in 2020. Now people play Call of Duty every year. And if you follow what Kotick said at our GamesBeat Summit 2021 event, increasing the share of players’ time each day by creating some kind of Call of Duty metaverse is probably the next goal.

Kotick said that the 10,000-person company now needs at least 2,000 more people to meet its production obligations. It’s making triple-A games like Diablo 4, but it is also making the free-to-play Diablo Immortal game for mobile. Do you see the pattern? Kotick is applying the Call of Duty strategy to Diablo. Mobile and free-to-play games are the onramps to the franchise, and you can expect to see Activision Blizzard execute the same strategy for every major franchise.

No fear


Above: Skull & Bones is coming one of these days from Ubisoft.

Image Credit: Ubisoft

The financial success of Call of Duty and Activision Blizzard isn’t lost on Electronic Arts, which is making a mobile game based on Battlefield. That will be the onramp for Battlefield VI, the triple-A game that is in production. EA has a mobile Apex Legends game that will be the onramp for the free-to-play Apex Legends, and maybe Respawn will fill out the roster with a triple-A Apex Legends (or maybe Titanfall) premium game.

With Ubisoft, the free-to-play The Division: Heartland can be an onramp to The Division or The Division 2. And so on. These efforts are not going to cannibalize each other, in my opinion. They are going to make it more likely that players will become hobbyists. The hobby will not just be games. It will be more specific than that. The hobby will become Call of Duty, or Diablo, or Apex Legends, or The Division. These franchises will command all of our time, and people will constantly cycle through them from the top of the funnel to the bottom.

On our GamesBeat Summit panel, Layden was more focused on Sony’s own specific challenges. But he was right that platform owners — and by extension the whole game industry — have the responsibility of expanding the market. The lower the price point, the lower the risk for the industry. Hollywood, by contrast, has been slow to lower the ticket prices of movies. In fact, it raised them just in time for the pandemic. It’s no surprise that streaming movie services took off during the pandemic, because they were cheaper. The price spectrum of games captures all the right players.

Hardcore gamers should also be aware that what they want to play isn’t what everyone wants to play. As the game industry expands out of its ghetto of 200 million or 300 million gamers, it will have to serve more diverse content than it ever has, to capture people like older players, international players in emerging markets and different cultures, and women. As it expands to mobile and free-to-play games, the industry should remember that it shouldn’t make just the same old franchises for the new players.

And as everybody becomes a gamer, the game market becomes bigger, the opportunity for each game is higher, and we will get better games of all kinds as a result — including better triple-A games.

The goose and the eggs


Above: Dean Takahashi moderates a new IP panel with Shawn Layden, Ante Odic, and Marty O’Donnell.

Image Credit: GamesBeat

Layden, who had to oversee 13 first-party game studios for the PlayStation business, said that churning out sequels and providing fan service on important franchises is a necessary part of the business. But eventually, everyone comes around to realize the importance of doing original games.

“If we continue to make the same type of game over and over again, we will continue to appeal to the same audience we already have over and over again. We won’t be able to break out gaming into a wider and larger business. We talk a lot about how the video game business is the largest entertainment business in the world. But we really don’t punch above our weight when it comes to society and culture. And I think that’s because we don’t bring a diverse enough audience into enjoying gaming. And that’s why original intellectual property is important.”

Layden knows that going to a board of directors and pitching them a game that will cost $280 million to make over five years isn’t easy. That is a difficult pitch for anybody to make, no matter who you are. But those kinds of bets have to be made.

“It’s definitely problematic that the budgets have skyrocketed,” said Nick Tuosto of LionTree and Griffin Gaming Partners at GamesBeat Summit 2021. “On the other end of the spectrum, that translates to defensibility in a market with 10 million developers working to try to build hits every month. There are precious few that can assemble the budgets, have the IP, have the distribution network, and the global brands to be able to compete in a market where people’s time is exceedingly scarce.”

A game like Grand Theft Auto V can sell 140 million units — a number that wasn’t possible more than a decade ago. So the upside is tremendous, and these franchises once established can give birth to live services, media spinoffs in adjacent entertainment markets, and high-margin mobile opportunities. The upside to that initial hundred million investment may be tens of billions in market capitalization for the companies that execute against the opportunity fully.


Above: Grand Theft Auto Online: Arena Wars.

Image Credit: Rockstar Games

Also on Layden’s panel was Ante Odic, senior vice president of product at Outfit7, the maker of the Talking Tom series and other games that have been downloaded 15 billion times. Even Outfit7 is investing to find the next Talking Tom as it knows that new IP is so critical. And just because it is investing in Talking Tom doesn’t mean that it isn’t investing in new IP. It’s not a zero-sum game.

“We have a wide audience,” Odic said. “But we want to go even wider.”

Marty O’Donnell, cofounder of Highwire Games and a former leader at Bungie, noted how the creative team wanted to move on from the successful Halo franchise to something new, so much so that they eventually spun Bungie out of Microsoft to be able to reach that aim.

“We wanted to do something new,” O’Donnell said. He reminded us of the fairy tale about the goose that laid the golden eggs. The important thing wasn’t the golden eggs. It was the goose. You don’t want to kill the goose laying the golden eggs, O’Donnell said.

“My slogan is be nice to the goose. And the goose is the team that lays the golden egg,” he said. “Being nice to the golden egg means you’re just going to make sequels that are dead. But if you’re nice to the team that lays the golden egg, that’s the only way to get really good new golden eggs. Certainly you don’t want to stab the goose and try to cut it open. But all I would ask of publishers and developers is to be nice to the goose, because that’s how you’re going to get more eggs.”

A beautiful industry structure


Above: DreamHaven is the new game company started by Mike and Amy Morhaime.

Image Credit: DreamHaven

And remember, if one company retreats from triple-A games, another may attack that opportunity. If Sony were to bail out of triple-A original games and shirk its responsibility, only to focus on sequels and free-to-play low-hanging fruit, it would lose its triple-A creators. They would go to another company like Nintendo or Microsoft or Epic Games or Valve or Ubisoft or Electronic Arts. You get the point.

They could also seek creative freedom in indie games or start a new triple-A studio. That sort of thing is happening, as Harold Ryan has multiple triple-A games going at Probably Monsters. If Riot Games gets a little sleepy at innovation, the former Riot veterans at Theorycraft Games, which raised $37 million, or the scrappy ex-Riot team at Hidden Leaf Games will be happy to pick up the mantle and hire the Riot leaders who prefer to work on groundbreaking titles.

As I mentioned, a record amount of money is available to the game industry’s creators at all levels, from the newly minted public company Roblox, worth $39.6 billion, to Animoca Brands, which has raised $88 million at a $1 billion valuation to make NFT games, to DreamHaven, founded by former Blizzard president Mike Morhaime and Amy Morhaime. The game industry has enough money pouring in at once to fund everything it needs and to make every game we want. It has never been like this before.

For gamers, don’t worry, be happy. And for game developers, heed what Layden said. “Find the best risks and take them. If you stay the course and keep true to the vision, you will be more delighted with the outcome.”



Facebook’s new technique helps AI systems forget irrelevant information



Facebook says it has developed an AI technique that enables machine learning models to only retain certain information while forgetting the rest. The company claims that the operation, Expire-Span, can predict information most relevant to a task at hand, allowing AI systems to process information at larger scales.

AI models memorize information without distinction — unlike human memory. Mimicking the ability to forget (or not) at the software level is challenging, but a worthwhile endeavor in machine learning. Intuitively, if a system can remember 5 things, those things should ideally be really important. But state-of-the-art model architectures focus on parts of data selectively, leading them to struggle with large quantities of information like books or videos and incurring high computing costs.

This can contribute to other problems like catastrophic forgetting, or catastrophic interference, a phenomenon in which AI systems fail to recall what they’ve learned from a training dataset. The result is that the systems have to be constantly reminded of the knowledge they’ve gained or risk becoming “stuck” with their most recent “memories.”

Several proposed solutions to the problem focus on compression. Historical information is compressed into smaller chunks, letting the model extend further into the past. The drawback, however, is “blurry” versions of memory that can affect the accuracy of the model’s predictions.


Facebook’s alternative is Expire-Span, which gradually forgets irrelevant information. Expire-Span works by first predicting which information is most important for the task at hand, based on context. It then assigns each piece of information an expiration date; once that date passes, the information is deleted from the system.
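The real method is learned inside a Transformer's attention mechanism, but the bookkeeping it implies can be sketched with a toy memory store in which each entry carries a predicted span; the scoring here is a hard-coded stand-in, not Facebook's model:

```python
# Toy illustration of expiration-based forgetting: each memory gets a predicted
# span, and once the current step passes its expiration it is dropped.
from dataclasses import dataclass, field

@dataclass
class ExpiringMemory:
    step: int = 0
    items: list = field(default_factory=list)   # (content, expires_at) pairs

    def add(self, content: str, predicted_span: int) -> None:
        """In Expire-Span the span would come from a learned relevance predictor."""
        self.items.append((content, self.step + predicted_span))

    def tick(self) -> None:
        self.step += 1
        self.items = [(c, exp) for c, exp in self.items if exp >= self.step]

mem = ExpiringMemory()
mem.add("the", predicted_span=1)              # filler word: forget quickly
mem.add("Ada Lovelace", predicted_span=100)   # rare name: keep much longer
for _ in range(5):
    mem.tick()
print([c for c, _ in mem.items])              # ['Ada Lovelace']
```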

Facebook says that Expire-Span achieves leading results on a benchmark for character-level language modeling and improves efficiency across long-context workloads in language modeling, reinforcement learning, object collision, and algorithmic tasks.

The importance of forgetting

It’s believed that without forgetting, humans would have basically no memory at all. If we remembered everything, we’d likely be inefficient because our brains would be swamped with superfluous memories.

Research suggests that one form of forgetting, intrinsic forgetting, involves a certain subset of cells in the brain that degrade the physical traces of memories, called engrams. The cells reverse the structural changes that created the memory engram, which is preserved through a consolidation process.

New memories are formed through neurogenesis, which can complicate the challenge of retrieving prior memories. It’s theorized that neurogenesis damages the older engrams or makes it harder to isolate the old memories from newer ones.

Expire-Span attempts to induce intrinsic forgetting in AI and capture the neurogenesis process in software form.

Expire-Span

Normally, AI systems tasked with, for example, finding a yellow door in a hallway may memorize information like the color of other doors, the length of the hallway, and the texture of the floor. With Expire-Span, the model can forget unnecessary information processed on the way to the door and remember only the bits essential to the task, like the color of the sought-after door.

To calculate the expiration dates of words, images, video frames, and other information, Expire-Span determines how long the information is preserved as a memory each time a new piece of data is presented. This gradual decay is key to retaining important information without blurring it, Facebook says. Expire-Span essentially makes predictions based on context learned from data and influenced by its surrounding memories.

For example, if an AI system is training to perform a word prediction task, it’s possible with Expire-Span to teach the system to remember rare words like names but forget filler words like “the,” “and,” and “of.” By looking at previous, relevant content, Expire-Span predicts if something can be forgotten or not.


Facebook says that Expire-Span can scale to tens of thousands of pieces of information and has the ability to retain less than a thousand bits of it. As a next step, the plan is to investigate how the underlying techniques might be used to incorporate different types of memories into AI systems.

“While this is currently research, we could see the Expire-Span method used in future real-world applications that might benefit from AI that forgets nonessential information,” Facebook wrote in a blog post. “Theoretically, one day, Expire-Span could empower people to more easily retain information they find most important for these types of long-range tasks and memories.”



Aurora Punks unveils DIY collective for indie game studios



Aurora Punks founder Robert Bäckström wanted to find a way to help small indie studios create innovative games while navigating the complexities of working with big publishers. So he turned to his experience as a punk rocker to create a video game collective.

Bäckström had the experience of being both a successful game developer and a former punk rocker and music publisher. He had a label, Skrammel Records, and it was all about helping scrappy punk bands survive. The rockers had a do-it-yourself (DIY) ethos about helping other bands in the hardcore scene where he grew up. They helped each other arrange shows, produce records, and deal with promotion and other tasks with limited resources, he said in an interview with GamesBeat.

“Working together is key — with minimal distance between creator, instigator, and audience,” he said.

And that’s how he has organized Aurora Punks, as a collective of indie game studios in Stockholm, Sweden. It has a new studio today, Wimasima. Bäckström said that many indie game developers have to approach publishers too soon, when they have very little done, because they have no money to develop a real prototype with a lot of promise. They often settle for deals that favor the publisher and have to make compromises to their vision. The collective has been around for about a year.

Bäckström hopes that Aurora Punks can give indie devs more time to create what they really want to create and survive on their own for longer. Bäckström has more than 15 years of experience developing and publishing games for Raw Fury and Fatshark. After Tencent bought a minority stake in Fatshark in 2019, Bäckström gathered enough money to go out on his own. He wanted to create a company that was all about having developers help each other to survive and preserve their tenacity to do things on their own.

The collective indie studios under the Aurora Punks umbrella work together, sharing resources such as expertise, funding, and network access. They can get resources and revenue to fuel them during the development cycle and develop games that are a mix of arcade, art, retro, innovation, playfulness, and passion. It is neither an angel investor nor a publisher. But it is there to ensure that one failure won’t doom a brand new studio.


Above: Aurora Punks’ studios.

Image Credit: Aurora Punks

“We give them more design freedom, without worrying about money too early,” Bäckström said.

Bäckström said that the company has a total of five game studios in the collective now, including a new one called Wimasima, which has four students trying to break into games.

“Wimasima is just the type of studio we want in the collective — highly skilled, lots of passion, and a no-prestige approach to game development — meaning they are willing to both share and receive knowledge,” Bäckström said.

Those studios have access to talent across the collective, and each member can have a say in whether to admit a new studio into the collective, Bäckström said. Wimasima joins Limit Break, Pixadome Games, Loot Locker (a backend for indies), and Upstream Arcade. All told, 24 people are now in the collective, including a half dozen or so in the main company. Aurora Punks invests around $60,000 to $180,000 per studio.

The first two games coming under the Aurora Punks banner are Robot Lord Rising, a comedic co-op arena card battle game developed by Limit Break, and Chenso Club, a 2D roguelike platformer from Pixadome Games.

With its games, Aurora Punks will help developers get their concepts ready, either for more funding or for a launch. The collective has no portfolio strategy per se, but it does look for teams of passionate developers who will be a good fit in the collective. It promises to provide them with a stable environment where creativity is the major defining factor. It doesn’t have an endless budget; in fact, it has to spend wisely. But it operates in a democratic way, with both creative freedom and transparency.

The team is spread from Malmö in the south to Boden in the north. Among the founders are seasoned members of the Swedish game industry scene, with Karl Troedsson (ex-DICE), Mathias Wiking (Starbreeze and Paradox), and Alexander Bergendahl (Avalanche, Poppermost) on the board of directors. Its studios are as far away as Upstream Arcade in the United Kingdom.

Overall, I like this idea. And it’s not coming from a former punk rocker with no credibility; it’s from a seasoned game executive. It reminds me of a remark that Shawn Layden, former chairman of Sony Worldwide Studios, made at our recent GamesBeat Summit 2021 event: young developers should hold onto, own, and develop their ideas before they go to publishers, who may want to own them.

“For young people with your ideas, try to hold on to them as long as you can,” Layden said.



GitHub now lets all developers upload videos to demo bugs and features



GitHub has officially opened up video uploads five months after launching in beta, allowing all developers to include .mp4 or .mov files directly in pull requests, discussions, issues, comments, and more.

The feature is designed to help developers visually demonstrate to project maintainers the steps they went through when they encountered a bug, for example, or illustrate what a major new code change achieves in terms of functionality.

Rather than having to follow detailed step-by-step written instructions, which may be ambiguous or unclear, maintainers can now see exactly what’s happening at the other end firsthand, which should go some way toward avoiding time-consuming back-and-forth written discussions. A video can also be paired with a voice track, with a narrator explaining the on-screen actions.


Above: Video in GitHub

It’s worth noting that with this launch, GitHub also now fully supports video uploads from within its mobile app.


Above: Uploading video to GitHub via mobile app

Seeing is believing

Native video upload support bypasses the cumbersome alternative of recording and uploading a video to a third-party platform and then sharing a link. On that note, GitHub doesn’t yet support video unfurling from shared links, but that is something it says it’s working on, alongside enabling video annotations for specific pieces of code.

At a time when the world has had to adapt to remote work and collaboration, learning to embrace asynchronous communication is one of the fundamental factors for distributed teams to succeed — recorded video plays a big part in enabling this.



Google Cloud CEO predicts boom in business-process-as-a-service



Google Cloud CEO Thomas Kurian is focusing on automation and business process improvement as the company seeks to gain ground on cloud computing competitors.

Speaking at the virtual Automation Anywhere Imagine Digital conference, Kurian said a shift in focus to business processes as a service will define enterprises’ future cloud migrations.

The traditional on-ramp to the cloud was about technical integration. That meant migrating enterprise apps to virtual machines, moving data to cloud databases, and refactoring apps to scale more efficiently with microservices, Kurian said. Now, business executives’ traditional focus on practical business problems — the search for new opportunities, improving customer experiences, and reducing costs — will come more into play.

“By abstracting the underlying technology, and making it easier and more efficient to get the business process, you can accelerate time to value,” Kurian said. “People really want to find places where they can unlock value. And the places where they unlock value are in every touchpoint with the customer.”

At the conference, Kurian weighed in on how ongoing efforts will strategically shift the on-ramp to the cloud. He said enterprises will work to streamline the way they define, execute, and operate a core business process, whether that is loan origination in a financial institution, accounts receivable in a traditional company, or order-to-cash in a manufacturer.

One of the daunting things about moving to the cloud is that businesses discover they are putting an “internal mess” on display for the world to see, Kurian said. Companies capable of cleaning up a business process mess will improve their time to market, predictability, and customer experiences.

Migrating processes to the cloud

Kurian expanded on the three-step process involved in migrating business processes to the cloud.

The first step is to use data to understand which processes would be most valuable to automate. That may be driven by cost reduction or by the pursuit of competitive advantage, such as originating loans faster than competitors do.

The second step is finding the best way to automate the business process. Process mining and process discovery can identify steps that can be streamlined or eliminated, and the resulting automations can be implemented with RPA and low-code tools.

The third step lies in using analytics on the processes that have been automated to understand how the company could get even better, said Kurian. Process analytics tools can assess the value of automations, prioritize them, and calibrate the estimates with actual results.

He said the combination of better process automation and analytics tools using Google’s AI will allow executives to track the efficiency of the processes they have instrumented for constant improvement.
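To make those three steps concrete, here is a minimal, hypothetical sketch (not Google tooling; the step names and timestamps are invented) of the kind of event-log analysis the first and third steps describe: measure where time is spent in a process so the slowest steps can be prioritized for automation and re-measured afterward.

# Minimal sketch: mine a process event log to find which steps consume the
# most time, so they can be prioritized for automation and re-measured later.
# The case IDs, step names, and timestamps below are invented for illustration.
from collections import defaultdict
from datetime import datetime

# Each event: (case_id, step_name, timestamp) -- e.g., one loan application's journey.
events = [
    ("loan-001", "application_received", "2021-04-01 09:00"),
    ("loan-001", "credit_check",         "2021-04-01 09:05"),
    ("loan-001", "manual_review",        "2021-04-02 14:30"),
    ("loan-001", "approval",             "2021-04-03 10:00"),
    ("loan-002", "application_received", "2021-04-01 11:00"),
    ("loan-002", "credit_check",         "2021-04-01 11:02"),
    ("loan-002", "manual_review",        "2021-04-05 16:45"),
    ("loan-002", "approval",             "2021-04-06 09:15"),
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

# Group events by case, then measure how long each step waited on the previous one.
by_case = defaultdict(list)
for case_id, step, ts in events:
    by_case[case_id].append((parse(ts), step))

durations = defaultdict(list)
for case_id, steps in by_case.items():
    steps.sort()
    for (prev_ts, _), (ts, step) in zip(steps, steps[1:]):
        durations[step].append((ts - prev_ts).total_seconds() / 3600.0)

# Rank steps by average elapsed hours -- the slowest are the automation candidates.
for step, hours in sorted(durations.items(), key=lambda kv: -sum(kv[1]) / len(kv[1])):
    avg = sum(hours) / len(hours)
    print(f"{step:20s} avg {avg:6.1f} hours across {len(hours)} cases")

In practice, dedicated process-mining and process-analytics products run this kind of analysis against live systems of record, but the underlying question they answer is the same one this toy example asks.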

Expanding the territory of AI

Kurian argued the combination of better process tools with AI will support the next level of abstraction for natural language processing (NLP), one of Google’s strong suits.

Google started off by enabling NLP to understand and translate words, Kurian told the conference attendees. Then the company’s engineers saw there was more value in translating sentences. When they started applying these tools to customer service, they realized it was important to be able to interpret the ongoing conversation people were having about an issue, so that customers did not have to repeat themselves each time.

Kurian said that bringing business processes into NLP will extend the boundary of what AI can do even further. For example, an expert involved in approving loans may have focused on understanding and comparing certain fields in the application when making the loan decision. Once this process is digitized, the company can scale that person's expertise by capturing their process and decisions in an RPA bot. Kurian positioned this as “the core to streamlining how organizations function, and how organizations can improve the efficiency and speed of their core processes.”
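As a purely hypothetical illustration of what capturing an expert's process and decisions might look like, the sketch below encodes a loan reviewer's field comparisons as an explicit rule that an RPA-style bot could apply at scale. The fields and thresholds are invented for the example and are not a real underwriting policy or a description of Google's products.

# Hypothetical sketch: encode a loan expert's field comparisons as an explicit
# rule so an RPA-style bot can apply the same decision at scale. The fields
# and thresholds are invented for illustration only.
from dataclasses import dataclass

@dataclass
class LoanApplication:
    annual_income: float
    requested_amount: float
    credit_score: int
    existing_debt: float

def expert_rule(app: LoanApplication) -> str:
    """Mirror the comparisons a human reviewer described making."""
    debt_to_income = (app.existing_debt + app.requested_amount) / app.annual_income
    if app.credit_score < 620:
        return "decline"
    if debt_to_income > 0.45:
        return "refer_to_human"   # edge cases still go back to the expert
    return "approve"

print(expert_rule(LoanApplication(80_000, 20_000, 700, 10_000)))  # approve
print(expert_rule(LoanApplication(50_000, 30_000, 700, 5_000)))   # refer_to_human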

Working together toward RPA advancement

For now, Google appears intent on pursuing alliances and partnerships in such key technology areas as RPA. That has occurred as players including Microsoft, IBM, and ServiceNow have acquired RPA-oriented startups.

Both Google Cloud and Automation Anywhere have made significant pivots recently to meld business services with cloud infrastructure. Just this year, these pivots aligned around a new partnership between the two to accelerate the adoption of RPA on a global scale. This included technology integration, joint solution development, and aligning sales and marketing efforts.

Google hired Kurian, a seasoned Oracle exec, to infuse a pragmatic business focus into its technology-centric commercial cloud undertaking. Meanwhile, Automation Anywhere took an extended pause on product development to migrate its core robotic process automation (RPA) platform to a native cloud architecture.

“Digitization requires complete front-to-back automation, not just for efficiency, but for competitive advantage, and that’s the vision that both our companies share,” Kurian said.

Hybrid multiclouds promise easier upgrades, but threaten data risk

Enterprises see hybrid multicloud as a promising path to new customers and digital transformation — and as a quick on-ramp to rejuvenating IT and driving new revenue models. But many enterprises err badly as they migrate decades-old legacy systems to public, private, and community clouds, accidentally allowing bad actors access to their company’s most valuable data.

Marketing claims promise enterprises they can continue to get security and value out of datacenters if they choose hybrid cloud as their future. For many enterprises, the opposite is true. Hybrid multicloud brings greater risk to data in transit and at rest, opening enterprises to more cyber threats and malicious activity from bad actors than they ever encountered before.

Getting hybrid cloud security right is hard

By definition, a hybrid cloud is an IT architecture comprising legacy IT systems integrated with public, private, and community-based cloud platforms and services. Gartner defines hybrid cloud computing as policy-based and coordinated service provisioning, use, and management across a mixture of internal and external cloud services. Hybrid clouds’ simple definition conflicts with the complexity of making them work securely and at scale.

What makes hybrid multicloud so challenging to get right from a security standpoint is how dependent it is on training people and keeping them current on new integration and security techniques. The more manual the hybrid cloud integration process, the easier it is to make an error and expose applications, network segments, storage, and APIs.

How pervasive are human-based errors in configuring multiclouds? Research firm Gartner predicts that this year 50% of enterprises will unknowingly and mistakenly expose some applications, network segments, storage, and APIs directly to the public, up from 25% in 2018. By 2023, the firm expects, nearly all (99%) cloud security failures will be traced back to manual controls not being set correctly.

What defines the dark side of hybrid multiclouds?

The promises of hybrid multiclouds need to come with a disclaimer: Your results may vary depending on how deep your team’s expertise is on multiple platforms extending into compliance and governance. Hybrid multiclouds promise to provide the following under ideal conditions that are rarely achieved in organizations today:

  • Integrate diverse cloud platforms and infrastructure across multiple vendors with little to no degradation in data latency, vendor lock-in, or security lapses.
  • Autonomously move workloads and data at scale between legacy systems, third-party on-premises systems, and the public cloud.
  • Support and securely scale edge computing environments, an area where enterprise spending is surging today. Bain’s analysis of IDC data anticipates spending on edge computing infrastructure and environments will grow at a 35% CAGR between 2019 and 2024, compared with approximately 2.5% growth of nonpublic cloud spending.

Enterprises need to work their way through the dark side of hybrid multiclouds to see any benefits. While the challenges are unique to the specific enterprise’s legacy systems, previous results in public, private, and hybrid cloud pilots and proofs-of-concept are a reliable predictor of future results.

The roots of risk

In reality, hybrid multicloud platforms are among the riskiest and most challenging of any IT infrastructure to get right. According to Bain’s Technology Report 2020: Taming the Flux, the average organization relies on 53 different cloud platform services that go beyond basic computing and storage.

Bain’s study found that CIOs say the greater the complexity of multicloud configurations, the greater the security and downtime risks their entire IT infrastructures are exposed to. CIOs also told Bain their organizations are struggling to develop, hire, and retain the talent needed to securely operate one cloud infrastructure at scale, let alone several.

That heads a list of indicators that innovative enterprises are seeing as they work to improve their hybrid multicloud security. The indicators include:

  • Lack of ongoing training and recertification. Such training helps reduce the number and severity of hybrid cloud misconfigurations, which are the leading cause of hybrid cloud breaches today, so it’s surprising more CIOs aren’t defending against them by paying for their entire teams to get certified. Each public cloud platform provider has a thriving sub-industry of partners that automate configuration options and audits. Many can catch incorrect configurations by constantly scanning hybrid cloud configurations for errors and inconsistencies (a minimal example of that kind of check appears in the sketch after this list). Automating configuration checking is a start, but a CIO needs a team to keep these scanning and audit tools optimized and current while overseeing them for accuracy. Automated checkers aren’t strong at validating unprotected endpoints, for example.
  • Automation efforts often overlook key factors. Inconsistent, often incomplete controls and monitoring across legacy IT systems need to be addressed, along with inconsistency in how public, private, and community cloud platforms are monitored and secured.
  • Lack of clarity on who owns what part of a multicloud configuration persists because IT and the line of business debate who will pay for it. It is often unclear whether a given cloud instance is the responsibility of a business-unit IT leader or the core IT team. Line-of-business leaders’ budgets are charged for hybrid multicloud integration projects that digitally transform a business model, but data and IT governance, security, and reliability can fall on the line between the business and IT, creating confusion and opening the door for bad actors searching for gaps in hybrid cloud configurations.
  • Accountability lines between cloud providers and customers get blurred as well. With cloud providers taking on more responsibility for managing all aspects of hardware and software co-hosted in their datacenters, there’s more confusion than ever over who covers the gaps in system and cybersecurity configurations.
  • The overhyped benefits of cloud elasticity and pay-as-you-go pricing for computing resources can obscure the overall picture. Important details too often get buried in complex, intricate usage invoices from public cloud providers. It’s easy to get lost in these lengthy reports and overlook essential cloud security options. Later in this series of articles, I’ll address the limitations and misconceptions of the Shared Responsibility Model.
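To make the configuration-scanning point from the first bullet concrete, here is a minimal, hypothetical sketch of the kind of check such tools automate. The resource records and rules are invented for illustration; real scanners pull live configurations from each provider's APIs and apply far larger rule sets.

# Minimal, hypothetical sketch of automated misconfiguration scanning across
# cloud resources. The resource records and rule set are invented for
# illustration; real tools read live configuration from each provider's APIs.
resources = [
    {"id": "bucket-prod-backups", "type": "object_storage", "public_read": True,  "encrypted": False},
    {"id": "sg-web-frontend",     "type": "security_group", "open_ports": [80, 443]},
    {"id": "sg-database",         "type": "security_group", "open_ports": [22, 5432], "cidr": "0.0.0.0/0"},
]

def scan(resource):
    """Return a list of findings for one resource configuration."""
    findings = []
    if resource["type"] == "object_storage":
        if resource.get("public_read"):
            findings.append("storage bucket is publicly readable")
        if not resource.get("encrypted", True):
            findings.append("storage bucket is not encrypted at rest")
    if resource["type"] == "security_group" and resource.get("cidr") == "0.0.0.0/0":
        for port in resource.get("open_ports", []):
            if port not in (80, 443):
                findings.append(f"port {port} is open to the entire internet")
    return findings

for r in resources:
    for finding in scan(r):
        print(f"[{r['id']}] {finding}")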

Mind the multicloud gaps

Lapses in compliance and governance are the costliest errors enterprises are making today when it comes to hybrid multicloud deployments. Not only are they paying fines for noncompliance, they’re also losing customers for good when customer data is compromised in a breach. Gaps between legacy systems and public, private, and community clouds that give bad actors an open door to exfiltrate customer data violate California’s CCPA and the EU’s GDPR.

Enterprises can achieve more real-time visibility and control across all cloud instances by standardizing on a small set of monitoring tools. That means trimming back the toolset so that assorted tools don’t conflict with one another.

How quickly any given business can keep reinventing itself and digitally transform how it serves customers depends on how quickly IT can adapt. Leaders must understand that hybrid multicloud is an important strategy, but the hype doesn’t match the reality. Too many organizations are leaving wide gaps between cloud platforms.

The recent high-profile SolarWinds breach exposed hybrid multicloud’s weaknesses and showed the need for Zero Trust frameworks. In the next article in this series, I’ll look at the lessons learned from the SolarWinds hack and how greater understanding can help strengthen compliance and governance of any hybrid cloud initiative. Machine learning and terrain analytics show promising potential to identify and troubleshoot hybrid multicloud security gaps as well, and this too will be explored in the upcoming series.

Data, analytics, and digital transformation

This post was written by Andrew Spanyi, president of Spanyi International.

Accurate, complete, and timely data has always been required for success with digital programs. This is even more the case when it comes to large, enterprise-wide digital transformations. Yet a recent New Vantage survey reported that just 24% of respondents said they thought their organization was data-driven, a decline from 37.8% the prior year. Just as analytical tools are coming into widespread use, requiring ever more reliable data, it’s becoming increasingly difficult to be a data-driven company. Puzzling, isn’t it?

What is the reason for this plunge in becoming data-driven? The same New Vantage survey reported that cultural challenges, not technological ones, represented the largest impediment, with as many as 92.2% of mainstream companies reporting that they have struggled with issues such as organizational alignment, business processes, change management, communication, skill sets, and resistance to change.

There is no shortage of advice on how to become more data driven. For example, SAS and TDWI suggest that better collaboration, improved data quality, and a greater focus on governance are part of the answer. Thomas H. Davenport and Nitin Mittal recommended in Harvard Business Review last year that the initiative be driven top down and that organizations pay attention to the use of cross-functional teams, along with other factors such as leading by example, providing specialized training and using analytics to help employees.

Why is it so hard?

Most executives acknowledge the importance of data in digital transformation, but when it comes to their own decision making, they are more likely to rely on intuition and gut feel. After all, it’s their many years of experience that landed them in their position of authority, isn’t it? Gathering high-quality data can also be problematic: department heads have hoarded data for decades in hard-to-access Excel spreadsheets, and IT applications developed to meet specific departmental needs often don’t communicate well with one another. Moreover, bridging data silos is difficult, as such initiatives tend to rely on the IT department, which often has more pressing priorities. And doing the analysis takes time, and it’s quite complicated. The patience needed to overcome data transparency challenges, and to wait for analytics to be carried out, is not a commonly observed trait of typical executive behavior. While there is no universal recipe, paying attention to organizational alignment, cross-functional business processes, and executive education is likely to improve the odds of success.

Improving alignment

Most executives today would agree that organizational alignment is important. In theory, strategies, organizational capabilities, resources, and management systems should all be arranged to support the enterprise’s purpose. In practice, when it comes to digital transformation, it’s complicated. When individual departments place greater emphasis on their own strategy than on the organization’s, alignment suffers. When there is greater focus on departmental variance-to-budget performance than on customer value creation, alignment weakens. This is particularly pertinent to digital transformation, because strategy, not technology, drives digital transformations. Only the CEO can provide the momentum needed to improve organizational alignment, by instructing department heads to work together in crafting a companywide strategy and to act in unison on gathering the right data and measuring what matters.

Addressing process issues

If an organization focuses solely on workflows and processes inside departmental boundaries, fragmentation drives data transparency issues and data-driven decisions suffer. An enterprise-wide, high-level process context is needed to overcome such fragmentation. According to one recent survey, 26% of respondents said they don’t have any data strategy at all, and 70% don’t have what they consider to be a mature data strategy. A back-to-basics approach is useful in creating a high-level process context, with a focus on the core activities of getting products and services developed, made, sold, and delivered. This approach highlights the 12 to 16 end-to-end processes that typically determine organizational capability for most firms. A linear depiction of these processes is not enough; an effective framework must also draw attention to the activities, the cross-functional roles, and the applications and data needed for exceptional performance.

Most organizations will find that paying attention to key cross-functional processes such as “order to delivery”, “request to resolution” and “idea to launch” can pay huge dividends in identifying what data is needed for digital success while at the same time improving customer experience. Similarly, focusing on the key internal business processes that have a major impact on employee experience, such as “requisition to onboard” and “requirements to implementation”, can create the right context and focus to drive a data-driven approach. The right foundation is created by getting people from the various departments involved in these cross-functional processes to work together in a data-driven environment to solve problems that are known to matter. For example, in the “order to delivery” process, collaboration is typically needed among sales, operations, and customer service.
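As a small, hypothetical illustration of bridging those silos, the sketch below joins invented records from sales, operations, and customer service systems on a shared order ID to produce one end-to-end view of the “order to delivery” process.

# Hypothetical sketch of bridging departmental data silos for the
# "order to delivery" process: join sales, operations, and customer service
# records on a shared order ID to get one end-to-end view. All data is invented.
from datetime import date

sales = {          # from the sales/CRM system
    "SO-100": {"ordered": date(2021, 4, 1)},
    "SO-101": {"ordered": date(2021, 4, 2)},
}
operations = {     # from the fulfillment system
    "SO-100": {"delivered": date(2021, 4, 6)},
    "SO-101": {"delivered": date(2021, 4, 12)},
}
service = {        # from the customer service / ticketing system
    "SO-101": {"complaints": 2},
}

for order_id, sale in sales.items():
    delivered = operations.get(order_id, {}).get("delivered")
    complaints = service.get(order_id, {}).get("complaints", 0)
    cycle_days = (delivered - sale["ordered"]).days if delivered else None
    print(f"{order_id}: {cycle_days} days order-to-delivery, {complaints} complaint(s)")

Even a simple joined view like this surfaces cross-functional questions, such as why one order took three times longer and whether that delay drove the complaints, that a purely departmental report would hide.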

So, it’s not just about forming cross-functional teams that combine people with different backgrounds such as data analytics, business, and technology, although that’s important too. It’s also about creating the right context, one that focuses attention, drives cross-functional collaboration, and directs management attention to highly visible business issues; that context is even more valuable. This approach is far superior to viewing data requirements one department at a time.

Providing executive training

There’s no shortage of courses on data and analytics. Wharton, the University of Toronto, and MIT are just a few of the prestigious universities with solid offerings. There’s just one problem: data and analytics can be boring in the abstract. That’s why it’s important to apply analytics to real, pressing problems in the context of end-to-end processes. Doing so, however, takes both a systemic and a systematic approach to big data and analytics within the big picture of digital transformation. That is sometimes challenging, as both CEOs and IT departments are often busy putting out fires, but it can be done with discipline. To improve the odds of success, SAS recommends paying attention to factors such as a balanced focus on developing business skills as well as technical skills, discipline in performance measurement, and an accelerated approach to change management.

How are you doing?

Instead of just thinking about deploying a given individual technology tool for the benefit of an individual department, leaders need to shift attention to deploying multiple tools with reliable, accessible data in an integrated, agile manner for the benefit of customers and the business.

Focusing on customer experience and on a set of highly visible business problems or opportunities in a process context forms the foundation for data-driven digital transformation. That’s quite different from a traditional, siloed, departmental approach, and it requires an outside-in view to drive cross-functional collaboration.

How are you doing? Consider answering the following questions.

  1. Do individual departments place greater emphasis on their own strategy than that of the organization?
  2. Is process modeling primarily focused on small processes inside of departmental boundaries?
  3. Do process improvement projects tend to have small, incremental improvement goals?
  4. Do key performance indicators (KPIs) have a visible bias toward volume and cost?
  5. Are your executives more concerned about their departments than about creating value for customers?
  6. Is organization wide restructuring carried out frequently?
  7. Do department heads view one another as competitors for the top job as opposed to collaborators?
  8. Are IT projects often launched and executed in response to individual departmental needs?

If you answered “YES” to four or more of the above questions, then your company may find it particularly challenging to apply data-based decision making in your digital programs.
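For readers who want to make the checklist repeatable, the small sketch below is illustrative only, not a validated diagnostic: it counts “yes” answers to the eight questions and applies the four-or-more threshold described above.

# Illustrative only: score the eight self-assessment questions above.
# Answer each with True ("yes") or False ("no"); four or more "yes" answers
# suggests data-based decision making will be particularly challenging.
answers = {
    "departments emphasize their own strategy over the organization's": True,
    "process modeling focuses on small, intra-departmental processes":   True,
    "improvement projects have small, incremental goals":                False,
    "KPIs are biased toward volume and cost":                            True,
    "executives care more about their department than customer value":   False,
    "organization-wide restructuring happens frequently":                False,
    "department heads see each other as rivals, not collaborators":      True,
    "IT projects are launched for individual departmental needs":        False,
}

yes_count = sum(answers.values())
print(f"{yes_count} of {len(answers)} answered yes")
if yes_count >= 4:
    print("Data-based decision making in digital programs is likely to be challenging.")
else:
    print("The organizational preconditions for data-driven transformation look healthier.")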

You are probably not alone. Tom Davenport and Randy Bean have been reporting on data-driven transformations for over eight years and have found that companies continue to struggle despite substantial investments in technology and applications. Paying attention to organizational alignment, cross-functional business processes, and executive education can change the odds of success.

Andrew Spanyi founded Spanyi International, a professional service firm providing educational, coaching, and consulting services, in 1991. Andrew’s contribution to business process management (BPM) is widely recognized. He is the author of three books, several book chapters, and over 100 articles. He has delivered speeches and workshops in more than 10 countries around the world. He has worked on over 170 major performance improvement projects for clients in industries such as aerospace, banking, government, insurance, petro-chemical, pharmaceutical, and telecommunications.
