Ally partners with Microsoft to explore quantum computing use cases in fintech

Microsoft today announced that fintech firm Ally will join its Azure Quantum program to explore how quantum computing can create opportunities in the financial sector. The two companies say they’ll apply research on quantum-inspired algorithms to understand how those techniques can be tapped to solve complex optimization challenges.

Experts believe that quantum computing, which at a high level entails the use of quantum-mechanical phenomena like superposition and entanglement to perform computation, could one day accelerate AI workloads compared with classical computers. Scientific discoveries arising from the field could transform energy storage, chemical engineering, drug discovery, financial portfolio optimization, machine learning, and more, leading to new business applications. Emergen Research anticipates that the global quantum computing market for the enterprise will reach $3.9 billion by 2027.

Using optimizers designed to run on classical hardware in the cloud, Ally says it’ll be able to explore quantum use cases without having to invest in specialized hardware. Collaborating with Microsoft as part of its Enterprise Acceleration Program, Ally plans to begin developing quantum subject-matter expertise and fostering relationships with an ecosystem of quantum computing partners.
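To make “quantum-inspired optimization on classical hardware” concrete, here is a minimal, purely illustrative sketch: a toy asset-selection problem encoded as a QUBO (quadratic unconstrained binary optimization) and solved with plain simulated annealing in Python. The matrix values, cooling schedule, and problem size are invented for the example and do not represent Ally’s data or Microsoft’s actual solvers.

```python
import math
import random

# Toy QUBO for selecting assets: diagonal entries reward picking an asset
# (negative = expected return), off-diagonal entries penalize correlated pairs.
Q = [
    [-0.12,  0.05,  0.02],
    [ 0.05, -0.10,  0.04],
    [ 0.02,  0.04, -0.08],
]

def energy(x):
    """QUBO objective x^T Q x for a binary selection vector x (lower is better)."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def simulated_annealing(n, steps=5000, t_start=1.0, t_end=0.01):
    x = [random.randint(0, 1) for _ in range(n)]
    best, best_e = x[:], energy(x)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)   # geometric cooling
        candidate = x[:]
        candidate[random.randrange(n)] ^= 1                  # flip one bit
        delta = energy(candidate) - energy(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate
            if energy(x) < best_e:
                best, best_e = x[:], energy(x)
    return best, best_e

selection, score = simulated_annealing(len(Q))
print(f"selected assets: {selection}, objective: {score:.3f}")
```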

“We have been able to benefit from the deep experience of the Microsoft Quantum research team, learn about the types of problems quantum can help solve within the financial services industry, and begin developing quantum computing skills using Microsoft’s quantum development kit,” Ally wrote in a blog post. “Through this relationship, we are gaining the insight and experience from Microsoft’s leading researchers. Leveraging Microsoft’s Azure capabilities, Ally has access to the software languages, APIs, and infrastructure to build quantum skills. This will open many exciting opportunities and empower us to explore the use of quantum computing technology across the landscape of financial services [applications].”

The problems Ally hopes to solve with quantum computing range from understanding why a customer is likely to contact a call center to portfolio management and streamlining business processes, according to Microsoft. Microsoft Quantum director Julie Love believes that future algorithms and quantum hardware could draw on Ally’s financial datasets to help professionals make decisions.

For example, Monte Carlo simulation, a popular method for analyzing risk in finance, requires many runs to achieve high confidence. That’s because Monte Carlo simulations account for uncertainty and randomness by constructing probability distributions over many possible outcomes rather than a single one. In finance, Monte Carlo methods are applied to portfolio evaluation, planning, risk evaluation, and derivatives pricing. Quantum algorithms might improve these computations because quantum computers can evaluate multiple scenarios simultaneously and, through quantum interference, reduce simulation error.
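For illustration only, the short Python sketch below uses a Monte Carlo simulation to estimate one-year value-at-risk for a hypothetical portfolio. The 7% expected return, 20% volatility, and lognormal return model are assumptions made up for the example, not figures from Ally or Microsoft.

```python
import numpy as np

rng = np.random.default_rng(seed=42)

initial_value = 1_000_000          # hypothetical portfolio value
mu, sigma = 0.07, 0.20             # assumed annual drift and volatility
n_scenarios = 100_000

# Simulate one-year terminal values under a simple lognormal (GBM) model.
annual_log_returns = rng.normal(mu - 0.5 * sigma**2, sigma, size=n_scenarios)
terminal_values = initial_value * np.exp(annual_log_returns)

# 95% value-at-risk: the loss exceeded in only 5% of simulated scenarios.
var_95 = initial_value - np.percentile(terminal_values, 5)
print(f"95% one-year VaR: ${var_95:,.0f}")
```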

“Technology has always played an important role in the financial sector, and it has been shown, time and time again, that those who lead with technology make the market … Even though large-scale quantum computers won’t be available in the near term, their future availability is something for which businesses across the board need to prepare,” Love said in a press release. “With access to Azure Quantum, Ally is preparing a quantum-ready workforce that will be able to leverage scalable hardware as it becomes available.”

Private sector adoption

While work progresses toward viable quantum computing hardware, the private sector is increasingly investing in the technology. Last month, IBM announced it would install a quantum computer at the Cleveland Clinic, marking one of the first times the company has physically placed a quantum computer on-premises. And Microsoft previously partnered with global advisory Willis Towers Watson to experiment with ways quantum computing might assist the firm with its work in insurance, financial services, and investing.

“Current modelling techniques to quantify risk require a huge amount of computing power, using thousands of computers over many hours,” Willis Towers Watson CEO John Haley said in a whitepaper. “Quantum computing offers us the chance to look at our clients’ problems in a different way. By focusing on how we would model the problems on quantum computers when they become available at scale, we are able to work with Microsoft to redefine the problems and speed up our solutions on existing hardware.”

Microsoft launched Azure Quantum in private preview two years ago alongside a developer kit, compilers, and simulators. Partnerships with quantum hardware providers Honeywell, IonQ, and QCI enable developers in the program to use existing Microsoft products — like Visual Studio and the Quantum Development Kit — along with quantum computers.

Quantum computing venture deal activity dipped 12% in 2020, but investment in the sector rose 46%, according to CB Insights, with the total raised that year reaching $365 million.

Supervised vs. unsupervised learning: What’s the difference?

At the advent of the modern AI era, when it was discovered that powerful hardware and datasets could yield strong predictive results, the dominant form of machine learning fell into a category known as supervised learning. Supervised learning is defined by its use of labeled datasets to train algorithms to classify data, predict outcomes, and more. But while supervised learning can, for example, anticipate the volume of sales for a given future date, it has limitations in cases where data falls outside the context of a specific question.

That’s where semi-supervised and unsupervised learning come in. With unsupervised learning, an algorithm is subjected to “unknown” data for which no previously defined categories or labels exist. The machine learning system must teach itself to classify the data, processing the unlabeled examples to learn from their inherent structure. In the case of semi-supervised learning — a bridge between supervised and unsupervised learning — an algorithm determines the correlations between data points and then uses a small amount of labeled data to mark those points. The system is then trained based on the newly applied data labels.

Unsupervised learning excels in domains where labeled data is scarce, but it’s not without weaknesses — nor is semi-supervised learning. That’s why, particularly in the enterprise, it helps to define the business problem that needs solving before deciding which machine learning approach to take. While supervised learning might suit classification tasks, like sorting business documents and spreadsheets, it would adapt poorly in a field like health care if asked to identify anomalies in unannotated data, like test results.

Supervised learning

Supervised learning is the most common form of machine learning used in the enterprise. In a recent O’Reilly report, 82% of respondents said their organization opted for supervised learning over unsupervised or semi-supervised learning. And according to Gartner, supervised learning will remain the type of machine learning organizations leverage most through 2022.

Why the preference for supervised learning? It’s perhaps because it’s effective in a number of business scenarios, including fraud detection, sales forecasting, and inventory optimization. For example, a model could be fed data from thousands of bank transactions, with each transaction labeled as fraudulent or not, and learn to identify patterns that led to a “fraudulent” or “not fraudulent” output.

Supervised learning algorithms are trained on input data annotated for a particular output until they can detect the underlying relationships between the inputs and output results. During the training phase, the system is fed with labeled datasets, which tell it which output is related to each specific input value. The supervised learning process progresses by constantly measuring the resulting outputs and fine-tuning the system to get closer to the target accuracy.
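As a heavily simplified illustration of that loop, the sketch below trains a scikit-learn classifier on synthetic, labeled “transactions” and reports its accuracy on held-out data; the generated dataset and model choice are assumptions for the example, not a production fraud system.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic stand-in for labeled transactions: 1 = "fraudulent", 0 = "not fraudulent".
X, y = make_classification(n_samples=5_000, n_features=20,
                           weights=[0.97, 0.03], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Fit on labeled examples, then check how well the learned mapping generalizes.
model = LogisticRegression(max_iter=1_000, class_weight="balanced")
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```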

Supervised learning requires high-quality, balanced, normalized, and thoroughly cleaned training data. Biased or duplicate data will skew the system’s understanding, with data diversity usually determining how well it performs when presented with new cases. But high accuracy isn’t necessarily a good indication of performance — it might also mean the model is suffering from overfitting, where it’s overtuned to a particular dataset. In this case, the system will perform well in test scenarios but fail when presented with a real-world challenge.

One downside of supervised learning is that a failure to carefully vet the training datasets can lead to catastrophic results. An earlier version of ImageNet, a dataset used to train AI systems around the world, was found to contain photos of naked children, porn actresses, college parties, and more — all scraped from the web without those individuals’ consent. Another computer vision corpus, 80 Million Tiny Images, was found to have a range of racist, sexist, and otherwise offensive annotations, such as nearly 2,000 images labeled with the N-word, and labels like “rape suspect” and “child molester.”

Semi-supervised learning

In machine learning problems where supervised learning might be a good fit but there’s a lack of quality data available, semi-supervised learning offers a potential solution. Residing between supervised and unsupervised learning, semi-supervised learning accepts data that’s partially labeled or where the majority of the data lacks labels.

The ability to work with limited data is a key benefit of semi-supervised learning, because data scientists spend the bulk of their time cleaning and organizing data. In a recent report from Alation, a clear majority of respondents (87%) pegged data quality issues as the reason their organizations failed to successfully implement AI.

Semi-supervised learning is also applicable to real-world problems where a small amount of labeled data would prevent supervised learning algorithms from functioning. For example, it can alleviate the data prep burden in speech analysis, where labeling audio files is typically very labor-intensive. Web classification is another potential application; organizing the knowledge available in billions of webpages would take an inordinate amount of time and resources if approached from a supervised learning perspective.
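A minimal sketch of the idea, using scikit-learn’s self-training wrapper on synthetic data in which only roughly 5% of the labels are kept (unlabeled points are marked with -1); the dataset, base model, and confidence threshold are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.semi_supervised import SelfTrainingClassifier

X, y = make_classification(n_samples=2_000, n_features=10, random_state=0)

# Keep ~5% of labels; scikit-learn's convention marks unlabeled samples as -1.
y_partial = y.copy()
unlabeled = np.random.default_rng(0).random(len(y)) > 0.05
y_partial[unlabeled] = -1

# The base classifier trains on the labeled subset, then pseudo-labels
# high-confidence points and retrains until no confident predictions remain.
model = SelfTrainingClassifier(LogisticRegression(max_iter=1_000), threshold=0.9)
model.fit(X, y_partial)
print(f"accuracy on originally unlabeled points: {model.score(X[unlabeled], y[unlabeled]):.3f}")
```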

Unsupervised learning

Where labeled datasets don’t exist, unsupervised learning — also known as self-supervised learning — can help to fill the gaps in domain knowledge. Clustering is the most common process used to identify similar items in unsupervised learning. The task is performed with the goal of finding similarities in data points and grouping similar data together.

Clustering similar data points helps to create more accurate profiles and attributes for different groups. It can also be used to reduce the dimensionality of large datasets.

Dimensionality reduction, a process that isn’t unique to unsupervised learning, decreases the number of attributes in a dataset so that the data that remains is more relevant to the problem being solved. It also cuts down on the storage space required and can improve performance.
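The sketch below combines both ideas on scikit-learn’s bundled digits dataset: PCA keeps the components that explain roughly 90% of the variance, and k-means then groups the unlabeled samples. The dataset and cluster count are illustrative assumptions.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X, _ = load_digits(return_X_y=True)      # 64-dimensional images, labels ignored

# Dimensionality reduction: keep enough components for ~90% of the variance.
X_reduced = PCA(n_components=0.90, random_state=0).fit_transform(X)

# Clustering: group similar samples together without any labels.
clusters = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(X_reduced)

print(f"reduced from {X.shape[1]} to {X_reduced.shape[1]} dimensions")
print(f"cluster sizes: {[int((clusters == k).sum()) for k in range(10)]}")
```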

Unsupervised learning can be used to flag high-risk gamblers, for example, by determining which users spend more than a certain amount on casino websites. It can also help characterize interactions on social media by learning the relationships between things like likes, dislikes, shares, and comments.

Microsoft is using unsupervised learning to extract knowledge about disruptions to its cloud services. In a paper, researchers at the company detail SoftNER, a framework that Microsoft deployed internally to collate information regarding storage, compute, and outages. They claim that it eliminated the need to annotate a large amount of training data while scaling to a high volume of timeouts, slow connections, and other product interruptions.

More recently, Facebook announced SEER, an unsupervised model trained on a billion images that ostensibly achieves state-of-the-art results on a range of computer vision benchmarks. SEER learned to make predictions from random pictures found on Instagram profile pages.

Unfortunately, unsupervised learning doesn’t eliminate the potential for bias in the system’s predictions. For example, unsupervised computer vision systems can pick up racial and gender stereotypes present in training datasets. Some experts, including Facebook chief scientist Yann LeCun, theorize that removing these biases might require a specialized training of unsupervised models with additional, smaller datasets curated to “unteach” specific biases. But more research must be done to figure out the best way to accomplish this.

Choosing the right approach

Between supervised, semi-supervised, and unsupervised learning, there’s no flawless approach. So which is the right method to choose? Ultimately, it depends on the use case.

Supervised learning is best for tasks like forecasting, classification, performance comparison, predictive analytics, pricing, and risk assessment. Semi-supervised learning often makes sense for general data creation and natural language processing. As for unsupervised learning, it has a place in performance monitoring, sales functions, search intent, and potentially far more.

As new research emerges addressing the shortcomings of existing training approaches, the optimal mix of supervised, semi-supervised, and unsupervised learning is likely to change. But identifying where these techniques bring the most value — and do the least harm — to customers will always be the wisest starting point.

Rob Kostich interview: After 400 million Call of Duty games sold, Activision still has big plans ahead

The Call of Duty franchise is one of the strongest in video games, with more than 400 million copies sold to date.

Call of Duty: Warzone and Call of Duty: Black Ops — Cold War are moving to Season 3‘s new content today, and that gave us a reason to catch up with the boss, Rob Kostich. He’s the president of Activision Publishing and the head of the Call of Duty franchise.

There have been 19 different Call of Duty games since 2003, if you count both the free-to-play battle royale Warzone, which has been downloaded 100 million times, and Call of Duty: Mobile, which has been downloaded 300 million times. The franchise isn’t fatigued yet, and it has made it through some difficult times, such as the departure of its founding developers as Call of Duty moved to multi-studio development. It made the leap to free-to-play, and its premium version is still selling extraordinarily well.

I’ve long wondered what Activision’s vision and strategy are for the franchise. I got some answers from Kostich. He’s been thinking about the metaverse, the universe of virtual worlds that are all interconnected, like in novels such as Snow Crash and Ready Player One. And he’s been contemplating how to get us to come back to some part of Call of Duty, whether it’s Warzone or a mobile platform, every day of the year.

Next week, I’ll be interviewing his boss, Activision Blizzard CEO Bobby Kotick onstage at our GamesBeat Summit 2021 event.

Here’s an edited transcript of our interview.

Above: Rob Kostich is president of Activision and head of Call of Duty.

Image Credit: Activision

GamesBeat: It seems like there’s been both deliberate and accidental steps in the evolution of Call of Duty. Zombies became this second or third experience that comes with the game. You had multiple studios launching the game, alternating every year. Then you had Warzone running year round, and Call of Duty Mobile. How do you look at what’s deliberate and what’s opportunistic in that evolution?

Rob Kostich: We’ve been planning this a lot over the last few years. The one thing we started with, we had the premium business coming out every fall with Call of Duty. We wanted to do a lot of things, and one thing we saw was continuing to pull our community closer together. That happened when, before Modern Warfare launched, we started announcing cross-progression, cross-play, new season pass, changing our monetization system. Everything we can do to bring everyone together and provide free content to our fans at the same time. The big thing we wanted to do was get the community together and get them having fun.

Warzone was the thing that was transformational to all of it. Certainly not everyone on the planet has the ability to pay $60 or the equivalent to play Call of Duty. For my money, Call of Duty is the best moment to moment action experience there is. Warzone has allowed everyone to come in and experience Call of Duty. Now it’s become the focal point, the central point, the welcome mat if you will for the franchise as we go forward.

What’s important to us is we give all of our fans an incredible fun experience with Call of Duty, whether you’re free-to-play or premium, whoever you are. In Warzone that’s the first entry point, where you’ll experience the latest and greatest the franchise has to offer. You’ll go on a narrative journey with us through time. It’s the thing that’s transformed our business. It’s made our players more excited about our premium offerings as well. They get engaged in Call of Duty, all it has to offer across Zombies and everything else.

You mentioned mobile as well. Mobile’s been an incredible way–you’ve seen the headlines, where we’ve scaled to more than 300 million downloads. We have a nice scale on that business, and now we’re also launching in China. We’ve been able to tap into new audiences unlike ever before with the franchise. What’s fascinating is–I’ve certainly been around a long time, and it’s crazy. We launched the first Call of Duty in 2003. The franchise has never been bigger, been more relevant, and impacted more people in a positive way. We’re thrilled and excited about the prospects ahead of us as we continue to evolve the franchise for our community.

GamesBeat: What else is there to do for the franchise, and how do you structure the teams going forward to do that? It seems a lot more complex than just three studios trading off each year now. You have different studios doing different pieces, like multiplayer or Zombies. How does that structure look now?

Kostich: From a structure standpoint, one of the most important things for us–we have incredible development teams. As you know, in the creative process, wanting to keep these guys accountable and passionate about the things they work on–I can tell you one thing. They’re so passionate, whether they’re working on the premium games or Warzone, and how that’s impacted the community in such a positive way.

When we launched Warzone, that was launched in partnership with Infinity Ward and Raven. Raven is now taking over Warzone in terms of live ops as we move this thing forward into the future. They’ve done an amazing job. All of our studios are collaborating and participating in that process to make sure we do this in the right way going forward, integrating our offerings together in a way that the community is excited about.

Probably the greatest news for me is I’ve never seen our studios working together better than at this point in time. They’re super collaborative. They love the opportunity. They see how people are enjoying Warzone and everything we’re doing with Call of Duty. It’s been an awesome experience for the last 12 to 18 months, how our studios have come together and are charting the course for the future of Call of Duty right now.

GamesBeat: I hear there’s something like 2,000 people working on Call of Duty. That sounds very impressive, but it also sounds like you need more.

Kostich: I don’t know if we’ve actually disclosed a total number. But we have a very big team on this. What I’ll say is that we are hiring as we move into the future. We have so many opportunities in front of us. Most of our studios are hiring very aggressively right now. In particular, we’re hiring on the mobile front. As you know, on the world’s biggest platform, I think we have incredible opportunities to expand our franchise in even greater ways. We’re hiring across console and PC development. We’re hiring across mobile development. Our opportunities are bigger than they’ve ever been, and I mean that in terms of the community and the great experiences that we can provide them as we move forward.

Above: Dean Takahashi’s Warzone report: Not very impressive, but points for persevering.

Image Credit: Activision/Twitter

GamesBeat: When you think about the most successful games in the past, people talk about market share, but it seems like what’s happening here is you’re getting a bigger share of time. How do you get people to come back to Call of Duty every day, rather than just every fall?

Kostich: We’ve gotten a bit of a crash course in that over the last year, year and a half or so, across what we’re doing in mobile and what we’re doing in console and PC as well. It’s pretty simple. We need to surprise and delight our community. We have to provide them with new ways to play, new experiences. With season three I think we’re doing a fun thing right now as we transition out of season two, into Rebirth Island in the middle, as we launch into season three. We’re providing new play spaces, new ways to play.

Our focus is continuing, in terms of Warzone, to push the battle royale genre forward in every way possible for our community. That’s what’s going to keep them coming back. Across our free-to-play and premium experiences, we need to keep pushing forward for our community. They deserve it. That’s what our development team is 100 percent committed to doing. For Warzone in particular we have plans years into the future now for the things we have to do. We’ve been thinking hard about this. We know how important it is to our fans. Our team is super excited to deliver on that for the community.

GamesBeat: Do you think fans would go for a Call of Duty metaverse?

Kostich: The opportunity is there for sure. Within Warzone we probably have more flexibility to explore things like that than ever before. We’re already starting to mix universes a bit. Most important, at its core, is that we provide an incredible Call of Duty experience to our fans, which we will absolutely do. There’s a lot of fun narrative things we can do over time now in the Call of Duty metaverse and how that evolves over the next few years.

GamesBeat: You’ve been quiet about the next Call of Duty. Are you shifting toward announcements later in the year for the new games? Last year was also fairly late in the cycle as far as revelations go.

Kostich: We’re probably shifting a bit more in that direction. Most of the reason is–you’ve seen what we have in season three this week. We have so much to talk about and so much going on that’s happening this week. We want to focus on that with the community, focus on the journey with them. Also, as you saw last year, we did some cool things in terms of integrating the reveal of Black Ops into Warzone. Those are the things we want to orchestrate and provide to our community, letting them discover Call of Duty themselves in their play experience. That part’s been fun for us and our development teams. Marketing is changing within Call of Duty, how we get the community to participate and uncover things for us. It might be happening later, but it’s all part of a broader agenda to bring the community along on a fun journey.

Above: Action in Call of Duty: Black Ops — Cold War multiplayer.

Image Credit: Activision

GamesBeat: Can you explain a bit of what it’s like behind the scenes in responding to something like Warzone’s success? It seems like there’s a period of time when the success is so surprising that you have to come up with contingency plans, changing the direction to take advantage of opportunities. At some point you become caught up with it. How has that process happened in the past year? Do you feel like you’re caught up now?

Kostich: I don’t think anyone’s going to ever rest on their laurels or feel caught up. For us it’s just always the pursuit of what else we can do for our fans. To your question, I think we have a good sense of how to operate. When we first launched this thing, we launched seasons. We’re getting smarter with seasons. You’ll see that evolve for us even further in terms of how we navigate through seasons, how we end one and begin another, what we do in the mid-season, how we surprise people throughout. We’re going to get even better on that front for the fans.

That part feels good. We need to hire more resources, but we’re just continuing to focus on innovating, pushing the genre forward, and providing incredible new play experiences for the community.

GamesBeat: How do you deal with things like the differences between the studios? Different game engines, different time frames they focus on. Then all of a sudden in Warzone you’re going to put everything in there. It seems like you may have to shoehorn things that may or may not fit.

Kostich: We’ve been very focused on that in particular. One of the most important concepts for us is to make sure we limit any friction for our community as we go forward. What that means behind the scenes is making sure that from a technology perspective, everything feels seamless to the player. That’s a big focus for us as we move forward, so that as you transition from one experience to the next, as new weapons come in and out of the game, it feels like a solid, continuous play experience that evolves into the future. That’s also come from our development teams working together to make that–as you swap in and out from Warzone or a premium experience in the future, it’s seamless for our community. It’s been another passionate point for our team, to make sure we can provide the best experience possible for our fans as we go forward.

On the narrative front, the Call of Duty universe is super rich with everything we can do. That’s the fun part, taking people on that journey as we move it into the future.

Above: Season Three for Warzone and Black Ops Cold War multiplayer is upon us.

Image Credit: Activision

GamesBeat: It feels like putting Zombies into Warzone–it does tie narratives together. It seems like it might be tough to do that every year, though, to tie narratives together so closely that it’s almost one game with one narrative. Whereas before, some of the freshness came to the franchise because there were different branches going in very different directions. How do you balance some of that? Some players might want something totally different, like World War II or Infinite Warfare, those very different directions.

Kostich: They can be very different. The interesting part about Warzone is that we can, from an event perspective, bring stuff in and out of Warzone to keep it fresh, provide a new experience, and transition to new things. There’s no rule set that says we have to transition to Zombies once and they forever stay. Zombies may come in and out of Warzone. Other events might come in and out of Warzone. We might have special play experiences for our fans as we transition from one place to the other. That’s the real fun part. That’s where the flexibility is for us. The Call of Duty universe is so rich in content and its history of eras and stories and things we do. We think that provides an incredible platform for new, fresh experiences within the Warzone environment for our players.

As I mentioned before, Warzone being the central point of things going on, people understand all the great things that are happening in the franchise. If they want to get a deeper experience with a certain aspect of Call of Duty, we have those premium experiences, which will differ. You’re very familiar with the franchise. You know how they differ very well. They tug a lot of different strings, whether you’re playing Modern Warfare or Black Ops or something historical. It’s great to get those experiences, but we can take parts of those and fuse them into Warzone in the longer term or for a limited time, making that fun and interesting for the community.

GamesBeat: You have a very strong rumor community. There’s a certain group that trades and thrives on that process. What can you do about setting the record straight or otherwise communicating more in that kind of environment? I’ve heard things like, “The guns come out overpowered and then they get nerfed, because that causes people to come back to the new season and pay more.” “Activision doesn’t care about stopping cheaters.” “Activision doesn’t care about file size.” There’s almost a conspiracy theory approach to everything that happens around the game. How do you channel that in a better direction?

Above: Verdansk, the home of Warzone, has been visited by 100 million players. Not so many have come out alive.

Image Credit: Activision

Kostich: There’s two parts to that. One is communication and the other is action. We’ll continue to do a better and better job of communicating with the community very frequently. In terms of action, to some of our points, you talk about the cheating. You’re familiar with this space. Any large-scale free-to-play game gets attacked about those not-good actors who are out there. You’ve probably seen that we’ve banned more than 475,000 accounts now. We have a dedicated security team. We’re investing more resources there to make sure we provide the best possible experience for our fans. We have to take action, and also communicate about that, which we’re going to do.

As far as other aspects of the business, it’s the same way. You talked about file size, for example. That’s an interesting one. When we launched Warzone, our goal was to make the best-looking, best-playing battle royale experience on the planet. I think we accomplished that. With that, though, there’s a bit of a file size that we recognize. We also have a team that’s continuously focused on taking down that footprint for our fans so they can better manage their inventory of games. We’re working on all the things you mentioned very aggressively on behalf of the community, and we’ll continue to do a better job of communicating with them.

As you know, it’s a very small world nowadays. News travels very fast. Sometimes it goes in weird directions for whatever reason. For us it’s about communication and action. At the bottom line, providing the best possible game imaginable for our fans. Across what we’re doing, across console and PC, across mobile, I mentioned this at the beginning, but the franchise has never been better, frankly. We’ve never had more opportunity in front of us. We’re excited, and more than anything we’re thankful for our community and their support. We’re more passionate than ever to surprise and delight them in the future.

RuneScape is coming to iOS and Android this summer

Twenty years of one of the longest-lived MMORPGs is coming to your pocket. Today, publisher Jagex announced that RuneScape is coming to iOS and Android. It’ll join Old School RuneScape on mobile (that one launched in 2018).

Jagex is taking preorders and signups now (the game has been in early access for some time), but it’s not saying how much RuneScape on mobile will cost. It’ll have crossplay and cross-progression with RuneScape on PC, so you’ll have access to all your characters, quests, gear, and more.

RuneScape joins the likes of Genshin Impact and Black Desert as crossplay, cross-progression MMORPGs on PC and mobile. As smartphones grow more powerful, expect to see more games (of many genres) offer such experiences.

Jagex already has the franchise on mobile with Old School RuneScape (which is, well, the classic version of the MMO). According to mobile research firm Sensor Tower, the classic version has received an estimated 8.6 million installs since its 2018 launch.

RuneScape Mobile has seen over 2.1 million downloads in early access over the past 18 months, according to Jagex.

Is poor data quality undermining your marketing AI?

Marketing’s potential to deliver results relies on data quality, but data accuracy, consistency, and validity continue to be a challenge for many organizations. Inconsistent data quality is holding marketing teams back from converting leads into sales, accurately tracking campaign performance, and taking on the larger challenges of optimizing product mix and product/service revenue forecasts.

The latest analytics, account-based marketing (ABM), CRM, marketing automation, and lead scoring tools all provide real-time data capture and analysis. How well these tools ensure consistent data quality directly impacts the quality of the AI and machine learning models they rely on.

Inconsistent data drives opportunities away

Marketing teams can’t deliver on their goals with bad data quality. For example, inaccurate prospect data clogs sales pipelines by slowing down efforts to turn marketing qualified leads (MQLs) into sales qualified leads (SQLs).

Problems with data quality increase the odds of failure for AI initiatives such as predictive audience offers and promotions, personalization, AI-enabled chatbots for advanced service, and automated service recovery. A quarter of organizations attempting to adopt AI report up to a 50% failure rate, IDC said recently. The leading causes of inconsistent data quality in marketing include problems with taxonomy and meta-tagging, a lack of data governance, and lost productivity.

No data consistency

The most common reason AI and ML fail in the marketing sector is that there’s little consistency to the data across all campaigns and strategies. Every campaign, initiative, and program has its unique meta-tags, taxonomies, and data structures. It’s common to find marketing departments with 26 or more systems supporting 18 or more taxonomies, each created at one point in a marketing department’s history to support specific campaigns. O’Reilly’s The State of Data Quality In 2020 survey found that over 60% of enterprises see their AI and machine learning projects fail due to too many data sources and inconsistent data. While the survey was on the organization level, it would not be a stretch to assume the failure rate would be higher within marketing departments, as it’s common to create unique taxonomies, databases, and metatags for each campaign in each region.

Above: Marketing departments face a variety of data quality issues. (O’Reilly, State of Data Quality in 2020)

Image Credit: O’Reilly

The larger, more globally based, and more fragmented a marketing department is, the harder it is to achieve data governance. The O’Reilly State of Data Quality Survey found that just 20% of enterprises publish information about data provenance or data lineage, which are essential tools for diagnosing and resolving data quality issues. Creating greater consistency across taxonomies, data structures, data field definitions, and meta-tags would give marketing data scientists a higher probability of succeeding with their ML models at scale.

Up to a third of a typical marketing team’s time is spent dealing with data quality issues, which has a direct impact on productivity, according to Forrester’s Why Marketers Can’t Ignore Data Quality study. Inaccurate data makes tactical decisions harder to get right, which can hurt revenue. Forrester found that 21 cents of every media dollar was wasted over the 12 months preceding its 2019 study due to poor data quality. Taking the time to improve data quality and consistency in marketing would convert that lost productivity into revenue.

Start with change management and data governance

Too often, marketers and the IT teams supporting them rely on data scientists to improve inconsistent data. It’s time-consuming, tedious work and can consume up to 80% or more of the data scientist’s time. It is no surprise that data scientists rate cleaning up data as their least-liked activity.

Instead of asking data scientists to solve the marketing department’s data quality challenges, it would be far better to have the marketing department focus on creating a single, unified content data model. The department should consolidate diverse data requirement needs into a single, unified model with a taxonomy rigid enough to ensure consistency, yet adaptive enough to meet unique campaign needs. Change management makes the marketer’s job easier and more productive because there is a single, common enterprise taxonomy. Data governance is key to solving this problem, and marketing leaders have to be able to explain how improving metadata consistency and content data models fits within the context of each team member’s role. After that, the marketing organization should focus on standardizing across all taxonomies and the systems supporting them.
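As a hedged, minimal example of what enforcing a unified taxonomy can look like in practice, the pandas sketch below normalizes a hypothetical campaign export and reports duplicates, missing values, and tags that fall outside a canonical channel list; the field names and taxonomy are invented for illustration.

```python
import pandas as pd

# Hypothetical campaign export with the kinds of inconsistencies described above.
leads = pd.DataFrame({
    "email":   ["a@x.com", "A@X.COM", "b@y.com", None],
    "channel": ["Paid Social", "paid_social", "Email", "display "],
})

# The single, unified taxonomy every campaign is expected to map into.
CANONICAL_CHANNELS = {"paid_social", "email", "organic_search"}

# Normalize before validating so "Paid Social" and "paid_social" collapse together.
leads["email"] = leads["email"].str.strip().str.lower()
leads["channel"] = leads["channel"].str.strip().str.lower().str.replace(" ", "_")

report = {
    "duplicate_emails": int(leads["email"].duplicated().sum()),
    "missing_emails": int(leads["email"].isna().sum()),
    "off_taxonomy_channels": sorted(set(leads["channel"]) - CANONICAL_CHANNELS),
}
print(report)
```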

The bottom line is that inconsistent data quality in marketing impacts the team by jeopardizing new sales cycles and creating confusion in customer relationships. The ability to get AI and ML pilots into production and provide insights valuable enough to change a company’s strategic direction depends on reliable data. Companies will find their marketing campaigns’ future contributions to growth are defined by how the team improves data quality today.

Microsoft details the latest developments in machine learning at GTC 21

This article is part of the VB Lab Microsoft / NVIDIA GTC insight series.


With the rapid pace of change taking place in AI and machine learning technology, it’s no surprise Microsoft had its usual strong presence at this year’s Nvidia GTC event.

Representatives of the company shared their latest machine learning innovations in multiple sessions, covering inferencing at scale, a new capability to train machine learning models across hybrid environments, and the debut of the new PyTorch Profiler that will help data scientists be more efficient when they’re analyzing and troubleshooting ML performance issues.

In all three cases, Microsoft has paired its own technologies, like Azure, with open source tools and NVIDIA’s GPU hardware and technologies to create these powerful new innovations.

Inferencing at scale

Much is made of the costs associated with collecting data and training machine learning models. Indeed, the bill for computation can be high, especially with large projects — into the millions of dollars. Inferencing, which is essentially the application of a trained model, is discussed less often in the conversation about the compute costs associated with AI. But as deep learning models become increasingly complex, they involve huge mathematical expressions and many floating point operations, even at inference time.

Inferencing is an exciting wing of AI to be in, because it’s the step at which teams like Microsoft Azure are delivering an actual experience to a user. For instance, the Azure team worked with NVIDIA to improve the AI-powered grammar checker in Microsoft Word. The task is not about training a model to offer better grammar checking; it’s about powering the inferencing engine that actually performs the grammar checking.

Given Word’s massive user base, that’s a computationally intensive task — one that has comprised billions of inferences. There are two interrelated concerns: one is technical, and the other is financial. To reduce costs, you need more powerful and efficient technology.

Nvidia developed the Triton Inference Server to harness the horsepower of those GPUs and marry it with Azure Machine Learning for inferencing. Together, they help you get your workload tuned and running well. And they support all of the popular frameworks, like PyTorch, TensorFlow, MXNet, and ONNX.

ONNX Runtime is a high-performance inference engine that leverages various hardware accelerators to achieve optimal performance on different hardware configurations. Microsoft closely collaborated with NVIDIA on the TensorRT accelerator integration in ONNX Runtime for model acceleration on Nvidia GPUs. ONNX Runtime is enabled as one backend in Triton Server.
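As a rough illustration (not Microsoft’s production setup), the sketch below loads an exported ONNX model with ONNX Runtime and requests the TensorRT execution provider first, falling back to CUDA and then CPU; the model path and input shape are placeholders.

```python
import numpy as np
import onnxruntime as ort

# Prefer TensorRT, then CUDA, then fall back to CPU if neither is available.
session = ort.InferenceSession(
    "model.onnx",  # placeholder path to an exported ONNX model
    providers=["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"],
)

input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)  # assumes an image model

outputs = session.run(None, {input_name: dummy_input})  # None = return every output
print(outputs[0].shape)
```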

Azure Machine Learning is a managed platform-as-a-service offering that does most of the management work for users. This speaks to scale, which is the point at which too many AI projects flounder or even perish. It’s where technological concerns sometimes crash into financial ones, and Triton and Azure Machine Learning are built to solve that pain point.

Making ML model training easier across on-premises, hybrid, and multi-cloud environments with Kubernetes

Creating a hybrid environment can be challenging, and the need to scale resource-intensive ML model training can complicate matters further. Flexibility, agility, and governance are key needs.

The Azure Arc infrastructure lets customers with Kubernetes assets apply policies, perform security monitoring, and more, all in a “single pane of glass.” Now, the Azure Machine Learning integration with Kubernetes builds on this infrastructure by extending the Kubernetes API. On top of that, there are native Kubernetes concepts like operators and CI/CD pipelines, and an “agent” runs on the cluster and enables customers to do ML training using Azure Machine Learning.

Regardless of a user’s mix of clusters, Azure Machine Learning lets users easily switch targets. Frameworks that the Azure Machine Learning Kubernetes native agent supports include scikit-learn, TensorFlow, PyTorch, and MPI.

The native agent smooths organizational gears, too. It removes the need for data scientists to learn Kubernetes, and the IT operators who do know Kubernetes don’t have to learn machine learning.

PyTorch Profiler

The new PyTorch Profiler, an open source contribution from Microsoft and Facebook, offers GPU performance tuning for popular machine learning framework PyTorch. The debugging tool promises to help data scientists and developers more efficiently analyze and troubleshoot large-scale deep learning model performance to maximize the hardware usage of expensive computational resources.

In machine learning, profiling is the task of examining the performance of your models. This is distinct from looking at model accuracy; performance, in this case, is about how efficiently and thoroughly a model is using hardware compute resources.

It builds on the existing PyTorch autograd profiler, enhancing it with a high-fidelity GPU profiling engine that allows users to capture and correlate information about PyTorch operations and detailed GPU hardware-level information.

PyTorch Profiler requires minimal effort to set up and use. It’s fully integrated, comprising the new torch.profiler module, the new Kineto library, and the PyTorch TensorBoard Profiler plugin. You can also visualize it all in Visual Studio Code. It’s meant for beginners and experts alike, across use cases from research to production, and it’s complementary to Nvidia’s more advanced Nsight tools.

One of PyTorch Profiler’s key features is its timeline tracing. Essentially, it shows CPU and GPU activities and lets users zoom in on what’s happening with each. You can see all the operators that are typical PyTorch operators, as well as more high-level Python models and the GPU timeline.

One common scenario that users may see in the PyTorch Profiler is instances of low GPU utilization. A tiny gap in the GPU visualization represents, say, 40 milliseconds when the GPU was not busy. Users want to optimize that empty space and give the GPU something to do. PyTorch Profiler enables them to drill down and see what the dependencies were and what events preceded that idle gap. They could trace the issue back to the CPU and see that it was the bottleneck; the GPU was sitting there waiting for data to be read by another part of the system.

Examining inefficiencies at such a microscopic level may seem utterly trivial, but if a step is only 150 milliseconds, a 40-millisecond gap in GPU activity is a rather large percentage of the whole step. Now consider that a project may run for hours, or even weeks at a time, and it’s clear why losing such a large chunk of every step is woefully inefficient in terms of getting your money’s worth from the compute cycles you’re paying for.

PyTorch Profiler also comes with built-in recommendations that guide model builders toward common problems and possible fixes. In the above example, you may simply need to increase the DataLoader’s number of workers to keep the GPU busy at all times.
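A minimal sketch of how the profiler is typically wired into a training loop, assuming a CUDA-capable machine; the model, data, and schedule values below are stand-ins rather than a real workload.

```python
import torch
from torch.profiler import ProfilerActivity, profile, schedule, tensorboard_trace_handler

model = torch.nn.Linear(512, 10).cuda()                 # stand-in for a real model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.CrossEntropyLoss()
loader = [(torch.randn(64, 512), torch.randint(0, 10, (64,))) for _ in range(20)]

with profile(
    activities=[ProfilerActivity.CPU, ProfilerActivity.CUDA],
    schedule=schedule(wait=1, warmup=1, active=3, repeat=2),   # sample a few steps
    on_trace_ready=tensorboard_trace_handler("./profiler_logs"),
    record_shapes=True,
) as prof:
    for inputs, targets in loader:
        inputs, targets = inputs.cuda(), targets.cuda()
        loss = loss_fn(model(inputs), targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        prof.step()   # tell the profiler that one training step has finished
```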

Don’t miss these GTC 2021 sessions. Watch on demand at the links below:


VB Lab Insights content is created in collaboration with a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. Content produced by our editorial team is never influenced by advertisers or sponsors in any way. For more information, contact [email protected].

Nhost is an open source Firebase rival backed by GitHub’s founders

In a world where technical talent is at a premium, businesses have to tap into the broader technology ecosystem to help build and scale their digital products. In truth, most companies probably don’t care much how their software is constructed — as long as it has all the features and functionality needed to satisfy their target market.

Against this backdrop, Swedish startup Nhost is setting out to expedite the development process with an open source backend-as-a-service (BaaS) platform that lets developers forget about the infrastructure and focus purely on the customer-facing frontend.

With Nhost, companies can automate their entire backend development and cloud infrastructure spanning file storage, databases, user authentication, APIs, and more.

“We remove a considerable amount of ongoing effort, time, and resources for tasks that are not directly related to the product our customers want to build,” Nhost CEO and cofounder Johan Eliasson told VentureBeat. “With Nhost, they can start building their customer-facing products after only one minute.”

To help fund its next stage of growth, Nhost today announced it has raised $3 million in a round of funding led by Nauta Capital, with participation from some prominent angel investors, including GitHub founders Scott Chacon and Tom Preston-Werner and Netlify founders Christian Bach and Mathias Biilmann. Existing investor Antler also participated in the round.

Above: Nhost’s website

Offload

Even the biggest technology giants with the deepest pockets look externally to boost their technology stacks. Open source software, for example, allows them to benefit from the scalability of community-driven projects. And using third-party APIs (application programming interfaces) saves them from having to develop every component of their application internally.

Nhost and its backend infrastructure are a different proposition, but the idea is the same — to help companies offload some of their requirements to a third party with domain-specific expertise.

The Stockholm-based startup was created in late 2019 with Johan Eliasson as the sole founder, though he soon realized that building what is effectively an open source alternative to Google’s Firebase would be a tall order. After he met software engineer Nuno Pato at a startup accelerator program in early 2020, the duo officially became cofounders.

The global backend-as-a-service market was pegged at $1.6 billion in 2020, according to some estimates, a figure that’s projected to rise to nearly $8 billion by 2027. Existing players such as Firebase claim major clients like Alibaba, the New York Times, Duolingo, Venmo, and Trivago, highlighting that it’s not just cash-strapped startups that want to outsource their backend management.

Open for business

One of Nhost’s major selling points is that it’s an open source project, meaning companies can do with it as they please, though Eliasson notes that the main benefits of its open source status are around “collaboration and transparency.”

There are, of course, other players in the open source BaaS space, such as Back4App, Parse, Kinvey, and Kuzzle. But Nhost considers itself distinct on a number of grounds, chief among them the scope of its single-platform offering.

Nhost offers all the required building blocks for modern software, including a PostgreSQL database, real-time GraphQL API (which is available for most major front-end frameworks, such as React, Flutter, and Vue), authentication, storage, and serverless functions that allow companies to deploy custom code. On top of that, Nhost offers a managed cloud incarnation with plans spanning hobbyists, professionals, and enterprises.
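For a sense of how little backend plumbing is left to the application team, here is a minimal, hypothetical sketch of a frontend calling a hosted GraphQL endpoint of the kind a BaaS like Nhost provisions. The endpoint URL, the `customers` table, and the token handling are illustrative assumptions, not Nhost’s actual SDK or schema.

```typescript
// Hypothetical sketch: querying a hosted GraphQL backend from the frontend.
// The endpoint, table name, and auth token are illustrative assumptions only;
// they are not Nhost's actual API surface or schema.
const GRAPHQL_ENDPOINT = "https://my-app.example.com/v1/graphql"; // provisioned by the BaaS

interface Customer {
  id: string;
  name: string;
}

async function fetchCustomers(accessToken: string): Promise<Customer[]> {
  const response = await fetch(GRAPHQL_ENDPOINT, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // JWT issued by the platform's built-in authentication service
      Authorization: `Bearer ${accessToken}`,
    },
    body: JSON.stringify({
      query: `query { customers { id name } }`,
    }),
  });

  const { data, errors } = await response.json();
  if (errors) throw new Error(JSON.stringify(errors));
  return data.customers;
}
```

In a setup like this, the database, API layer, and token issuance all live on the managed side; the application code only ever deals with queries and responses.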

“Our tech stack offers a unique combination of open source tools we haven’t seen anywhere else, plus a tremendous focus on the developer experience,” Eliasson said. “We believe that building robust and highly scalable applications should be fun, fast, and easy for everyone.”

Target market

For now, Eliasson said most Nhost customers are “indie-hackers, startups, and agencies,” and it counts around 110 paying customers. However, it has aspirations on the enterprise front, something that its seed round should help support.

“Our approach is bottom-up — indies, developers, startups, small and medium-sized teams first,” Eliasson explained. “Enterprise will have its own sales channel when the right time comes.”

Nhost is in the process of rolling out enterprise-grade features, including support for single sign-on (SSO), audit logs, and ISO certificates, which have “already been requested by larger customers,” according to Eliasson.

It’s easy to see why Nhost could prove popular with developer teams looking to spin up a quick prototype or minimum viable product (MVP), given that it removes much of the friction involved in launching even a semi-functional app. It’s also worth noting that prototypes and MVPs are how most modern software starts out, which puts Nhost in a favorable position when the time comes for developers to ramp things up.

“Nhost really shines for MVPs because the stack we chose makes that easy,” Eliasson explained. “That is important for us because there is very low friction for developers to start building, while the platform is scalable, flexible, and performant enough for when their apps get successful and need to scale.”



Comcast speed test shows 4Gbps upstream and downstream over cable



Comcast, the largest U.S. provider of gigabit broadband Internet services, has demonstrated internet speeds greater than 4 gigabits per second (Gbps) in both directions over a cable network.

The test was conducted on Broadcom’s full duplex DOCSIS 4.0 chip, which is designed to support future multi-gigabit upload and download speeds. Full duplex means the upstream speed is the same as the downstream speed. Today, most people can get gigabit-per-second download speeds, but uploads crawl along at perhaps 20 megabits per second.
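For a rough sense of what closing that gap means in practice, here is a back-of-the-envelope sketch comparing upload times at a typical 20 Mbps cable upstream and at a symmetric 4 Gbps link; the 10 GB file size is an arbitrary example, not a figure from the trial.

```typescript
// Back-of-the-envelope comparison: how long a 10 GB upload takes at a typical
// cable upstream rate (~20 Mbps) versus a symmetric 4 Gbps full-duplex link.
// The file size is an arbitrary illustration.
function uploadSeconds(fileGigabytes: number, linkMegabitsPerSecond: number): number {
  const megabits = fileGigabytes * 8 * 1000; // GB -> Gb -> Mb (decimal units)
  return megabits / linkMegabitsPerSecond;
}

console.log(uploadSeconds(10, 20));   // ~4,000 seconds, roughly 67 minutes today
console.log(uploadSeconds(10, 4000)); // ~20 seconds on a 4 Gbps symmetric link
```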

The test is part of Comcast’s long-term plan to reach 10G, or 10 gigabits per second full duplex, over its hybrid fiber-coaxial (HFC) network. An HFC network combines fast fiber-optic links with the coaxial cables that run into homes.

Supported by CableLabs, the cable industry’s research arm, Comcast is creating its 10G Platform as a multi-year, global technology initiative that will dramatically increase speed and capacity over HFC networks. During the pandemic, that kind of capacity has become critical: Comcast saw its traffic rise 32% in 2020.

Behind the trial

Above: Comcast has been laying a lot of cable.

Image Credit: Comcast

This trial begins to lay the groundwork for network operators like Comcast to deliver multigigabit download and upload speeds over connections that are already installed in hundreds of millions of homes worldwide. Cable operators have already built networks that pass 85 percent of U.S. homes.

The Broadcom chip is expected to become the world’s first production silicon to be developed using the DOCSIS 4.0 Full Duplex standard, which represents an evolutionary leap forward in the ability to deliver ultra-fast speeds over HFC networks. One of the most important breakthroughs in the DOCSIS 4.0 standard is the ability to use network spectrum more efficiently, allowing operators to dramatically increase upstream speeds without sacrificing downstream spectrum to do so, Comcast said.

A key advantage of DOCSIS 4.0 Full Duplex is that it establishes a foundation for operators to deliver multigigabit speeds over their existing networks to the connections already in hundreds of millions of homes around the world, without the need for massive digging and construction projects.

Comcast technologists in Philadelphia and Denver conducted the test by installing the Broadcom system-on-chip (SoC) in a simulated network environment to track the performance of its Full Duplex DOCSIS features, including echo cancellation and overlapping spectrum, which combine to support substantial improvements in network throughput. In the test environment, the research team demonstrated the SoC’s ability to deliver upstream and downstream throughputs of greater than 4 Gbps, and future optimization is expected to drive even greater capacity.
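As a rough illustration of the measurement itself, the sketch below converts bytes moved during a fixed window into gigabits per second for both directions at once, which is the defining property of a full-duplex test. The byte counts and the 10-second window are invented placeholders, not data from Comcast’s trial.

```typescript
// Illustrative only: computing throughput in Gbps from bytes moved in a time
// window, measured in both directions simultaneously. The byte counts below
// are made-up placeholders, not figures from the Comcast/Broadcom trial.
function gbps(bytesTransferred: number, elapsedSeconds: number): number {
  return (bytesTransferred * 8) / 1e9 / elapsedSeconds;
}

// Hypothetical 10-second measurement window in each direction at once.
const downstream = gbps(5_250_000_000, 10); // ~4.2 Gbps
const upstream = gbps(5_125_000_000, 10);   // ~4.1 Gbps
console.log({ downstream, upstream, symmetricAbove4: Math.min(downstream, upstream) >= 4 });
```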

Comcast’s vision

Above: Inside Comcast’s CTC in Philadelphia.

Image Credit: Comcast

Elad Nafshi, senior vice president of next generation access networks at Comcast Cable, said in an email to GamesBeat that the performance of the Broadcom chip exceeded expectations. He said the SoC was built by Broadcom, and the test was designed and executed by Comcast network engineers in Philadelphia and Denver, with the support of technology partners at Broadcom.

“More broadly speaking, the developments we’re seeing today on 10G are the result of global collaboration between operators, technology makers and standards bodies,” Nafshi said.

Last October, Comcast technologists were able to deliver 1.25 Gig symmetrical speeds over a live, all-digital network by leveraging advances in Distributed Access Architecture, Remote PHY digital nodes, and a cloud-based virtualized cable modem termination system platform (vCMTS).

Even as Comcast works to test and deploy Full Duplex DOCSIS to enable multigigabit upload and download speeds in the future, the company is leveraging the technologies from the October trial, along with DOCSIS 3.1 in the upstream, to increase speed and capacity in the near term. In my area, I can pay for a gigabit downstream connection.

“Comcast has been working to develop the technologies that power 10G since well before it was formally introduced in January 2019,” he said. “[We] played a key role, along with industry partners, in developing the DOCSIS 4.0 Full Duplex standard. The lab test itself was the result of several weeks of construction, preparation and design by Comcast network engineers in Philadelphia and Denver.”

He added, “We don’t have any news to share at this point about new product and service offerings, but we’ve been excited and impressed by the pace of innovation with technologies like distributed access architecture, virtualization and DOCSIS 4.0. We’re very confident that the mix of speeds we make available today are more than fast enough to meet and exceed our customers’ current needs. We’re continuing to move forward with testing and development of this technology, because we know the future will bring even greater demand, and we want to be ready for whatever comes.”



Battlefield mobile game coming in 2022 from EA and Industrial Toys