Capital One begins journey as a software vendor with the release of Critical Stack Beta

If every company is truly a software company, Capital One is out to prove it. The bank was one of the early users of Critical Stack, a tool designed to help build security into the container orchestration process. In fact, it liked the product so much that it bought the company in 2016, and today it’s releasing Critical Stack in beta.

This is a critical step toward turning Critical Stack into a commercial product, and it gives the bank its first entrée into selling software.

Capital One is embracing modern application delivery methods like containerization, and it needed a tool specifically tuned to the security requirements of a financial services company. That’s what Critical Stack purports to give it, and the bank liked the tool so much that it figured others requiring a similar level of security would too.

Critical Stack is compatible with Kubernetes, the popular container orchestration tool, but it’s been designed to provide a higher level of security than the base product, while giving large institutions like banks a packaged approach to container orchestration.

“One of the many strengths of Kubernetes is its rapid development cycle. You understand how challenging that can be to keep up with that moving target. We have an orchestration layer that has an abstraction away from that. Critical Stack is a stand-alone tool within the ecosystem of tools compatible with Kubernetes,” Liam Randall, Capital One’s senior director of software engineering and Critical Stack co-founder, told TechCrunch.

Critical Stack does everything you would expect a Kubernetes distribution to do, including container delivery and lifecycle management, but it’s specifically designed to let operations teams automate security and compliance policies around those containers, something banks and other highly regulated businesses need to do.
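Critical Stack’s policy engine itself isn’t public, but to make the idea of automated security and compliance checks concrete, here is a minimal, hypothetical Python sketch of the kind of rule an operations team might enforce against a container spec before it is admitted. The field names loosely mirror common Kubernetes conventions; none of this is Critical Stack’s actual API.

```python
# Hypothetical illustration only -- not Critical Stack's API.
# A compliance policy is simply a named predicate over a container spec.
from typing import Callable, Dict, List

Policy = Callable[[Dict], bool]

def no_privileged_containers(spec: Dict) -> bool:
    """Reject containers that ask to run in privileged mode."""
    return not spec.get("securityContext", {}).get("privileged", False)

def image_from_approved_registry(spec: Dict) -> bool:
    """Only allow images pulled from an internal, audited registry."""
    return spec.get("image", "").startswith("registry.internal.example.com/")

POLICIES: List[Policy] = [no_privileged_containers, image_from_approved_registry]

def admit(spec: Dict) -> List[str]:
    """Return the names of any policies the container spec violates."""
    return [p.__name__ for p in POLICIES if not p(spec)]

if __name__ == "__main__":
    candidate = {"image": "docker.io/nginx:latest",
                 "securityContext": {"privileged": True}}
    print(admit(candidate))  # ['no_privileged_containers', 'image_from_approved_registry']
```

In a real deployment checks like these would run automatically as part of admission or delivery, rather than as a standalone script.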

The company also concentrated on putting that kind of functionality in an interface that’s easy to use.

Photo: Critical Stack

While the company isn’t open sourcing this tool, it believes that by selling it, it can get a similar set of benefits. “When you think about a lot of the great platforms, the best lessons learned come from working with other partners,” Randall said. While he and his team found a broad set of use cases internally, they felt that getting the product into the hands of others would only help enhance it — and it doesn’t hurt that they could make some money doing it.

Featured Image: Smith Collection/Gado/Getty Images

Source: https://techcrunch.com/2017/11/21/capital-one-begins-journey-as-a-software-vendor-with-the-release-of-critical-stack-beta/?ncid=rss&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+techcrunchIt+%28TechCrunch+IT%29

HPE adds recommendations to AI tech from Nimble acquisition

When HPE acquired Nimble Storage in March for a cool billion dollars, it knew it was getting some nifty flash storage technology. But it also got Nimble’s InfoSight artificial intelligence capabilities, which monitored not only the underlying storage arrays, but also the adjacent datacenter technology.

Today, the company announced it has enhanced that technology to provide recommendations based on the body of data from Nimble’s 10,000 customers.

Bill Philbin, HPE SVP and GM for storage and big data solutions, says when companies are running applications, they need to know when things are going to go wrong before they happen. It’s not unlike sensors telling a factory owner that a machine is about to break. That warning lets you take action proactively, on your terms, before something breaks down.

“As customers look how to build datacenters more efficiently, one of the biggest areas they struggle with is how to provide an optimal applications experience for consumers,” he said. The InfoSight tool can give them insight into how to optimize the hardware these applications are running on.

The company is not only looking at the individual customer’s datacenter to make recommendations. It’s using the entire Nimble customer base (and eventually extending that to other HPE storage products) to understand what issues trigger problems. When it sees a similar set of issues across multiple customers, the system learns the optimal way to fix that and can make the appropriate recommendation on how to repair or prevent a potential problem.

The way it works is the company collects millions of data points from the storage arrays, hypervisors and virtual machines under its purview, then encrypts and anonymizes the data and sends it to the InfoSight cloud analytics engine, where the data gets processed in real time and presented to customers. The customers log into the InfoSight portal to see how the system is doing at any given moment and get recommendations on how to keep the system stable and running smoothly.
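HPE hasn’t published InfoSight’s internals, but the flow described above (collect, anonymize, analyze across the install base, recommend) can be sketched in a few lines of Python. The field names and the simple fleet-wide rule below are assumptions made purely for illustration.

```python
# Hypothetical sketch of the collect -> anonymize -> analyze -> recommend flow.
# Nothing here reflects HPE's actual InfoSight implementation.
import hashlib
from statistics import mean
from typing import Dict, List

def anonymize(sample: Dict) -> Dict:
    """Replace the customer identifier with a one-way hash before upload."""
    out = dict(sample)
    out["customer"] = hashlib.sha256(sample["customer"].encode()).hexdigest()[:12]
    return out

def recommend(fleet: List[Dict], array_sample: Dict) -> List[str]:
    """Compare one array's latency against the whole (anonymized) fleet."""
    fleet_latency = mean(s["read_latency_ms"] for s in fleet)
    tips = []
    if array_sample["read_latency_ms"] > 2 * fleet_latency:
        tips.append("Read latency is well above the fleet average; "
                    "check cache sizing or rebalance hot volumes.")
    return tips

fleet = [anonymize({"customer": "acme", "read_latency_ms": 1.1}),
         anonymize({"customer": "globex", "read_latency_ms": 0.9})]
print(recommend(fleet, {"customer": "initech", "read_latency_ms": 3.4}))
```

The production system presumably applies far more sophisticated analysis across millions of data points, but the shape of the pipeline is the same.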

In addition to the recommendation enhancements, the company is announcing InfoSight for 3Par, another storage solution, which HP bought in 2010. This gives 3Par customers access to a similar solution for that line of storage products.

Philbin says adding this technology to 3Par is just a starting point. The company eventually wants InfoSight running across the entire storage product set to provide a similarly rich set of information and recommendations for each group of storage products in the HPE family.

Featured Image: Getty Images

Source: https://techcrunch.com/2017/11/21/hpe-enhances-ai-tech-from-nimble-acquisition-with-recommendations/?ncid=rss&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+techcrunchIt+%28TechCrunch+IT%29

New Venzee tool brings data transformation and validation to your blockchain project

If the blockchain is going to be an immutable record, you need to start with clean data. The question is, how do you get clean data into a blockchain database to begin with? It’s kind of a quandary for use cases that aren’t starting with a green field, but Venzee, a startup that has been helping customers clean up their retail supply chain data to share with large vendors, thinks it has an answer.

Venzee CEO Kate Hiscox sees this as a data transformation problem, something her company has been working to solve for the past three years in the retail space. That sector is awash in spreadsheets that have to be manually prepared to share with big retailers and their database requirements, she says.

Venzee acts as a data transformation layer, taking your spreadsheet information in its current form and helping to convert that data into a format that big retailers like Amazon and Walmart can use. It brings a level of automation that, she says, has been missing from this industry, one she has worked in for 18 years.

If you think about the blockchain, it’s really just a fancy new database, one where you have to ensure that the data going in is correct or you face having bad data in a system that is supposedly immutable and irrefutable. That would be a real problem.

Hiscox’s company has created a product called Mesh to deal with this. It’s essentially a data transformation tool for the blockchain that can handle a retail supply chain scenario or any other blockchain data transformation requirement, preparing the data to be moved in the appropriate format while providing a data review phase to ensure that it’s accurate, she says.

This involves a three-phase process. In the first phase, the data is validated, a key step for blockchain data. In the second, it’s transformed to work in the new format, and finally it’s transferred.
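Venzee hasn’t published Mesh’s interface, but the three phases map naturally onto a small pipeline. The sketch below is a hypothetical Python illustration of validate, transform and transfer for a land-registry-style record (the kind of scenario described just below); the field names and the submit_to_ledger stub are assumptions, not Venzee’s product.

```python
# Hypothetical validate -> transform -> transfer pipeline (not Venzee's actual API).
from typing import Dict

def validate(record: Dict) -> Dict:
    """Phase 1: reject records that would poison an immutable ledger."""
    required = {"parcel_id", "owner", "registered_on"}
    missing = required - record.keys()
    if missing:
        raise ValueError(f"record rejected, missing fields: {sorted(missing)}")
    return record

def transform(record: Dict) -> Dict:
    """Phase 2: normalize the record into the target (chain-ready) schema."""
    return {"id": record["parcel_id"].strip().upper(),
            "owner": record["owner"].strip(),
            "registered_on": record["registered_on"]}

def transfer(record: Dict) -> str:
    """Phase 3: hand the clean record to the chain client (stubbed here)."""
    return submit_to_ledger(record)  # hypothetical client call

def submit_to_ledger(record: Dict) -> str:
    # Stand-in for whatever blockchain client a real deployment would use.
    return f"queued {record['id']} for commit"

print(transfer(transform(validate(
    {"parcel_id": " lot-42 ", "owner": "J. Doe", "registered_on": "2017-11-01"}))))
```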

Say, for example, you were moving data from a land registry to the blockchain. It would require converting the data from its current format, then validating the ownership records before putting them on the blockchain. This tool offers a path to doing that, but it also raises questions about the validation process.

If the blockchain is supposed to be a trust-based system, how do you ensure that you aren’t creating a means to alter data instead of validating it? These are questions that need to be answered, but this tool is about getting the data ready and moving it. The system will require checks and balances beyond that.

This tool is currently available in private Beta and will be released in the first half of 2018.

Featured Image: allanswart/Getty Images

Source: https://techcrunch.com/2017/11/16/venzee-middleware-tool-could-help-bring-clean-data-to-your-blockchain-project/?ncid=rss&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+techcrunchIt+%28TechCrunch+IT%29

Facebook open sources Open/R distributed networking software

Facebook is no stranger to open sourcing its computing knowledge. Over the years, it has consistently created software and hardware internally, then transferred that wisdom to the open source community. Today, continuing that tradition, it announced that it is open sourcing its modular network routing software, called Open/R.

Facebook obviously has unique scale needs when it comes to running a network. It has billions of users doing real-time messaging and streaming content at a constant clip. As with so many things, Facebook found that running the network traffic using traditional protocols had its limits, and it needed a new way to route traffic that didn’t rely on the protocols of the past.

“Open/R is a distributed networking application platform. It runs on different parts of the network. Instead of relying on protocols for networking routing, it gives us flexibility to program and control a large variety of modern networks,” Omar Baldonado, Engineering Director at Facebook, explained.

While it was originally developed for Facebook’s Terragraph wireless backhaul network, the company soon recognized it could work on other networks too, including the Facebook network backbone and even the middle of the Facebook network, he said.

Given the company’s extreme traffic requirements, with conditions changing rapidly and at such scale, it needed a new way to route traffic on the network. “We wanted to find per application, the best path, taking into account dynamic traffic conditions throughout the network,” Baldonado said.
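Open/R’s own code is what Facebook is releasing; purely to illustrate the idea of picking a best path per application as traffic conditions change, here is a small, self-contained Python sketch using Dijkstra’s algorithm over link costs that already reflect current load. The topology and costs are invented for the example.

```python
# Illustrative only -- not Open/R code. Shortest path over link costs that
# already bake in current traffic conditions (e.g. latency under load).
import heapq
from typing import Dict, List, Tuple

Graph = Dict[str, Dict[str, float]]

def best_path(graph: Graph, src: str, dst: str) -> Tuple[float, List[str]]:
    """Dijkstra's algorithm; returns (total cost, node list) for src -> dst."""
    queue = [(0.0, src, [src])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, link_cost in graph[node].items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + link_cost, nxt, path + [nxt]))
    raise ValueError("no path")

# Link costs would be refreshed as traffic conditions change.
topology: Graph = {"a": {"b": 1.0, "c": 4.0},
                   "b": {"a": 1.0, "c": 1.5},
                   "c": {"a": 4.0, "b": 1.5}}
print(best_path(topology, "a", "c"))  # (2.5, ['a', 'b', 'c'])
```

In a platform like Open/R, the hard part is keeping those link costs fresh and distributing them across the network, which this sketch leaves out entirely.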

But Facebook also recognized that it could only take this so far internally; by working with partners, other network operators and hardware manufacturers, it could extend the capabilities of this tool. It is in fact working with other companies in this endeavor, including Juniper and Arista Networks, but open sourcing the software also lets developers do things with it that Facebook might not have considered, a prospect its engineering team finds both exciting and valuable.

It’s also part of a growing trend at Facebook (and other web scale companies) to open up more and more of the networking software and hardware. These companies need to control every aspect of the process that they can, and building software like this, then giving it to the open source community lets others bring their expertise and perspective and improve the original project.

“This goes along with movement toward disaggregation of the network. If you open up the hardware and open up the software on top of it, it benefits everyone,” Baldonado said.

Featured Image: Getty Images

Source: https://techcrunch.com/2017/11/15/facebook-open-sources-open-r-distributed-networking-software/?ncid=rss&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+techcrunchIt+%28TechCrunch+IT%29

Two compliance companies merge to build a $100M firm

Once upon a time there were two compliance companies. One, Smarsh, was owned by Los Angeles-based private equity firm K1 Investment Management and worked mostly with SMBs. The other, Actiance, worked with larger companies, including the world’s biggest banks. This is the story of how K1 is bringing these two companies together.

Both companies are focused on archiving and compliance around communications. Originally that meant email, but over time it’s shifted to social, chat, mobile and other forms you see in the modern enterprise. Both help companies archive and organize this correspondence, and if there is a legal action, create a workflow to help compliance and legal get at the information law enforcement, lawyers or regulators are asking to see.

According to IDC, the market for governance, risk and compliance, which is where these two companies fall, is projected to reach almost $12 billion by 2021. Big enterprise companies in this market include the usual suspects like IBM, Oracle, OpenText, HPE and others.

When you have two companies doing fairly decently in the same sector, and you’re a PE firm, the math and logic say that if you put them together, you could have a bigger, more successful company. That’s basically the reason behind K1 going out and buying Actiance, which was founded in 1998 and raised over $43 million. K1 already owned Smarsh, and by merging the two companies, it believes the combined firm could generate $100 million in annual revenue. (K1 did not disclose the terms of the deal.)

Each company has been growing 30 percent year over year on its own, but does combining them mean they can continue that level of growth together? We are about to find out.

As Stephen Marsh, founder and CEO at Smarsh, sees it, when you combine the two companies, you won’t find a lot of overlap. “I think when we look at the combination, there are complementary technologies and customer bases and a wealth of resources in the combined organization that will enable us to cover more markets as we sell and market our solutions,” he said.

Kailash Ambwani, CEO at Actiance, not surprisingly saw it similarly. “If we look at active compliance and capture, we have real-time mitigation around social media that’s applicable to Smarsh’s customer base. Smarsh has capabilities around mobile and voice,” he said.

Of course, nothing is ever really that simple when combining companies in this manner. While the two CEOs said they would be putting the two product sets together at some point, for the short term at least, they are operating as separate companies.

Eventually, however, K1 is likely going to want to create some efficiencies inside the combined organization, and that could mean layoffs in redundant positions. The two product sets will also presumably have to become one and operate on a single platform. It’s unclear what impact the merging will have on existing customers, although Ambwani says they are talking to customers now to put them at ease.

Marsh and Ambwani (and K1) are hoping that one combined company is better than two. That idea will be put to the test in the coming months.

Featured Image: Getty Images

Source: https://techcrunch.com/2017/11/15/two-compliance-companies-are-merging-to-build-a-firm-that-could-generate-100m-in-revenue/?ncid=rss&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+techcrunchIt+%28TechCrunch+IT%29

Microsoft’s period of congenial cooperation could be over

A couple of years ago, while a guest of Marc Benioff on stage at Salesforce’s Dreamforce customer conference, Microsoft CEO Satya Nadella said something that seemed to signal a new period of amicable cooperation for his company. Several pieces of evidence now suggest that the period of friendly cooperation that was in full bloom in 2015 could be over, and not just with Salesforce.

Back in 2015, Nadella said something rather profound about the need for big brands to cooperate in the age of the cloud: “It is incumbent upon us, especially those of us who are platform vendors to partner broadly to solve real pain points our customers have.”

When you looked at the comment against the backdrop of the time, it appeared to be a giant signal that Microsoft was open to forging new agreements with competitors that would be mutually beneficial to the companies involved, and would help customers solve those real pain points he alluded to.

Essentially, Nadella was stating the obvious that in the age of the cloud, companies needed to work together more than ever before because customers were demanding it. Yet even at that time, Nadella made it clear his company fully intended to compete hard within markets against Salesforce and everyone else — the cooperation only went so far — but he saw an opportunity for his company by playing the role of affable partner.

This was in stark contrast to the model which Bill Gates and Steve Ballmer followed. Back then, it was more of a battle of large companies with full stacks trying to lock customers into their computing approach. In that world, working together wasn’t a desirable goal, which is why Nadella’s more conciliatory tone was so surprising to hear in 2015.

The same went for long-time rival Apple. After years of going at it with Apple, Microsoft was looking to soften things a bit. Perhaps Tim Cook put it best when he spoke about the partnership at BoxWorks in 2015, when that spirit was in its full glory: “Apple and Microsoft can partner on more things than we can compete on, and that is what the customer wants…Office on the Mac is a force. Partnering with Microsoft is great for our customers and that’s why we do it,” Cook told the audience.

By 2017, however, it has become increasingly clear that the message we should have listened to wasn’t the cooperation part, but the fact that Microsoft would compete hard in markets. Microsoft’s softer side under Nadella appears to be hardening a bit. The tone has shifted and gotten a bit harsher, and the company is, as Nadella told us, competing hard.

When the company beat out Salesforce last year for some CRM business at HP, Microsoft cloud head Scott Guthrie couldn’t hide his competitive glee when he called the deal a “Salesforce takeout.” Suddenly the two firms were competing a bit more fiercely, the tone was getting a bit harsher and the time for nice talk and smiles was over.

On stage last week at Dreamforce, while announcing a deal with his new bestie Diane Greene, head of Google Cloud, Benioff took a swipe right back at Microsoft’s flagship Office product. “We have 30,000 users on G Suite, and have for a very long time. Getting off of Microsoft Office was probably one of the best decisions we ever made,” Benioff said. (Who says the enterprise is boring?)

Meanwhile, last week at a talk in India, Nadella told two Indian journalists using iPads that they should get “real computers.” It was said in a joking manner, but it was clearly a swipe at Apple too. His company’s hardware is a real computer, whereas Apple’s is what? A toy computer? You can fill in the blank, I guess.

The ad campaigns over the last several years have also taken aim at Apple, pointing out the things that computers like the Microsoft Surface Pro can do that Apple computers can’t. Of course, it’s one thing for an ad to take aim at a rival; it’s another when the CEO does it.

Even as it continues to use harsher language about competitors, Microsoft is still finding ways to interoperate with rivals, and that’s not going away. At the same time, Microsoft has become a significant contributor to the open source community under Nadella, and that is unlikely to change either.

Look, you don’t expect competing companies to join hands around the campfire singing Kumbaya, but there clearly has been a shift in tone over the last couple of years. It appears that while Microsoft and its technology industry rivals are still looking for ways to make their products work together for the sake of customers, they are doing so a bit more begrudgingly now.

Featured Image: Bloomberg/Getty Images

Source: https://techcrunch.com/2017/11/14/microsofts-period-of-congenial-cooperation-could-be-over/?ncid=rss&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+techcrunchIt+%28TechCrunch+IT%29

Google Cloud Spanner update includes SLA that promises less than five minutes of downtime per year

Cloud Spanner, Google’s globally distributed cloud database, got an update today that includes multi-region support, meaning the database can be replicated across regions for lower latency and better performance. It also got an updated Service Level Agreement (SLA) that should please customers.

The latter states that Cloud Spanner databases will have 99.999% (five nines) availability, which translates into less than five minutes of downtime per year, according to Cloud Spanner product manager Deepti Srivastava.
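As a quick back-of-the-envelope check on what five nines means, the arithmetic works out to roughly five minutes of allowable downtime a year (the exact figure depends on how you count the year):

```python
# Downtime budget implied by a 99.999% availability SLA.
availability = 0.99999
minutes_per_year = 365 * 24 * 60                  # 525,600
downtime_budget = minutes_per_year * (1 - availability)
print(round(downtime_budget, 2))                  # ~5.26 minutes per year
```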

“We have engineered the system to be highly available and efficient and we expect service to be able to provide that,” she said. It’s worth noting that before Google packaged Cloud Spanner as a service, it used it in-house to run products like AdWords. From Google’s perspective, if AdWords goes down, it’s not making money, so it was engineered to stay running. Today, many of its popular services are running on Cloud Spanner.

“It’s been battle tested with mission-critical application that Google runs,” Srivastava explained.

But the product lacked support across multiple regions, meaning you could only house a Cloud Spanner database within a single location. That changed today with the announcement of multi-region support, which means that companies can put the database closer to users. That should result in a more responsive experience with lower latency.

When Google announced the Beta of Cloud Spanner earlier this year, it sounded almost magical. It is a database that gives developers the transactional consistency of a SQL database with the scalability of the NoSQL variety. It is a rare combination, and companies like Evernote and Marketo are using Cloud Spanner today.

The company claims you can be up and running in four clicks, but in reality, if you are migrating an existing database to Cloud Spanner, it could be more complex. Srivastava says it really depends on the type of system. Obviously, companies starting with a brand-new application are going to get going faster than those rearchitecting an existing database system to work on Cloud Spanner, she said.

Featured Image: Getty Images

Source: https://techcrunch.com/2017/11/14/google-cloud-spanner-update-include-sla-that-promises-less-than-five-minutes-of-downtime-per-year/?ncid=rss&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+techcrunchIt+%28TechCrunch+IT%29

Dropbox partners with Autodesk to help users collaborate on large design files

Dropbox announced a couple of products today to make it easier for Autodesk users to access and share large design files. The products include an integrated desktop app for opening and saving Autodesk files stored in Dropbox, and an app for viewing design files without needing to own Autodesk software.

These products are long overdue: Dropbox’s Ross Piper, who heads ecosystem and developer platforms, says there are 1.5 billion (with a B) Autodesk files stored in Dropbox, with 85 million more being added every month, an astonishing number considering the size and complexity of these files. But it is precisely because they are large and complex that a cloud storage solution is a compelling idea.

The companies decided to partner to help make working with these files an easier and more streamlined undertaking.

The Dropbox desktop app, which is available starting today, enables Autodesk users to open and save .dwg design files in the cloud directly from the AutoCAD application. Users simply open these files in AutoCAD, and they are pulled from Dropbox and open as normal. When users finish working on the files, they are saved back to Dropbox automatically.

Dropbox integration directly in the Autodesk application. Photo: Dropbox
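The integration itself lives inside the AutoCAD desktop app, so there is nothing for end users to script. Purely to illustrate the underlying round trip of a .dwg file to and from Dropbox storage, here is a short sketch using Dropbox’s public Python SDK; the access token and paths are placeholders, and this is not the AutoCAD integration’s own code.

```python
# Illustration of the store/retrieve round trip using Dropbox's public Python
# SDK -- this is not the AutoCAD desktop integration itself.
import dropbox

dbx = dropbox.Dropbox("YOUR_ACCESS_TOKEN")  # placeholder token

# Save a local drawing to Dropbox (overwriting any previous revision).
with open("floorplan.dwg", "rb") as f:
    dbx.files_upload(f.read(), "/designs/floorplan.dwg",
                     mode=dropbox.files.WriteMode("overwrite"))

# Later, pull the latest revision back down for editing.
metadata, response = dbx.files_download("/designs/floorplan.dwg")
with open("floorplan.dwg", "wb") as f:
    f.write(response.content)
print(f"synced {metadata.name}, rev {metadata.rev}")
```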

In addition, Dropbox announced a native viewer app, coming soon, that will enable Autodesk users to share design files with people who don’t own the Autodesk software. What’s more, those users will be able to comment on the files, making it easier for architects and project managers to share changes even when contractors, customers and other interested parties don’t own the core product.

For instance, you can look at an architect’s drawing and select a room or area, then comment specifically about that area.

Photo: Dropbox

Users can download these new tools from Autodesk’s AutoCAD App Store, install them and they are good to go.

This announcement is part of a broader play by Dropbox to have third-party partners like Autodesk integrate the Dropbox product more directly into business applications where people are doing work, rather than having to open Dropbox explicitly to grab these files.

It’s worth noting that Box has had a similar partnership with Autodesk in place for a couple of years.

Source: https://techcrunch.com/2017/11/14/dropbox-partners-with-autodesk-to-help-users-collaborate-on-large-design-files/?ncid=rss&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+techcrunchIt+%28TechCrunch+IT%29

The CNCF just got 36 companies to agree to a Kubernetes certification standard

The Cloud Native Computing Foundation (CNCF) announced today that 36 members have agreed to a set of certification standards for Kubernetes, the immensely popular open source container orchestration tool. This should make it easy for users to move from one version to another without worry, while ensuring that containers under Kubernetes management will behave in a predictable way.

The group of 36 is agreeing to a base set of APIs that have to underlie any version of Kubernetes a member creates, in order to guarantee portability. Dan Kohn, executive director at CNCF, says they took a subset of existing Kubernetes project APIs, which are treated as a conformance test that the members who have signed on are guaranteeing to support. In practice this means that when you spin up a new container, regardless of who created the version of Kubernetes, it will behave in a consistent way, he said.
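The real conformance suite lives in the Kubernetes project itself; as a purely hypothetical illustration of what “a subset of APIs treated as a conformance test” means, here is a small Python sketch that runs a list of named checks against a description of what a candidate distribution serves and reports pass or fail. The checks and the data are invented.

```python
# Hypothetical sketch of a conformance-style test runner -- not the real
# CNCF/Kubernetes conformance suite.
from typing import Callable, Dict, List, Tuple

Check = Tuple[str, Callable[[Dict], bool]]

CHECKS: List[Check] = [
    ("pods API is served",       lambda api: "pods" in api["resources"]),
    ("services API is served",   lambda api: "services" in api["resources"]),
    ("configmaps API is served", lambda api: "configmaps" in api["resources"]),
]

def run_conformance(api_surface: Dict) -> bool:
    """Every check must pass for the distribution to be considered conformant."""
    ok = True
    for name, check in CHECKS:
        passed = check(api_surface)
        ok = ok and passed
        print(f"{'PASS' if passed else 'FAIL'}: {name}")
    return ok

# A made-up description of what a candidate distribution serves.
candidate = {"resources": ["pods", "services", "configmaps", "deployments"]}
print("conformant" if run_conformance(candidate) else "not conformant")
```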

Kohn said the organization has been able to bring many of the industry’s biggest names onboard. “We are thrilled with the list of members. We are confident that this will remain a single product going forward and not fork,” he said.

Forking is what happens when some companies break off on their own within an open source project, creating a new and possibly incompatible version of the software. The CNCF wanted to ensure this didn’t happen, and so apparently did many of its powerful members, including Microsoft, Red Hat, Alibaba, Oracle, Google and IBM, along with many others.

AWS, the biggest force in public cloud computing, was not among the companies signing, but the CNCF says this is simply because the company has yet to create its own version of Kubernetes (although it supports Kubernetes clusters running on AWS). When AWS joined the CNCF in August, it was a major proof point that the CNCF and Kubernetes had arrived.

Make no mistake, it is a huge and somewhat miraculous occurrence to get this many diverse technology companies to agree to anything, but Kohn says most of the organization came together fairly quickly around the forking concern.

“Kubernetes is skyrocketing and everyone is adopting it. When you have a high level of engagement and adoption, there is a concern whether the project is going to fork. If I have an app that works on one version, will it work on another one,” Kohn told TechCrunch.

Kubernetes has indeed become a de facto standard in the last year, with just about every big name in tech joining the CNCF. Today’s announcement is about bringing some level of discipline to a growing project, and it’s a significant step forward in the maturation of Kubernetes as an open source project.

Source: https://techcrunch.com/2017/11/13/the-cncf-just-got-36-companies-to-agree-to-a-kubernetes-certification-standard/?ncid=rss&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+techcrunchIt+%28TechCrunch+IT%29

IBM makes 20 qubit quantum computing machine available as a cloud service

IBM has been offering quantum computing as a cloud service since last year, when it came out with a 5-qubit version of the advanced computers. Today, the company announced that it’s releasing 20-qubit quantum computers, quite a leap in just 18 months. A qubit is a single unit of quantum information.

The company also announced that IBM researchers had successfully built a 50-qubit prototype, which is the next milestone for quantum computing, but it’s unclear when it will be commercially available.

While the earliest versions of IBM’s quantum computers were offered for free to build a community of users, and help educate people on programming and using these machines, today’s announcement is the first commercial offering. It will be available by the end of the year.

Quantum computing is a difficult area of technology to understand. Instead of being built on machines interpreting zeroes and ones in on/off states, quantum computers can live in multiple states. This creates all kinds of new programming possibilities and requires new software and systems to build programs that can work with this way of computing.

Dario Gil, IBM Research VP of AI and IBM Q, says the increased number of qubits is only part of the story. The more qubits you deal with, the more complex the qubit interactions become, because they interact with one another in a process called entanglement. If you have more qubits but a high error rate as they interact, they might not be any more powerful than a 5-qubit machine with a lower error rate. He says IBM researchers have managed to achieve the higher qubit count with low error rates, making the machines highly useful to researchers. “We have more qubits and less errors, which is combined to solve more problems,” Gil said.
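To make the idea of qubit interaction a little more concrete, here is a minimal two-qubit entanglement example in Python using Qiskit, the SDK IBM mentions later in this piece (the exact API has evolved across releases, so treat this as a sketch assuming a reasonably recent version):

```python
# Minimal two-qubit entanglement (Bell state) example; assumes a recent
# Qiskit release -- API details have shifted since the SDK was announced.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into superposition
qc.cx(0, 1)    # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state)   # amplitude split between |00> and |11>: the qubits are correlated
```

The printed statevector has its amplitude divided between the |00⟩ and |11⟩ states, which is what entanglement means here: measuring one qubit tells you the outcome of the other.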

The other issue that comes into play when dealing with quantum states is that they tend to exist only for a short period of time, a property known as coherence. It basically means you have a brief window of time before the qubits revert to a classical computing state of zeroes and ones. To give you a sense of how this coherence time has been progressing, it was just a few nanoseconds when researchers started looking at this in the late ’90s. Even as recently as last year, researchers were able to achieve coherence times of 47 and 50 microseconds for the 5-qubit machines. Today’s quantum machines are in the 90-microsecond range. While that doesn’t sound like much, it’s actually a huge leap forward.

All of these variables make it difficult for a programmer to build a quantum algorithm that can achieve something useful without errors and before it reverts to a classical state, but that doesn’t take away from just how far researchers have come in recent years, and how big today’s announcement is in the quantum computing world.

The ultimate goal of quantum computing is a fault tolerant universal system that automatically fixes errors and has unlimited coherence. “The holy grail is fault-tolerant universal quantum computing. Today, we are creating approximate universal, meaning it can perform arbitrary operations and programs, but it’s approximating so that I have to live with errors and a [limited] window of time to perform the operations,” Gil explained.

He sees this as an incremental process, and today’s announcement is a step along the path, but he believes that even what they can do today is quite powerful. With today’s release and the improvements that IBM made to QISKit, a software development kit (SDK) to help companies understand how to program quantum computers, they can continue to advance the technology. It’s not going to happen overnight, but companies, governments, universities and interested parties are undertaking research to see how this can work in practical applications. (And of course, IBM isn’t the only company working on this problem.)

IBM sees applications for quantum computing in areas like medicine, drug discovery and materials science as this technology advances and becomes better understood. It is also trying to anticipate possible negative consequences of such an advanced technology, such as the ability to eventually break encryption. Gil says they are working with standards bodies to try to develop post-quantum encryption algorithms, and while they are a long way from achieving that, they certainly seem to understand the magnitude of the issues and are trying to mitigate them.

Source: https://techcrunch.com/2017/11/10/ibm-passes-major-milestone-with-20-and-50-qubit-quantum-computers-as-a-service/?ncid=rss&utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+techcrunchIt+%28TechCrunch+IT%29
