Customers exploring blockchain for their applications and solutions typically start with a prototype or proof of concept before moving on to build, pilot, and production rollout. In those later stages, beyond ease of deployment, customers expect flexibility in configuration: the number of blockchain members in the consortium, the size and number of nodes, and ease of management after deployment.
We are sharing the release of a new Hyperledger Fabric on Azure Kubernetes Service marketplace template in preview. Any user with minimal knowledge of Azure or Hyperledger Fabric can now set up a blockchain consortium on Azure using this solution template by providing a few basic input parameters.
This template helps customers deploy a Hyperledger Fabric (HLF) network on Azure Kubernetes Service (AKS) clusters in a modular manner, providing the customization that production deployments require: choice of Microsoft Azure Virtual Machine series, number of nodes, fault tolerance, and so on. Azure Kubernetes Service provides enterprise-grade security and governance, making it easy to deploy and manage containerized applications. Customers can use native Kubernetes tools for management-plane operations on the infrastructure, and call Hyperledger Fabric APIs or the Hyperledger Fabric client software development kit for data-plane workflows.
The template has various configurable parameters that make it suitable for production-grade deployment of Hyperledger Fabric network components.
Top features of Hyperledger Fabric on Azure Kubernetes Service template are:
Supports deployment of Hyperledger Fabric version 1.4.4 (LTS).
Supports deployment of orderer organization and peer nodes with the option to configure the number of nodes.
Supports Fabric Certificate Authority (CA) with self-signed certificates by default, and an option to upload organization-specific root certificates to initialize the Fabric CA.
Supports LevelDB or CouchDB as the world state database on peer nodes.
The ordering service runs the highly available Raft-based consensus protocol, with an option to choose 3, 5, or 7 nodes.
Supports configuring the number and size of nodes in the Azure Kubernetes Service clusters.
Exposes a public IP for each deployed AKS cluster for networking with other organizations.
Helps you jump-start building your network with sample scripts for post-deployment steps such as creating consortiums and channels and adding peer nodes to a channel.
Includes a Node.js sample application that exercises native Hyperledger Fabric APIs, such as generating new user identities and running custom chaincode.
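As a back-of-the-envelope illustration of the Raft sizing options above (3, 5, or 7 orderer nodes): a Raft cluster of n voters needs a quorum of floor(n/2) + 1 and therefore tolerates floor((n - 1)/2) crashed nodes. A minimal sketch:

```python
def raft_fault_tolerance(n: int) -> int:
    """Crashed orderer nodes a Raft cluster of size n can tolerate
    while still forming a quorum of floor(n/2) + 1 voters."""
    return (n - 1) // 2

for n in (3, 5, 7):
    print(f"{n} orderers: quorum {n // 2 + 1}, tolerates {raft_fault_tolerance(n)} failure(s)")
```

This is also why even cluster sizes buy nothing: 4 nodes tolerate the same single failure as 3, so only odd sizes are offered.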
To know more about how to get started with deploying Hyperledger Fabric network components, refer to the documentation.
What’s coming next
Microsoft Visual Studio code extension support for Azure Hyperledger Fabric instances
What more do we have for you? The template and consortium sample scripts are open-sourced in the GitHub repo, so the community can use them to build customized versions.
Providing our customers with choice and flexibility is central to our mission around blockchain in Azure. Today, we are pleased to announce that we’re bringing managed Corda Enterprise to Azure Blockchain Service.
The road to Corda Enterprise on Azure as a managed service
In 2017, our relationship with R3 matured into a partnership, and in the subsequent years we’ve worked closely with customers, consortiums, and independent software vendors (ISVs) to help them bring Corda-based solutions to Azure. Working together with our customers and partners, we’ve seen the launch of multiple Corda consortiums on Azure, from Insurwave’s launch in 2018 to the recent September 2019 announcement of TradeIX’s launch of the Marco Polo Network on Azure.
As customers were building end to end solutions, one of the big requests was to make integrating Corda with enterprise data, systems, and Software as a Service (SaaS) easier. Earlier this year, we released the Corda Logic App and Flow Connectors that brought 30 years of Microsoft enterprise integration experience to Corda. With Flow and PowerApps, it also became possible for citizen developers to build low-code or no-code web and mobile apps for Corda.
However, the biggest request we had from customers was for Corda to be released as a managed service in Azure. Specifically, a Platform as a Service (PaaS) offering that would set up Corda nodes to connect with the appropriate Corda network, manage node health, and update both the nodes and the underlying software.
Today at CordaCon, we’re pleased to share that customers can now sign up for the preview of Corda Enterprise on Azure Blockchain Service.
Simple Corda node deployment
Corda on Azure Blockchain Service provides you with the ability to choose where to provision and host nodes, either on the Corda Network (Livenet, Testnet, UAT) or a private Corda network.
For the preview, Azure Blockchain Service supports the latest Corda Enterprise version (currently 4.x). In addition to provisioning the node, Azure Blockchain Service automatically connects the Corda node to the appropriate network based on your Azure Blockchain Service configuration. Because the node is part of Azure Blockchain Service, you can configure and deploy it within the Azure portal or programmatically through REST APIs, the CLI, or PowerShell. This dramatically simplifies Corda node deployment and connection.
Managed Corda nodes and Corda Distributed Applications
In addition to provisioning and deploying Corda nodes, Azure Blockchain Service provides managed APIs to help you manage your Corda nodes and Corda Distributed Applications (CorDapps). With Corda node management, you’ll be able to control access to your node, scale the node up or down, and drive flow draining. With CorDapp management, you’ll be able to easily add, manage, and version your CorDapps on your node.
Integrated node and CorDapp health, monitoring, and logging
Corda on Azure Blockchain Service leverages Azure Monitor making it easier to access Corda node and CorDapp health, monitoring, and logging information. With Azure Monitor, you’re able to customize alerts and actions based on logs and events. With all Corda and CorDapp logs at your fingertips, you’re able to create custom visualizations and dashboards based on the health and monitoring data.
If you are building a solution on Corda Enterprise and are interested in joining the preview, please fill out the following form.
For those of you at CordaCon this week who would like to learn more, please come visit us at our booth or attend our Fully Managed Corda Enterprise with Azure Blockchain Service session on October 24th to speak with members of the Azure Blockchain team.
Crypto can’t scale because of consensus … yet Amazon DynamoDB does over 45 Million TPS
The metrics point to crypto still being a toy until it can achieve real world business scale demonstrated by Amazon DynamoDB
14 transactions per second. No matter how passionate you may be about the aspirations and future of crypto, it’s the metric that points out that when it comes to actual utility, crypto is still mostly a toy.
After all, pretty much any real world problem, including payments, e-commerce, remote telemetry, business process workflows, supply chain and transport logistics, and others require many, many times this bandwidth to handle their current business data needs — let alone future ones.
Unfortunately the crypto world’s current solutions to this problem tend to either blunt the advantages of decentralization (hello, sidechains!) or look like clumsy bolt-ons that don’t close the necessary gaps.
Real World Business Scale
Just how big is this gap, and what would success look like for crypto scalability? We can see an actual example of both real-world transaction scale and what it would take to enable migrating actual business processes to a new database technology by taking a look at Amazon’s 2019 Prime Day stats.
The AWS web site breaks down Amazon retail’s adoption and usage of NoSQL (in the form of DynamoDB) nicely:
Amazon DynamoDB supports multiple high-traffic sites and systems including Alexa, the Amazon.com sites, and all 442 Amazon fulfillment centers. Across the 48 hours of Prime Day, these sources made 7.11 trillion calls to the DynamoDB API, peaking at 45.4 million requests per second.
45 million requests per second. That’s six zeros more than Bitcoin or Ethereum. Yikes. And this is just one company’s traffic, and only a subset at that (after all, Amazon is a heavy user of SQL databases as well as DynamoDB), so the actual TPS DynamoDB handles at peak is even higher than the number above.
Talk about having a gap to goal…and it doesn’t stop there. If you imagine using a blockchain (with or without crypto) for a real-world e-commerce application and expect it to support multiple companies in a multi-tenanted fashion, want it to replace legacy database systems, and need a little headroom to grow, a sane target might look like 140 million transactions per second.
That’s seven orders of magnitude from where we are today.
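The arithmetic behind these claims is easy to check. A quick sketch using the Prime Day figures quoted above (the 140M TPS target is this article's illustrative number, not an AWS figure):

```python
import math

PRIME_DAY_CALLS = 7.11e12   # DynamoDB API calls across the 48 hours of Prime Day
WINDOW_SECONDS = 48 * 3600
PEAK_RPS = 45.4e6           # AWS-reported peak requests per second
CRYPTO_TPS = 14             # oft-quoted Bitcoin throughput
TARGET_TPS = 140e6          # the multi-tenant "headroom" target from this article

avg_rps = PRIME_DAY_CALLS / WINDOW_SECONDS
print(f"sustained average: {avg_rps / 1e6:.1f}M requests/s")   # roughly 41M
print(f"peak vs. crypto:   {math.log10(PEAK_RPS / CRYPTO_TPS):.1f} orders of magnitude")
print(f"target vs. crypto: {math.log10(TARGET_TPS / CRYPTO_TPS):.1f} orders of magnitude")
```

Even the sustained average, not just the peak, sits more than six orders of magnitude above 14 TPS.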
The Myth of Centralization
Why are these results so different? Let’s examine this dichotomy a little more closely. First, note that DynamoDB creates a fully ordered ledger, known as a stream, for each table. Each stream is totally ordered and immutable; once a record is emitted, it never changes.
DynamoDB is doing its job by using a whole lot of individual servers communicating over a network to form a distributed algorithm that has a consensus algorithm at its heart.
Cross-table updates are given ACID properties through a transactional API. DynamoDB’s servers don’t “just trust” the network (or other parts of itself), either — data in transit and at rest is encrypted with modern cryptographic protocols and other machines (or the services running on them) are required to sign and authenticate themselves when they converse.
Any of this sound familiar?
The classic, albeit defensive, retort to this observation is, “Well, sure, but that’s a centralized database, and decentralized data is so much harder that it just has to be slower.” This defense sounds sort of plausible on the surface, but it doesn’t survive closer inspection.
First, let’s talk about centralization. A database running in single tenant mode with no security or isolation can be very fast indeed — think Redis or a hashtable in RAM, either of which can achieve bandwidth numbers like the DynamoDB rates quoted above. But that’s not even remotely a valid model for how a retail giant like Amazon uses DynamoDB.
Different teams within Amazon (credit card processing, catalog management, search, website, etc.) do not get to read and write each other’s data directly; these teams essentially assume they are mutually untrustworthy as a defensive measure. In other words, they make a similar assumption to the one a cryptocurrency blockchain node makes about other nodes in its network!
On the other side, DynamoDB supports millions of customer accounts. It has to assume that any one of them can be an evildoer and that it has to protect itself from customers and customers from each other. Amazon retail usage gets exactly the same treatment any other customer would…no more or less privileged than any other DynamoDB user.
Again, this sounds pretty familiar if you’re trying to handle money movement on a blockchain: You can’t trust other clients or other nodes.
These business-level assumptions are too similar to explain a seven-order-of-magnitude difference in performance. We’ll need to look elsewhere for an explanation.
Is it under the hood?
Now let’s look at the technology…maybe the answer is there. “Consensus” often gets thrown up as the reason blockchain bandwidth is so low. While DynamoDB tables are independent outside of transaction boundaries, it’s pretty clear that there’s a lot of consensus, in the form of totally ordered updates, many of which represent financial transactions of some flavor in those Prime Day stats.
Both blockchains and highly distributed databases like DynamoDB need to worry about fault tolerance and data durability, so they both need a voting mechanism.
Here’s one place where blockchains do have it a little harder: Overcoming Byzantine attacks requires a larger majority (2/3 +1) than simply establishing a quorum (1/2 +1) on a data read or write operation. But the math doesn’t hold up: At best, that accounts for 1/6th of the difference in bandwidth between the two systems, not 7 orders of magnitude.
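To make that point concrete, here is a sketch comparing a simple-majority quorum with a two-thirds supermajority. The extra votes Byzantine resistance demands grow the quorum by only a small constant factor, nowhere near seven orders of magnitude:

```python
def crash_quorum(n: int) -> int:
    """Simple majority (1/2 + 1) needed to commit a read or write
    in a crash-fault-tolerant replicated store."""
    return n // 2 + 1

def byzantine_quorum(n: int) -> int:
    """Supermajority (2/3 + 1) needed when up to a third of voters may lie."""
    return (2 * n) // 3 + 1

for n in (7, 21, 100):
    extra = byzantine_quorum(n) - crash_quorum(n)
    print(f"n={n}: majority {crash_quorum(n)}, supermajority {byzantine_quorum(n)} (+{extra} votes)")
```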
What about Proof of Work? Ethereum, Bitcoin and other PoW-based blockchains intentionally slow down transactions in order to be Sybil resistant. But if that were the only issue, PoS blockchains would be demonstrating results similar to DynamoDB’s performance…and so far, they’re still not in the ballpark. Chalk PoW-versus-PoS up to a couple orders of magnitude, though — it’s at least germane as a difference.
How about the network? One difference between two nodes that run on the open Internet and a constellation of servers in (e.g.) AWS EC2 is that the latter run on a proprietary network. Intra-region, and especially intra-Availability Zone (“AZ”) traffic can easily be an order of magnitude higher bandwidth and an order of magnitude lower latency than open Internet-routed traffic, even within a city-sized locale.
But given that most production blockchain nodes at companies like Coinbase are running in AWS data centers, this also can’t explain the differences in performance. At best, it’s an indication that routing in blockchains needs more work…and still leaves 3 more orders of magnitude unaccounted for.
What about the application itself? Since the Amazon retail results are for multiple teams using different tables, there’s essentially a bunch of implicit sharding going on at the application level: Two teams with unrelated applications can use two separate tables, and neither DynamoDB nor these two users will need to order their respective data writes. Is this a possible semantic difference?
For a company like Amazon retail, the teams using DynamoDB “know” when to couple their tables (through use of the transaction API) and when to keep them separate. If a cryptocurrency API requires the blockchain to determine on the fly whether (and how) to shard by looking at every single incoming transaction, then there’s obviously more central coordination required. (Oh, the irony.)
But given that we have a published proof point here that a large company obviously will perform application level sharding through its schema design and API usage, it seems clear that this is a spurious difference — at best, it indicates an impoverished API or data model on the part of crypto, not an a priori requirement that a blockchain has to be slow in practice.
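To illustrate what application-level sharding looks like, here is a minimal sketch (hypothetical names; not DynamoDB's actual implementation) in which the client routes unrelated keys to independent, separately ordered logs:

```python
import zlib
from collections import defaultdict

NUM_SHARDS = 4

def shard_of(key: str) -> int:
    """Client-side, deterministic key-to-shard routing; the application
    chooses the key, so unrelated keys need no cross-shard ordering."""
    return zlib.crc32(key.encode()) % NUM_SHARDS

ledgers = defaultdict(list)  # one totally ordered log per shard

def append(key: str, tx: str) -> None:
    # Ordering happens only within a shard; shards never coordinate.
    ledgers[shard_of(key)].append(tx)

append("orders/team-a", "tx1")   # two teams with unrelated applications:
append("catalog/team-b", "tx2")  # no global total order required
```

Because only writes that land in the same shard contend for ordering, throughput scales roughly with the number of shards; cross-shard work is pushed into an explicit transaction API, exactly as the Amazon teams above use it.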
In fact, we have an indication that this dichotomy is something crypto clients are happy to code to: smart contracts. They are both distinguished in the API from “normal” (simple transfer) transactions and tend to denote their participants in some fashion.
It’s easy to see the similarity between smart contract calls in a decentralized blockchain and use of the DynamoDB transaction API between teams in a large “centralized” company like Amazon retail. Let’s assume this accounts for an order of magnitude; 2 more to go.
Managed Services and Cloud Optimization
One significant difference in the coding practices of a service like DynamoDB versus pretty much any cryptocurrency is that the former is highly optimized for running in the cloud.
In fact, you’d be hard pressed to locate a line of code in DynamoDB’s implementation that hasn’t been repeatedly scrutinized to see if there’s a way to wring more performance out of it by thinking hard about how and where it runs. Contrast this to crypto implementations, which practically make it a precept to assume the cloud doesn’t exist.
Instance selection, zonal placement, traffic routing, scaling, and workload distribution…most of the practical knowledge, operational hygiene, and design methodology learned and practiced over the last decade goes unused in crypto. It’s not hard to imagine that this accounts for the remaining gap.
Getting Schooled on Scalability
Are there design patterns we can glean from a successfully scaled distributed system like DynamoDB as we contemplate next-generation cryptocurrency blockchain architectures?
We can certainly “reverse engineer” some requirements by looking at how a commercially viable solution like Amazon’s Prime Day works today:
Application layer (client-provided) sharding is a hard requirement. This might take a more contract-centric form in a blockchain than in a NoSQL database’s API, but it’s still critical to involve the application in deciding which transactions require total ordering versus partial ordering versus no ordering. Partial ordering via client-provided grouping of transactions in particular is virtually certain to be part of any feasible solution.
Quorum voting may indeed be a bottleneck on performance, but Byzantine resistance per se is a red herring. Establishing a majority vote on data durability across mutually authenticated storage servers with full encoding on the wire isn’t much different from a Proof-of-Stake supermajority vote in a blockchain. So while it matters to “sweat the details” on getting this inner loop efficient, it can’t be the case that consensus per se fundamentally forces blockchains to be slow.
Routing matters. Routing alone won’t speed up a blockchain by 7 orders of magnitude, but smarter routing might shave off a factor of 10.
Infrastructure ignorance comes at a cost. Cryptocurrency developers largely ignore the fact that the cloud exists (certainly that managed services, the most modern incarnation of the cloud, exist). This is surprising, given that the vast majority of cryptocurrency nodes run in the cloud anyway, and it almost certainly accounts for at least some of the large differential in performance. In a system like DynamoDB you can count on the fact that every line of code has been optimized to run well in the cloud. Amazon retail is also a large user of serverless approaches in general, including DynamoDB, AWS Lambda, and other modern cloud services that wring performance and cost savings out of every transaction.
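The first requirement above, partial ordering via client-provided grouping, can be sketched as a sequencer that numbers transactions only within each client-declared group (a hypothetical single-process illustration; a real blockchain would do this across nodes):

```python
from collections import defaultdict
from itertools import count

class PartialOrderSequencer:
    """Assigns sequence numbers per client-declared group, so unrelated
    groups never contend for a single global total order."""
    def __init__(self):
        self._counters = defaultdict(count)  # one independent counter per group

    def sequence(self, group: str, tx: str) -> tuple:
        return (group, next(self._counters[group]), tx)

seq = PartialOrderSequencer()
print(seq.sequence("payments", "tx-a"))   # ('payments', 0, 'tx-a')
print(seq.sequence("logistics", "tx-b"))  # ('logistics', 0, 'tx-b')
print(seq.sequence("payments", "tx-c"))   # ('payments', 1, 'tx-c')
```

Transactions in different groups carry incomparable sequence numbers, which is exactly what lets them proceed in parallel.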
We’re not going to solve blockchain scaling in a single article 😀, but there’s a lot we can learn by taking a non-defensive look at the problem and comparing it to the best known distributed algorithms in use by commercial companies today.
Only by being willing to learn and adapt ideas from related areas and applications can blockchains and cryptocurrencies grow into the lofty expectations that have been set for them…and claim a meaningful place in scaling up to handle real-world business transactions.
A few months back, at SAP’s SAPPHIRE NOW event, we announced the availability of Azure Mv2 Virtual Machines (VMs) with up to 6 TB of memory for SAP HANA. We also reiterated our commitment to making Microsoft Azure the best cloud for SAP HANA. I’m glad to share that Azure Mv2 VMs with 12 TB of memory will become generally available and production certified in the coming weeks, in US West 2, US East, US East 2, Europe North, Europe West and Southeast Asia regions. In addition, over the last few months, we have expanded regional availability for M-series VMs, offering up to 4 TB, in Brazil, France, Germany, South Africa and Switzerland. Today, SAP HANA certified VMs are available in 34 Azure regions, enabling customers to seamlessly address global growth, run SAP applications closer to their customers and meet local regulatory needs.
Learn how you can leverage Azure Mv2 VMs for SAP HANA by watching this video.
Running mission critical SAP applications requires continuous monitoring to ensure system performance and availability. Today, we are launching private preview of Azure Monitor for SAP Solutions, an Azure Marketplace offering that monitors SAP HANA infrastructure through the Azure Portal. Customers can combine monitoring data from the Azure Monitor for SAP Solutions with existing Azure Monitor data and create a unified dashboard for all their Azure infrastructure telemetry. You can sign up by contacting your Microsoft account team.
We continue to co-innovate with SAP to help accelerate our customers’ digital transformation journey. At SAPPHIRE NOW, we announced several such co-innovations with SAP. First, we announced general availability of SAP Data Custodian, a governance, risk and compliance offering from SAP, which leverages Azure’s deep investments in security and compliance features such as Customer Lockbox.
Second, we announced general availability of Azure IoT integration with SAP Leonardo IoT, offering customers the ability to contextualize and enrich their IoT data with SAP business data to drive new business outcomes. Third, we shared that SAP’s Data Intelligence solution leverages Azure Cognitive Services Containers to offer intelligence services such as face, speech, and text recognition. Lastly, we announced a joint collaboration of the integration of Azure Active Directory with SAP Cloud Platform Identity Authentication Service (SAP IAS) for a seamless single sign on and user provisioning experience across SAP and non-SAP applications. Azure AD Integration with SAP IAS for seamless SSO is generally available and the user provisioning integration is now in public preview. Azure AD integration with SAP SuccessFactors for simplified user provisioning will become available soon.
Another place I am excited to deepen our partnership is in blockchain. SAP has long been an industry leader in solutions for supply chain, logistics, and life sciences. These industries are digitally transforming with the help of blockchain, which adds trust and transparency to these applications, and enables large consortiums to transact in a trusted manner. Today, I am excited to announce that SAP’s blockchain-integrated application portfolio will be able to connect to Azure blockchain service. This will enable our joint customers to bring the trust and transparency of blockchain to important business processes like material traceability, fraud prevention, and collaboration in life sciences.
Together with SAP, we are offering a trusted path to digital transformation with our best-in-class SAP-certified infrastructure, business process and application innovation services, and a seamless set of offerings. As a result, we are helping SAP customers across the globe, such as Carlsberg and CONA Services, migrate their large-scale, mission-critical SAP applications to Azure. Here are a few additional customers benefiting from migrating their SAP applications to Azure:
Al Jomaih and Shell Lubricating Oil Company: JOSLOC, the joint venture between Al Jomaih Holding and Shell Lubricating Oil Company, migrated their mission critical SAP ERP to Azure, offering them enhanced business continuity and reduced IT complexity and effort, while saving costs. Migrating SAP to Azure has enabled the joint venture to prepare for their upgrade to SAP S/4HANA in 2020.
TraXall France: TraXall France provides vehicle fleet management services for upwards of 40,000 managed vehicles. TraXall chose Microsoft Azure to run their SAP S/4HANA due to the simplified infrastructure management and business agility, and to meet compliance requirements such as GDPR.
Zuellig Pharma: Amid a five-year modernization initiative, Singapore-based Zuellig Pharma wanted to migrate their SAP solution from IBM DB2 to SAP HANA. Zuellig Pharma now runs its SAP ERP on HANA with 1 million daily transactions and 12 TB of production workloads at a 40 percent savings compared to their previous hosting provider.
If you’re attending SAP TechEd in Las Vegas, stop by at the Microsoft booth #601 or attend one of the Microsoft Azure sessions to learn more about these announcements and to see these product offerings in action.
The Marco Polo Network is now generally available on Azure to help both trade banks and corporations take advantage of the R3 Corda distributed ledger to better facilitate global trade in this ever-changing world. Regardless of what headlines may lead you to believe, international trade is the lifeblood of the modern global economy. Each year, hundreds of trillions of dollars in goods, assets, credit, and money change hands to keep the engine of global trade running. When a multinational corporation (acting as a seller or exporter) sends goods to its customers (acting as buyers or importers), the corporation often doesn’t receive payment for 30-90 days. This problem can be exacerbated by variables such as tariffs or new customs duties. To manage cash flow while waiting for payment, sellers often resort to taking out short-term loans from trade banks. But trade banks find it difficult to keep pace, having to rely on aging systems and siloed data that increase cost and process friction for all involved.
The disadvantages of disconnected trade
If global trade is an engine, financing is the fuel. But many trade banks rely on decades-old, paper-based processes that slow trade flow and add complexity, with antiquated financing tools that make onboarding expensive, reconciliation cumbersome, and the customer experience poor.
Furthermore, as global providers of trade and supply chain finance, trade banks must manage transactions between sellers and buyers while navigating increasingly complex regulatory processes that vary across national boundaries. Due to these global regulations, banks can be forced to use different financing platforms for each geolocation, leading to an overabundance of disconnected management tools.
Without a network to exchange data and a platform for viewing and managing transactions, banks have tremendous difficulty processing and executing their clients’ trade and supply chain financing transactions. At the same time, buyers and sellers can lack awareness of their own financial health due to paper-based trade contracts that aren’t immediately understood across the organization. Furthermore, many small and medium-sized import and export businesses are unable to scale due to staggering overhead costs.
A cloud-based network to streamline global trade
To improve efficiency in global trade finance, technology firms TradeIX and R3 partnered with leading banks to create the Marco Polo Network. Launched in 2017, Marco Polo provides a digital, distributed technology platform that allows trading parties to automate and streamline their trade and supply chain finance activities. Applications built and deployed on top of the platform allow banks and corporations to perform specific product and trade orchestrations. Trading parties (buyers, sellers, logistics providers, insurers, banks, and other key stakeholders) are able to exchange trade data and assets securely, in real time, and peer to peer using an open and distributed network powered by Corda. Importantly, the network and platform are open, meaning third parties can build, develop, and deploy their own solutions on the network and platform.
The Marco Polo Network, a platform built by TradeIX using 18 distinct Azure services and R3’s Corda distributed ledger technology, is revolutionizing trade finance. TradeIX packaged Corda and the Marco Polo Network application stack, or node, for deployment using Azure Container Instances and the Azure Container Registry. This gave participating banks and corporations the flexibility to pursue one of two different hosting options: run a Marco Polo node inside of the TradeIX Azure tenant, or pull down the application binaries as Docker images from an Azure Container Registry where they could then be deployed within the bank’s Azure tenant. The result is a transformational technology and distributed platform that enables the world’s leading trade banks and their corporate clients to exchange data in real time, resulting in streamlined, automated business activities that increase efficiency and transparency for receivables financing and cash flow management. TradeIX built these exciting new collaboration capabilities into the Marco Polo Network using an innovative, integrated application stack composed of Corda, Azure SQL Server, Cosmos DB, and Microsoft Dynamics 365 technologies.
One of the more novel features of the Marco Polo Network is the use of the R3 Corda distributed ledger to ensure that all of the counterparties involved in a financing request have a secure medium by which they can securely and seamlessly exchange trade data, contracts, and financial assets that are critical to completing a supply chain finance transaction. By hosting this platform in the cloud, TradeIX delivers an improved customer experience by providing a single infrastructure for banks and clients to manage their transactions—regardless of geolocation, currency, type of transaction, and industry. Because it’s an open, cloud-native network, Marco Polo Network members can share best practices, run pilot programs, and adjust the platform to meet their specific needs. However, this openness should not come at the expense of the security and compliance fundamentals required by the world’s leading banks and corporations. Microsoft and TradeIX implemented a host of Azure security controls such as Log Analytics, Security Center, Application Gateway, and DDOS Protection to ensure that the Marco Polo Network would be well-positioned to maintain the highest levels of trust, transparency, standards conformance for all members across the network.
In the near future, the Marco Polo Network will also provide corporate treasurers with an ERP-embedded Marco Polo App supported by Dynamics 365, that allows companies to manage their trade finance directly within their own ERP system. The TradeIX – Dynamics 365 interface enables corporations to submit requests for finance directly to their trade bank of choice where it will be automatically acknowledged, received, and processed by the bank’s Corda instance resulting in a free exchange of data without the need for manual reconciliation.
Reducing expenses, improving revenue
An important objective of the Marco Polo Network is to obtain all trade data necessary for a transaction as directly as possible, from the original data source. This also includes external third parties such as logistic providers. Imagine a scenario where two companies (a buyer and a seller) and their corresponding banks, exchange order and delivery data via the Marco Polo Network. Payment terms would then be secured by an irrevocable payment commitment, triggered through automated matching of trade data. This would then be followed by an automatic matching of trade data achieved with involvement of the executing logistics provider, which enters the relevant transport details directly into the network. The ability for the third-party logistics provider to automatically trigger a payment from buyer to supplier following goods delivery with data reconciliation flowing across multiple banks simultaneously demonstrates the real-world value of the Marco Polo Network.
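The automated-matching step in that scenario can be sketched as follows (the field names and matching rule are hypothetical illustrations, not the Marco Polo Network's actual data model):

```python
def matches(order: dict, delivery: dict) -> bool:
    """The delivery details entered by the logistics provider must agree
    with the buyer/seller order data on the fields that matter."""
    return all(order.get(f) == delivery.get(f)
               for f in ("order_id", "sku", "quantity"))

def settle(order: dict, delivery: dict):
    """On a successful match, emit the irrevocable payment commitment;
    otherwise leave the transaction pending."""
    if matches(order, delivery):
        return {"type": "payment_commitment",
                "order_id": order["order_id"],
                "amount": order["amount"],
                "irrevocable": True}
    return None

order = {"order_id": "PO-1", "sku": "OIL-5W30", "quantity": 100, "amount": 25_000}
delivery = {"order_id": "PO-1", "sku": "OIL-5W30", "quantity": 100}
commitment = settle(order, delivery)
```

Because every party sees the same matching outcome on the shared ledger, no one needs to reconcile the payment trigger after the fact.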
A growing network, built with business in mind
Because the Marco Polo Network is governed by member banks, the model promotes an atmosphere of collaboration across the global trade industry. This formalized governance framework has helped the Marco Polo Network onboard trade banks and corporations across Africa, Asia, Europe, the Middle East, as well as North and South America. Companies of all sizes will benefit from better visibility into trading relationships and easier access to financing options, beyond point to point relationships, to a global network of trading parties.
“I’m very pleased to see Microsoft’s Azure team is pushing the boundaries of banking and technology innovation with their partnership with the Marco Polo Network built by TradeIX. These 2 solutions coupled with Corda creates a very compelling and modern proposition for any smart business looking to take advantage of the benefits that distributed architecture offers.” – Andrew Speers, Director, Product and Innovation at NatWest and Board Director at the Corda Network Foundation.
“International trade is indeed the lifeblood of the economy, which is why R3 is so proud to be a part of the Marco Polo Network. Together, Corda and Microsoft Azure are enabling TradeIX’s mission to transform trade finance, by bringing much needed efficiencies to this market, which holds hidden treasure in the hunt for high yields. We are honored to be part of the ecosystem that will build trade finance solutions on blockchain, and are excited to see what’s next” – Ricardo Correia, Head of Partners at R3.
“It is exciting to be part of the growing ecosystem building trade finance solutions on blockchain. Microsoft is honoured to be providing our global scale cloud as a foundation to R3 and TradeIX to speed this solution to market,” – Michael Glaros, Azure Blockchain Engineering, Microsoft.
“One of the founding technology decisions that were made for the Marco Polo Network was to use the infrastructure provided by Microsoft Azure. We firmly believe that our partnership with Microsoft provides Marco Polo members with the best infrastructure and highest security and transparency standards combined with improved customer experience.” – Oliver Belin, CMO, TradeIX.
After first emerging as the basis for the Bitcoin protocol, blockchain has since gained momentum as a way to digitize business processes that extend beyond the boundaries of a single organization. While digital currencies use the shared ledger to track transactions and balances, enterprises are coming together to use the ledger in a different way. Smart contracts—codified versions of paper-based agreements—enable multiple organizations to agree on terms that must be met for a transaction to be considered valid, empowering automated verification and workflows on the blockchain.
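The idea of codifying an agreement as code whose transitions are valid only when the agreed terms are met can be sketched in plain Python. This is a conceptual illustration, not on-chain contract code, and the party names and states are made up for the example:

```python
# Conceptual sketch (plain Python, not on-chain code): a smart contract
# behaves like a state machine whose transitions are valid only when the
# agreed terms are met. Names and terms here are illustrative.

class EscrowContract:
    """Codifies a simple buyer/seller agreement: funds are released
    only after both parties confirm delivery."""

    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.state = "AWAITING_DELIVERY"
        self.confirmations = set()

    def confirm_delivery(self, party):
        if party not in (self.buyer, self.seller):
            raise PermissionError(f"{party} is not a party to this contract")
        self.confirmations.add(party)
        # The transaction becomes valid only once every party has agreed.
        if self.confirmations == {self.buyer, self.seller}:
            self.state = "FUNDS_RELEASED"
        return self.state

contract = EscrowContract("buyer-org", "seller-org", 1000)
contract.confirm_delivery("buyer-org")          # still AWAITING_DELIVERY
print(contract.confirm_delivery("seller-org"))  # FUNDS_RELEASED
```

On a real blockchain the same logic would be written in a contract language such as Solidity and verified by every node in the network, which is what removes the need for a trusted intermediary.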
These digitized business processes, governed by smart contracts and powered by the immutability of blockchain, are poised to deliver the scalable trust today’s enterprises need. One Microsoft partner, SIMBA Chain, has created an offering that reduces the effort and time to start creating solutions using blockchain technology.
The Azure platform offers a wealth of services for partners to enhance, extend, and build industry solutions. Here we describe how one Microsoft partner uses Azure to solve a unique problem.
Simplifying blockchain app development
SIMBA stands for SIMpler Blockchain Applications. SIMBA Chain is a cloud-based, Smart Contract as a Service (SCaaS) platform, enabling users with a variety of skill sets to build decentralized applications (dApps) and deploy to either iOS or Android.
The figure below shows the platform and the components (such as the Django web framework) used to communicate with a dApp using a pub/sub model. SIMBA Chain auto-generates the smart contract and API keys for deployment, and the app can be deployed to a number of backends for mobile apps (such as Android and iOS). Communication to participate in the blockchain occurs through an API generated from a smart contract.
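The pub/sub pattern described above can be sketched in a few lines of Python. This is an illustration of the general pattern only; SIMBA Chain's actual implementation, topic names, and payload shapes will differ:

```python
# A minimal pub/sub sketch of the pattern described above (illustrative
# only; SIMBA Chain's actual implementation and topic names will differ).
from collections import defaultdict

class EventBus:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, payload):
        # Deliver the payload to every handler registered for the topic.
        for handler in self.subscribers[topic]:
            handler(payload)

bus = EventBus()
received = []

# A mobile dApp backend subscribes to events emitted by the smart contract.
bus.subscribe("asset.transferred", received.append)

# The platform publishes a ledger event to all interested dApps.
bus.publish("asset.transferred", {"asset_id": "A-1", "to": "org-2"})
print(received)  # [{'asset_id': 'A-1', 'to': 'org-2'}]
```

Decoupling the dApp from the ledger through topics like this is what lets the platform swap in new blockchain protocols behind the same generated API.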
With this platform, anyone with a powerful idea can build a decentralized application. SIMBA Chain supports Ethereum and will add more blockchain protocols to its platform.
A time-saving technology
SIMBA Chain’s user-friendly interface greatly reduces the time and custom code required to build and deploy a blockchain-based application. Users can create and model a business application, define the assets along with the smart contract parameters, and in a few simple clicks the SIMBA platform generates an API that interfaces with the ledger. By reducing application development time, SIMBA enables faster prototyping, refinement, and deployment.
Recommended next steps
Go to the Azure Marketplace listing for SIMBA Chain and click Get It Now.
As digital transformation expands beyond the walls of one company and into processes shared across organizations, businesses are looking to blockchain as a way to share workflow data and logic.
This spring we introduced Azure Blockchain Service, a fully-managed blockchain service that simplifies the formation, management, and governance of consortium blockchain networks. With a few simple clicks, users can create and deploy a permissioned blockchain network and manage consortium membership using an intuitive interface in the Azure portal.
To help developers building applications on the service, we also introduced our Azure Blockchain development kit for Ethereum. Delivered via Visual Studio Code, the dev kit runs on all major operating systems, and brings together the best of Microsoft and open source blockchain tooling, including deep integration with leading OSS tools from Truffle. These integrations enable developers to create, compile, test, and manage smart contract code before deploying it to a managed network in Azure.
We’re constantly looking and listening to feedback for areas where we can lean in and help developers go further, faster. This week for TruffleCon, we’re releasing some exciting new features that make it easier than ever to build blockchain applications:
Interactive debugger: Debugging Ethereum smart contracts has so far been a challenging effort. While there are some great command-line tools (e.g., Truffle Debugger), these tools aren’t integrated into integrated development environments (IDEs) like Visual Studio Code. Native integration of the Truffle Debugger into Visual Studio Code brings all the standard debugging features developers have come to rely on (e.g., breakpoints, step in/over/out, call stacks, watch windows, and IntelliSense pop-ups), letting developers quickly identify, debug, and resolve issues.
Auto-generated prototype UI: The dev kit now generates a UI that is rendered and activated inside Visual Studio Code. This allows developers to interact with their deployed contracts directly in the IDE, without having to build other UI or custom software simply to test out basic functionality of their contracts. Having a simple graphical user interface (GUI) for exercising contracts inside the IDE, without writing code, is a huge productivity improvement.
With the addition of these new debugger capabilities, we are bringing all the major components of software development, including build, debug, test, and deploy, for Smart Contracts into the popular Visual Studio Code developer environment.
If you’re in Redmond, Washington this weekend, August 2-4, 2019, come by TruffleCon to meet the team or head to the Visual Studio Marketplace to try these new features today!
In a rapidly globalizing digital world, business processes touch multiple organizations and great sums are spent managing workflows that cross trust boundaries. As digital transformation expands beyond the walls of one company and into processes shared with suppliers, partners, and customers, the importance of trust grows with it. Microsoft’s goal is to help companies thrive in this new era of secure multi-party computation by delivering open, scalable platforms and services that any company, from game publishers and grain processors to payments ISVs and global shippers, can use to digitally transform the processes they share with others.
Azure Blockchain Service: The foundation for blockchain applications in the cloud
Azure Blockchain Service is a fully-managed blockchain service that simplifies the formation, management, and governance of consortium blockchain networks so businesses can focus on workflow logic and application development. Today, we’re excited to announce that the public preview is now available.
With a few simple clicks, users can create and deploy a permissioned blockchain network and manage consortium policies using an intuitive interface in the Azure portal. Built-in governance enables developers to add new members, set permissions, monitor network health and activity, and execute governed, private interactions through integrations with Azure Active Directory.
This week, we also announced an exciting partnership with J.P. Morgan to make Quorum the first ledger available in Azure Blockchain Service. Because it’s built on the popular Ethereum protocol, which has the world’s largest blockchain developer community, Quorum is a natural choice. It integrates with a rich set of open-source tools while also supporting confidential transactions, something our enterprise customers require. Quorum customers like Starbucks, Louis Vuitton, and our own Xbox Finance team can now use Azure Blockchain Service to quickly expand their networks with lower costs, shifting their focus from infrastructure management to application development and business logic.
“We are incredibly proud of the success Quorum has had over the last four years as organizations around the world use Quorum to solve complex business and societal problems. We are delighted to partner alongside Microsoft as we continue to strengthen Quorum and expand capabilities and services on the platform.”
— Umar Farooq, Global Head of Blockchain at J.P. Morgan
We’re excited to offer customers an enterprise-grade Ethereum stack with Quorum, and look forward to adding new capabilities to Azure Blockchain Service in the coming months, including digital token management, improved application integration, and support for R3’s Corda Enterprise.
An application-driven approach
The ledger is just the foundation for new applications. After configuring the underlying blockchain network with Azure Blockchain Service, you need to codify your business logic using smart contracts. Until now, this has been cumbersome, requiring multiple command-line tools with limited IDE integration. Today we are releasing an extension for VS Code to address these issues. This extension allows you to create and compile Ethereum smart contracts, deploy them to either the public chain or a consortium network in Azure Blockchain Service, and manage their code using Azure DevOps.
Once your network is created and smart contract state machines are deployed, you must build an application in order for consortium participants to share business logic and data represented by the smart contracts. A key challenge has been integrating these applications with smart contracts so they either respond to smart contract updates or execute smart contract transactions. This connects business processes managed in other systems such as databases, CRM, and ERP systems with the ledger. Our new Azure Blockchain Dev Kit makes this easier than ever with connectors and templates for Logic Apps and Flow as well as integrations with serverless tools like Azure Functions.
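The integration pattern described above, where an off-chain handler reacts to smart contract state changes and mirrors them into another system of record, can be sketched as follows. This is a conceptual illustration with a simulated event stream; the contract address, state names, and handler wiring are invented for the example, and in practice the dev kit's Logic Apps/Flow connectors or an Azure Functions trigger would deliver the events:

```python
# Illustrative sketch of the integration pattern: an off-chain handler
# reacts to smart contract state changes and mirrors them into another
# system of record (a dict stands in for a CRM/ERP here). Addresses and
# state names are hypothetical.

crm_records = {}

def on_contract_update(event):
    """Runs whenever the ledger reports a state change (in practice,
    delivered via a Logic Apps connector or an Azure Functions trigger)."""
    crm_records[event["contract_address"]] = event["new_state"]

# Simulated stream of contract update events from the ledger.
events = [
    {"contract_address": "0xabc", "new_state": "Created"},
    {"contract_address": "0xabc", "new_state": "InTransit"},
    {"contract_address": "0xabc", "new_state": "Completed"},
]
for e in events:
    on_contract_update(e)

print(crm_records["0xabc"])  # Completed
```

The same handler shape works in the other direction as well: a change in the CRM or ERP system can be translated into a transaction that advances the smart contract's state machine on the ledger.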
You can learn more about how to build your first network, code your smart contracts, and interact with the ledger in the latest episodes of the web series Block Talk.
Embracing open communities
Over the past year, we have been preparing our Confidential Consortium Framework (CCF) for public release. CCF uses trusted execution environments (TEEs) such as SGX and VSM to enable ledgers that integrate with it to execute confidential transactions with the throughput and latency of a centralized database. Confidentiality and high performance are key requirements of our enterprise customers. We’re excited to announce that we have finished the first version of CCF, integrated with Quorum, and have made the source code available on GitHub.
Microsoft believes that the best way to bring blockchain to our customers is by partnering with the diverse and talented open source communities that are driving blockchain innovation today. We began this journey in 2015, partnering with the growing communities around Ethereum, R3 Corda, and Hyperledger to make those technologies available in Azure. Instead of building our own ledger, or creating a ledger alternative, we have worked to make open source technology developers love and work better with Azure. All of the tooling released this week allows developers to work against both consortium networks in Azure Blockchain Service and with public Ethereum.
“Microsoft has embraced the open community of blockchain developers and has brought the best of their cloud development tooling to the developers building the next wave of decentralized applications. With Azure Blockchain Service and Ethereum integrations for tools like VS Code, Microsoft is demonstrating its commitment to open blockchain development.”
— Vitalik Buterin, co-founder of Ethereum
Learn more about Azure Blockchain Service and get started today:
The initiative brings together some of the most important blockchain platforms from the Ethereum ecosystem, Hyperledger and IBM, Intel, R3, and Digital Asset in a joint effort to establish a common taxonomy for tokens. Also joining are standards bodies like FINRA, enterprises like J.P. Morgan, Banco Santander, and ING, and companies pushing the boundaries of blockchain like ConsenSys, Clearmatics, Komgo, Web3 Labs, and others.
Over the past year, the Azure Blockchain engineering team has been working to understand the breadth of token use cases and found that a lack of industry standards was driving confusion amongst our enterprise customers and partners. We started building the Token Taxonomy Framework to help address this confusion, establish a baseline understanding, and chart a path forward for our customers and partners to begin exploring the use of tokens. We quickly realized that our efforts would be much more effective if we didn’t work in isolation, so we chose to contribute the framework and partner with our counterparts across the industry to expand the TTF and seed the industry with a common standard. As the Principal Architect for Azure Blockchain and an EEA Board Member, I will represent Microsoft in the release of the 1.0 framework and will act as the chair of the TTI, collaborating with all participants to ensure that the outcome establishes a foundation to rapidly accelerate the token economy.
A core principle driving this initiative is platform neutrality, which will ensure the standards we outline are agnostic to any company and empower the industry to innovate openly. This workgroup brings together a diverse set of thought leaders from across the blockchain community, including public cloud platforms, blockchain start-ups, and early adopters across industries. Each of us recognizes both the power of the token economy and the challenges facing businesses looking to innovate in this nascent space. We hope that our initial work seeding the Token Taxonomy Initiative will provide a starting point for the community to build upon in the coming months by providing:
A definition of tokens and their use cases across industries.
A common set of concepts and terms that can be used by business, technical, and regulatory participants so they can speak the same language.
A composition framework for defining and building tokens.
A Token Classification Hierarchy (TCH) that is simple to understand.
Tooling metadata, using the TTF syntax, to generate visual representations of classifications and modeling tools for viewing and creating token definitions mapped to the taxonomy, eventually linking to implementations for specific platforms.
A sandbox environment for legal and regulatory requirement discovery and input.
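The composition idea behind such a framework, defining a token as a base type plus a set of behaviors, can be sketched in a few lines of Python. The base types, behavior names, and composition syntax below are illustrative only, not the normative TTF vocabulary:

```python
# A sketch of composition-based token definition: a token is a base type
# ("fungible" or "non-fungible") combined with a set of behaviors.
# Names and syntax here are illustrative, not the normative TTF vocabulary.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class TokenDefinition:
    name: str
    base: str                      # e.g. "fungible" or "non-fungible"
    behaviors: frozenset = field(default_factory=frozenset)

    def formula(self):
        # A compact composition string for the definition.
        return f"{self.base}{{{','.join(sorted(self.behaviors))}}}"

loyalty_point = TokenDefinition(
    name="Loyalty Point",
    base="fungible",
    behaviors=frozenset({"transferable", "mintable", "divisible"}),
)
print(loyalty_point.formula())  # fungible{divisible,mintable,transferable}
```

Expressing a token as a reusable formula like this is what lets business, technical, and regulatory participants agree on what a token *is* before any platform-specific implementation is written.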
While not specific to the Ethereum family of technologies, this work does draw from the working group’s experience building with the Ethereum ecosystem. As chair of the TTI, I invite everyone to participate and learn about the taxonomy as it is rolled out in the coming months and look forward to the continued innovation in this space.