IBM releases data science and machine learning platform Cloud Private for Data

IBM is embracing artificial intelligence with the launch of IBM Cloud Private for Data. The platform consists of integrated data science, data engineering and app building services. According to IBM, it is designed to help organizations accelerate their AI journeys and increase productivity.


“Whether they are aware of it or not, every company is on a journey to AI as the ultimate driver of business transformation,” said Rob Thomas, general manager of IBM Analytics. “But for them to get there, they need to put in place an information architecture for collecting, managing and analyzing their data. With today’s announcements, we are planning to bring the AI destination closer and give access to powerful machine learning and data science technologies that can turn data into game-changing insight.”

The platform is powered by an in-memory database that is capable of ingesting and analyzing one million events per second, according to IBM’s internal testing. In addition, it is deployed on Kubernetes, allowing for a fully integrated development and data science environment, IBM explained. The company hopes it will provide organizations with access to data insights that were previously unobtainable, and allow users to exploit event-driven applications to gather and analyze data from IoT sensors, online commerce, mobile devices, and more.
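The event-driven pattern described above can be sketched in a few lines. This is a toy illustration only, using Python’s built-in sqlite3 in-memory database as a stand-in for IBM’s engine; the table and event names are invented for the example.

```python
import sqlite3

# Toy sketch of event-driven ingestion into an in-memory store.
# NOT IBM's engine -- sqlite3's ":memory:" mode stands in for an
# in-memory database that ingests and aggregates a stream of events.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (source TEXT, value REAL)")

# Simulated events from different sources (IoT sensors, commerce, mobile)
events = [("sensor", 21.5), ("mobile", 1.0), ("sensor", 22.1), ("commerce", 99.9)]
conn.executemany("INSERT INTO events VALUES (?, ?)", events)

# Analyze as events accumulate: per-source counts and averages
rows = conn.execute(
    "SELECT source, COUNT(*), AVG(value) FROM events GROUP BY source ORDER BY source"
).fetchall()
for source, count, avg in rows:
    print(source, count, round(avg, 2))
```

A production system would ingest continuously rather than in one batch, but the shape of the workload — append events, query aggregates in place — is the same.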

READ ALSO: IBM releases WebSphere Liberty code to open source


Cloud Private for Data includes capabilities from IBM’s Data Science Experience, Information Analyzer, Information Governance Catalogue, DataStage, Db2, and Db2 Warehouse. These capabilities will allow customers to gain insights from data stored in protected environments and make data-driven decisions. According to the company, the solution is meant to provide a data infrastructure layer for AI behind firewalls.

Going forward, IBM plans to have Cloud Private for Data run on any cloud and be available in industry-specific solutions for areas such as financial services, healthcare, and manufacturing.

As part of the launch, the company also announced the Data Science Elite Team, a no-charge consultancy team that will advise clients on machine learning adoption and assist them with their AI roadmaps.

IBM releases WebSphere Liberty code to open source

IBM on the 20th moved to GitHub the code that underlies WebSphere Liberty, its solution for development using Agile and DevOps methodologies; it will be available this week under the Eclipse Public License v1.

The Open Liberty project is working to create a new runtime for Java microservices that can be moved between different cloud environments, according to Ian Robinson, an IBM distinguished engineer and the chief architect of WebSphere. Open Liberty will be the basis of IBM’s continued development of its Liberty product – the codebase is the same – and will be fully supported in commercial WebSphere licenses.

The Open Liberty code on GitHub will give developers the components they need to create Java applications and microservices, using the Java EE foundation from WebSphere Liberty and the work of the Eclipse MicroProfile community. MicroProfile defines common APIs and infrastructure so that microservices applications can be created and deployed without vendor lock-in, Robinson wrote in his blog.

Along with being a founding member of the Eclipse MicroProfile project, IBM has collaborated with Google and Lyft on the Istio project to create an open service fabric for microservices integration and management, and would like to see MicroProfile integrate with Istio, Robinson said.

Further, IBM’s commitment to open source includes the contribution of IBM’s Java 9 VM to Eclipse as Eclipse OpenJ9, which – when combined with Open Liberty, Eclipse MicroProfile and Java EE at Eclipse – creates a fully open-licensed Java stack for building, testing, running and scaling Java applications.

“We hope Open Liberty will help more developers turn their ideas into full-fledged, enterprise ready apps,” Robinson wrote in his blog. “We also hope it will broaden the WebSphere family to include more ideas and innovations to benefit the broader Java community of developers at organizations big and small.”

Facebook, IBM, Microsoft lead advances in AI

The MIT-IBM Watson AI Lab is focused on fundamental artificial intelligence (AI) research with the goal of propelling scientific breakthroughs that unlock the potential of AI.

Artificial intelligence and machine learning are playing larger roles in software, from data consumption and analysis to test automation and user experience. These cognitive services will drive the next wave of technology innovation. And industry heavyweights Facebook, IBM and Microsoft are leading the charge with new investments for innovation.

IBM yesterday announced plans to create an AI research partnership with the Massachusetts Institute of Technology to unlock AI’s potential by advancing hardware, software and algorithms around deep learning, the company said in the announcement.

IBM will make a 10-year, $240 million commitment to the MIT-IBM Watson AI Lab, which will be located in Cambridge, Mass., where IBM has a research lab and where MIT’s campus is located. Dario Gil, IBM Research VP of AI, and Dean Anantha P. Chandrakasan of MIT’s School of Engineering, will co-chair the new lab. The project will draw from the expertise of more than 100 AI scientists and MIT professors and students.

“The field of artificial intelligence has experienced incredible growth and progress over the past decade. Yet today’s AI systems, as remarkable as they are, will require new innovations to tackle increasingly difficult real-world problems to improve our work and lives,” said Dr. John Kelly III, IBM senior vice president, Cognitive Solutions and Research, in a statement. “The extremely broad and deep technical capabilities and talent at MIT and IBM are unmatched, and will lead the field of AI for at least the next decade.”

Among the efforts the lab team will pursue are creating AI algorithms that can tackle more complex problems, understanding the physics of AI, applying AI to vertical industries, and delivering societal and economic benefits through AI.

Meanwhile, Microsoft yesterday announced the Open Neural Network Exchange in conjunction with Facebook. Microsoft’s Cognitive Toolkit, along with Caffe2 and PyTorch, will all support the open-source ONNX.


According to Microsoft’s announcement, the ONNX representation of neural networks will provide framework interoperability, allowing developers to use their preferred tools while moving between frameworks. ONNX also offers shared optimization, so organizations looking to improve the performance of their neural networks can do so across multiple frameworks at once by simply targeting the ONNX representation.

ONNX, the announcement explained, “provides a definition of an extensible computation graph model, as well as definitions of built-in operators and standard data types.” Initially, the project is focused on inferencing capabilities.
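The exchange-format idea behind ONNX can be illustrated with a toy sketch: two “frameworks” that represent a model differently both map to one shared graph, so a model exported from one runs in the other. This is a conceptual illustration only, with invented names — not the real ONNX API or its protobuf graph format.

```python
# Toy sketch of the interoperability idea behind ONNX (NOT the real API):
# a shared, framework-neutral graph lets a model move between tools.

# Shared representation: an ordered list of (op, constant) nodes.
def export_to_shared(framework_a_model):
    # "Framework A" stores its layers as dicts keyed by field name.
    return [(layer["op"], layer["const"]) for layer in framework_a_model]

def run_in_framework_b(shared_graph, x):
    # "Framework B" interprets the shared graph sequentially.
    ops = {"add": lambda a, b: a + b, "mul": lambda a, b: a * b}
    for op, const in shared_graph:
        x = ops[op](x, const)
    return x

model_a = [{"op": "mul", "const": 2}, {"op": "add", "const": 3}]
graph = export_to_shared(model_a)
print(run_in_framework_b(graph, 5))  # (5 * 2) + 3 = 13
```

The real ONNX graph carries typed tensors and a catalogue of standard operators rather than scalars and lambdas, but the contract is the same: any framework that can read or write the shared graph interoperates with every other one.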

ONNX code and documentation are available on GitHub.

Digital operations management company PagerDuty is using machine learning and advanced response automation to help businesses orchestrate the correct response to any situation. Among the new capabilities in PagerDuty’s platform are the ability to group related alerts to provide context, the ability to recognize similar incidents with the context of who dealt with the similar issue in the past and what steps were taken to resolve it, the ability to design automated response patterns, and more.

“Today’s dynamic digital business climate has exponentially increased both opportunity for growth and downside risks to mitigate. The latest Digital Operations Management capabilities announced [yesterday] – machine learning and automation – tackle the real-time, all-the-time demands of consumers and business, translating complex events and signals into actionable insights, and orchestrating teams across businesses in service or revenue and productivity,” said Jennifer Tejada, CEO of PagerDuty.

Lastly, Cloudera yesterday announced the acquisition of Fast Forward Labs, an applied research and advisory services company specializing in machine learning and applied AI.

Now known as Cloudera Fast Forward Labs, the company is focused on practical research into data science, and applying that research to broad business problems.

IBM says its X-Force Red security pen-testing brand is now offering connected car and IoT sweeps.

Predictably enough, the new penetration-testing services “will be delivered alongside the Watson IoT platform”, which is just cloud-based configuration and device management for Internet of Things networks, allied to everyone’s favourite buzzphrases du jour – machine learning and artificial intelligence.

Big Blue says X-Force “worked with more than a dozen automotive manufacturers and third-party automotive suppliers to build expertise and programmatic penetration testing and consulting services”, as well as connecting its pen-testing efforts to the Watson IoT platform.

It also has a facility in Germany dedicated to “working with automotive companies to help design connected cars”, according to eWeek.

The announcement comes as X-Force celebrates its first birthday. Last year we reported how Big Blue adopted the football manager approach to talent acquisition for its then-new unit, poaching everyone else’s pen-testing brainboxes.

IBM announced Wednesday that it’s opening four new IBM Cloud data centers across the globe, bringing its total data center footprint to 59 across 19 countries.

Two of the new data centers are in London, giving IBM a total of five data centers in the UK.

An additional data center is slated to launch later this year. These new facilities are part of a plan IBM announced last year to triple its cloud data center capacity in the UK.

The third new data center now open is in San Jose, California, marking its third facility in the region and 23rd data center in the US. Lastly, IBM opened a new data center in Sydney, giving it four across Australia.

The expansion comes a day after IBM discussed its second quarter financial results, which delivered revenues below expectations.

Investors are still waiting for IBM’s investments in “strategic imperatives” such as AI and cloud computing to pay off, which should help make up for its weakening legacy businesses. SVP and CFO Martin Schroeter said Tuesday that he expects revenues from “strategic imperatives” to accelerate in the second half of 2017, in part because of new cloud contracts.

IBM’s cloud revenue in Q2 came to $3.9 billion, up 17 percent year-over-year adjusting for currency. Over the last 12 months, its cloud revenue has totaled $15.1 billion.

Schroeter cited some of IBM’s recent wins in the cloud market, such as the 10-year agreement it signed with Lloyd’s Banking Group, one of IBM’s largest cloud agreements ever.

Amazon Web Services continues to dominate the public cloud infrastructure market, while other competitors like Google and Microsoft are making aggressive efforts to catch up. Schroeter said IBM’s cloud business will benefit from its sophisticated capabilities, such as Watson-powered services.

“Enterprises will move to cognitive on the cloud with someone they trust, who has leading tools and industry expertise and a data model and business model consistent with their goals,” he said.

“That is the IBM Cloud plus Watson.”

Some analysts, however, have questioned the value Watson can generate for shareholders, given factors like the need for consulting to implement Watson and growing competition from cloud competitors.

IBM is also stressing its focus on security in the cloud, highlighting that it is one of the first global cloud companies to adopt the EU’s Data Protection Code of Conduct for Cloud Service Providers.

Its cloud facilities “are designed to be state of the art in terms of privacy and security,” IBM boasts, promising to help European clients prepare for the implementation of the General Data Protection Regulation (GDPR).


Apple has announced plans to set up its first data center in China to comply with a new law requiring foreign companies to store Chinese users’ information in the country. The data center will be built in the southern province of Guizhou with data management firm Guizhou-Cloud Big Data Industry (GCBD) as part of a planned $1 billion investment in the province.

“The addition of this data centre will allow us to improve the speed and reliability of our products and services while also complying with newly passed regulations,” Apple said in a statement to Reuters.

Apple is the first foreign firm to announce amendments to its data storage for China after the new cybersecurity law came into effect. Other Western companies like Microsoft and IBM have run their own China-hosted cloud services for years and will need to comply with the new rules too.

Critics of the law say its strict data surveillance and storage requirements are overly vague, burdening the firms with excessive compliance risks and threatening proprietary data.

Apple was quick to point out that “No backdoors will be created into any of our systems.”

Apple has been far more profitable in China than most of its Western peers but it’s had to adapt to stronger government scrutiny. For instance, the company must undergo “security audits” on new models of the iPhone before gaining approval to sell them in China. It also saw its iBooks Store and iTunes Movies services shut down in China just six months after they were introduced.


In their latest effort to boost diversity in leadership, Intel, IBM and Pfizer have promised to invest $300m in women-led businesses over the next three years.

Of late, Intel has revealed some major spending in the advanced fields of autonomous vehicles and the internet of things (IoT), but its latest effort aims to boost the number of women in leadership positions.

Intel, IBM and Pfizer have jointly announced plans to source a total of $100m each through their supply chains from women-owned businesses and minority groups.

Announced on stage at the Global Citizen Festival in Hamburg, Germany, the trio’s promise was echoed by foundations from Norway, Belgium and the Netherlands agreeing to commit $172m towards the She Decides movement, which promotes fundamental human rights for women across the globe.

Intel’s chief diversity and inclusion officer, Barbara Whye, said this is part of the company’s wider diversity funding plans.

“Diversity and inclusion are critical underpinnings to our constantly evolving culture at Intel,” she said.

“They accelerate our ability to consistently innovate and drive the business forward. Supplier diversity adds tremendously to our competitive advantage while stimulating growth in a global marketplace.”

Back in 2015, Intel announced plans to increase spending with diverse suppliers to $1bn annually by 2020.

In other Intel news, the company revealed last month that it plans to play a big part in the next Tokyo Olympics, having partnered with the International Olympic Committee to bring IoT to one of the world’s biggest sporting events.

Running until the 2024 games in either LA or Paris, Intel’s deal will see a raft of new quirks brought to the Olympics, largely from a visual standpoint.

This will involve a VR presentation of the games, a drone light show and 360-degree replay technology.

IBM has announced that it is developing a brain-inspired AI supercomputing system. The system is modeled on the biological brain, with 64 million neurons and 16 billion synapses. The chief scientist of the development project has emphasized IBM’s progress over the last six years, and believes IBM will soon become a leading company in AI innovation.

Not too long ago we heard that scientists are close to reading human minds. Now, they say, an artificial human brain can be designed too.

IBM and the US Air Force Research Laboratory (AFRL) announced on June 23 that they are developing a brain-inspired supercomputing system. The design and pattern of the system are very similar to the real brain’s structure. Its sensory processing power is equivalent to 64 million neurons and 16 billion synapses, while the processor component will consume merely 10 watts of power.

The system will be powered by a 64-chip array of IBM TrueNorth Neurosynaptic processors. IBM believes TrueNorth’s design will be much more efficient than systems built on conventional chips. The TrueNorth system can convert data such as images and videos into symbols in real time.

The system will facilitate ‘data parallelism’ and ‘model parallelism.’ It is devised such that if one part stops working, the others will continue without disturbance. Though not entirely a brain, it is very similar.
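The two kinds of parallelism named above can be contrasted with a small sketch. This is a pure-Python toy with invented stage functions — neuromorphic hardware like TrueNorth works very differently — but the distinction between replicating a whole model versus splitting it into stages is the same.

```python
# Toy contrast of the two parallelism styles mentioned above.

def stage1(x):  # first "layer" of a tiny model
    return x * 2

def stage2(x):  # second "layer"
    return x + 1

def model(x):
    return stage2(stage1(x))

data = [1, 2, 3, 4]

# Data parallelism: the SAME full model is replicated; each replica
# processes a different slice of the input data.
left, right = data[:2], data[2:]
data_parallel = [model(x) for x in left] + [model(x) for x in right]

# Model parallelism: the model itself is SPLIT into stages that could
# live on different chips; each input flows through the pipeline.
model_parallel = [stage2(stage1(x)) for x in data]

print(data_parallel)   # [3, 5, 7, 9]
print(model_parallel)  # [3, 5, 7, 9]
```

Both strategies produce the same results; the difference is where the work lives — across copies of the model, or across pieces of it.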

Just as the right brain handles perception and the left handles symbol processing, AFRL is combining these two capabilities in a conventional computer system. “AFRL was the earliest adopter of TrueNorth for converting data into decisions,” said Daniel S. Goddard, director, Information Directorate, U.S. Air Force Research Lab.

These technical advances mean the neurosynaptic system will assist AFRL’s technology designs for the Air Force.

Dharmendra S. Modha, chief scientist of the research, said that the advances in IBM TrueNorth position the company to lead the AI hardware innovation industry. He also noted that IBM has increased the number of neurons per system from just 256 to 64 million over the last six years.

The TrueNorth Neurosynaptic System was initially developed under the Defense Advanced Research Projects Agency’s (DARPA) SyNAPSE program.

Google, IBM and Lyft are merging some of their learned best practices around microservices to create a new open-source project called Istio. Istio was developed to connect, manage and secure microservices. The goal of the project is to tackle challenges around resilience, visibility, and security.

Istio is a Layer 7 traffic monitoring and control network designed to work with Kubernetes everywhere, on-premises or in the cloud. Today, developers can manually enable the alpha release of Istio on Google Container Engine. It takes a single command to install Istio on any Kubernetes cluster, which will then create a service mesh that, according to Varun Talwar, product manager at Google, is a layer between the services and the network.

The service mesh lets developers delegate a lot of problems around visibility and security, according to Talwar. It also gives developers and teams traffic encryption, and automatic load balancing for HTTP, gRPC, and TCP traffic. With the service mesh, teams can provide centralized management regardless of the scale and velocity of their applications.
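One of the mesh jobs described above — spreading traffic across a service’s instances — can be sketched in a few lines. Istio’s sidecar proxies do this (and encryption, retries, telemetry, and more); the class and instance names below are invented for the illustration.

```python
import itertools

# Minimal sketch of one service-mesh responsibility: a proxy that
# round-robins requests across a service's instances. Istio's sidecars
# handle this transparently; this toy class is illustrative only.
class RoundRobinProxy:
    def __init__(self, instances):
        self._cycle = itertools.cycle(instances)

    def route(self, request):
        # Pick the next instance in rotation and forward the request.
        instance = next(self._cycle)
        return f"{instance} handled {request}"

proxy = RoundRobinProxy(["svc-a-pod-1", "svc-a-pod-2"])
for i in range(4):
    print(proxy.route(f"req-{i}"))
```

The point of the mesh is that application code never writes this logic itself: the proxy sits beside each service, so load balancing policy can change without redeploying the services.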

Talwar said that since Google has already seen many companies rolling out their microservices on top of Kubernetes, the team chose to make Istio work with Kubernetes first; support for other environments is to come in the near future.

“I think the reality is microservices are here to stay,” said Talwar. “People of course think it’s a great idea for agility, but if you actually want to run and operate and manage [microservices] at scale with large teams, there is a gap. And a framework like [Istio] can take care of hard concepts around resilience, visibility and security.”

Istio is the latest open source effort from Google, and overall, the company has been working on best practices around security, visibility, and management of microservices for 15 years, according to Talwar. And while Istio is just one part of the solution to make microservices easier to manage and build, Google has also been working with the community to contribute to Open Service Broker, a platform designed to simplify service delivery and consumption. According to Google, the services powered by Istio will be able to seamlessly participate in the Service Broker ecosystem.

Looking ahead, Google also wants to bring Istio capabilities to Cloud Endpoints and Apigee Edge. Once Istio becomes production-ready, the goal is to provide deeper integration with the rest of Google Cloud.

IBM announced its new Database-as-a-Service (DBaaS) toolkit on IBM Power Systems, designed to deliver speed, control and efficiency to enterprise developers and IT teams. IBM’s latest solution is focused on making popular open-source databases like MongoDB, MySQL, PostgreSQL, and others available for developers and IT teams to easily provision.

According to IBM global power growth solutions leader Chuck Bryan, application developers want to be able to develop new applications with open databases like MongoDB because it supports a variety of data sources. Developers that are writing new applications also want to be able to spin up database images quickly, and Bryan said they can do this with IBM’s new toolkit. It runs on IBM’s OpenPOWER LC servers, and it delivers a performance advantage over x86 for EDB, PostgreSQL and MongoDB.

“For the IT side of the organization, we brought all of this together to a turnkey solution so IT can get the infrastructure for deploying that DBaaS up and running very quickly. Normally it might take them weeks to months to get that environment and infrastructure put in place, and this will do it in a few days,” said Bryan.

Bryan said that today, organizations have the option of going to public cloud environments to get their open-source databases, but now they want to be able to do this inside their enterprise organizations, where they might be developing enterprise-class applications.

These applications, however, might have security issues, or data compliance issues due to the nature of their organization, and IBM’s new platform allows these organizations to have this DBaaS experience in their public cloud and on-premises data centers, said Bryan.

From an infrastructure perspective, Bryan said this release provides the server, storage, and networking infrastructure, and the platform includes an elastic cloud infrastructure for on-premises, private cloud delivery of DBaaS.

The open platform for DBaaS on IBM Power Systems also includes a self-service portal for end users and developers to deploy their choice of open-source databases like MongoDB, PostgreSQL, MySQL, MariaDB, Redis, Neo4j and Apache Cassandra. There is also a disk image builder tool for clients to build and deploy their own custom databases to the database image library.