There's a data expert making a name for himself in the corporate world today, and he's attracting a lot of attention. He's a lightning-fast learner, he speaks eight languages and he's considered an expert in multiple fields. He's got an exemplary work ethic, is a speed reader and finds insights no one else can. On a personal note, he's a mean chef and even offers good dating advice.
The name of this new paragon? Watson. IBM Watson.
Named after IBM's first CEO, Watson was born back in 2007 as part of an effort by IBM Research to develop a question-answering system that could compete on the American quiz show "Jeopardy." Since trouncing its human opponents on the show in 2011, it has expanded considerably. What started as a system focused on a single core capability -- answering questions posed by humans in natural language -- now includes dozens of services spanning language, speech, vision and data analysis.
What goes into making a computer understand the world through senses, learning and experience, as IBM says Watson does? First and foremost, tons and tons of data.
To build a body of knowledge for Watson to work with on Jeopardy, researchers put together 200 million pages of content, both structured and unstructured, including dictionaries and encyclopedias. When asked a question, Watson initially analyzes it using more than 100 algorithms, identifying any names, dates, geographic locations or other entities. It also examines the phrase structure and the grammar of the question to better gauge what's being asked. In all, it uses millions of logic rules to determine the best answers.
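As a loose illustration of that entity-identification step -- not IBM's actual algorithms -- here is a minimal sketch using the open source spaCy library; the model name and sample question are placeholders chosen for this example.

```python
# Toy illustration of named-entity recognition on a quiz-style question.
# This uses the open source spaCy library, not Watson's own pipeline.
# Assumes: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")

question = ("This British prime minister won the Nobel Prize in Literature "
            "in 1953 for his historical writing.")

doc = nlp(question)

# List the names, dates, places and other entities the model recognizes --
# roughly the kind of first-pass question analysis described above.
for ent in doc.ents:
    print(f"{ent.text!r:<45} {ent.label_}")
```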
How did IBM's Watson get to where it is today? Here are some key events that happened along the way.
May 1997: Deep Blue conquers chess
IBM's Deep Blue computer beats world chess champion Garry Kasparov in a six-game match that lasts several days and receives massive media coverage around the world. It also inspires researchers at IBM to undertake an even bigger challenge: build a computer that could beat the champions at Jeopardy.
February 2011: Victorious at Jeopardy
Watson competes on Jeopardy and defeats the TV quiz show's two biggest all-time champions. It wins US$1 million; IBM donates the full amount to charity.
IBM may have originally built Watson to win at Jeopardy, but it saw potential applications in healthcare early on. Eventually, it formed a dedicated business unit focused squarely on making those applications happen.
As far back as 2012, Memorial Sloan-Kettering Cancer Center and IBM teamed up to develop a Watson-based system that could help doctors create individualized cancer treatment recommendations for their patients.
The following year, IBM, Memorial Sloan-Kettering and WellPoint introduced products based on Watson. A project with Cleveland Clinic, meanwhile, focused on developing a new tool to help physicians and medical students learn how to make better decisions more quickly.
The next release of OpenStack made its debut on Thursday with a raft of new features for better scalability and resiliency.

Architectural and functional barriers can make it difficult for companies to scale their clouds up or down across platforms and geographies, but OpenStack's 14th release -- dubbed Newton -- does away with many of those limitations. The open source cloud-building software now includes improved scaling capabilities in its Nova, Horizon and Swift components, its makers say.

Other improvements bolster the horizontal scale-out of Nova compute environments, enable convergence by default in the Heat orchestration service, and add multi-tenancy capabilities to Ironic.
The data scientist role was thrust into the limelight early this year when it was named 2016's "hottest job," and there's been considerable interest in the position ever since. Just recently, the White House singled data scientists out with a special appeal for help.
Those in the job can expect to earn a median base salary of roughly $116,840 -- if they have what it takes. But what is it like to be a data scientist? Read on to hear what three people currently on the front lines had to say.
Aug. 25 may be Linux's official birthday, but Oct. 5 is in many ways the day it began to make a real mark on the world. That's when Linux creator Linus Torvalds officially released the first Linux kernel into the wild.

"As I mentioned a month(?) ago, I'm working on a free version of a minix-lookalike for AT-386 computers," Torvalds wrote in a newsgroup post on Oct. 5, 1991. "It has finally reached the stage where it's even usable (though may not be depending on what you want), and I am willing to put out the sources for wider distribution."
Achieving balance between work and home life is an ongoing challenge for professionals across industries, but it turns out the IT world is doing pretty well in helping to make it happen.
It's been nearly two years since President Obama created the U.S. chief data scientist role, and the man currently in the job had an urgent message Thursday for attendees at Strata+Hadoop World: We need you.

"We are at the first step in making data work for every American," said DJ Patil in a keynote speech at the show. "It's only going to make a difference when people like you step up and show that it's not just feasible but scalable."

As chief data scientist in the White House Office of Science and Technology Policy, Patil's mission is to "responsibly unleash the power of data to benefit all Americans," he said, with an emphasis on the word "responsibly" and a focus on inclusion.
Hard on the heels of the discovery of the largest known data breach in history, Cloudera and Intel on Wednesday announced that they've donated a new open source project to the Apache Software Foundation with a focus on using big data analytics and machine learning for cybersecurity.

Originally created by Intel and launched as the Open Network Insight (ONI) project in February, the effort is now called Apache Spot and has been accepted into the ASF Incubator.

"The idea is, let's create a common data model that any application developer can take advantage of to bring new analytic capabilities to bear on cybersecurity problems," Mike Olson, Cloudera co-founder and chief strategy officer, told an audience at the Strata+Hadoop World show in New York. "This is a big deal, and could have a huge impact around the world."
Big data is in many ways still a wild frontier, requiring wily smarts and road-tested persistence on the part of those hoping to find insight in all the petabytes. On Tuesday, IBM announced a new platform it hopes will make things easier.

Dubbed Project DataWorks, the new cloud-based platform is the first to integrate all types of data and bring AI to the table for analytics, IBM said.

Project DataWorks is available on IBM's Bluemix cloud platform and aims to foster collaboration among the many types of people who need to work with data. Tapping technologies including Apache Spark, IBM Watson Analytics and the IBM Data Science Experience, which launched in June, the new offering is designed to give users self-service access to data and models while ensuring governance and rapid-iteration capabilities.
Robots' potential to take over the world is a commonly expressed fear in the world of AI, but at least one Turing Award winner doesn't see it happening that way. Rather than replacing mankind, technology will create a new kind of human that will coexist with its predecessors while taking advantage of new tech-enabled tools.

So argued Raj Reddy, founding director of Carnegie Mellon University's Robotics Institute and 1994 winner of the Turing Award, at the Heidelberg Laureate Forum in Germany last week.
Technology has considerable potential to make the world better, but those benefits are far from guaranteed. Plenty of downsides can pop up along the way, and some of them have Turing Award winners especially worried.

1. The internet echo chamber

"Technology by itself is not evil, but people can use it for bad things," Barbara Liskov, an Institute Professor at MIT, told an audience of journalists Thursday at the Heidelberg Laureate Forum in Germany. "I do worry a lot about what's going on."

The ability to selectively filter out news and opinions that don't agree with one's own viewpoint is one of Liskov's top concerns.
Vint Cerf is considered a father of the internet, but that doesn't mean there aren't things he would do differently if given a fresh chance to create it all over again.

"If I could have justified it, putting in a 128-bit address space would have been nice so we wouldn't have to go through this painful, 20-year process of going from IPv4 to IPv6," Cerf told an audience of journalists Thursday during a press conference at the Heidelberg Laureate Forum in Germany.

IPv4, the first publicly used version of the Internet Protocol, included an addressing system that used 32-bit numerical identifiers. It soon became apparent that it would lead to an exhaustion of addresses, however, spurring the creation of IPv6 as a replacement. Roughly a year ago, North America officially ran out of new addresses based on IPv4.
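For a rough sense of the gap Cerf is describing: a 32-bit address space tops out around 4.3 billion identifiers, while 128 bits yields roughly 3.4 x 10^38. A minimal Python sketch of that back-of-the-envelope arithmetic follows; the raw counts ignore reserved and special-use ranges, so real-world capacity is somewhat smaller.

```python
# Back-of-the-envelope comparison of IPv4 and IPv6 address space sizes.
# These are raw counts; reserved and private ranges reduce usable space.
ipv4_addresses = 2 ** 32    # ~4.3 billion
ipv6_addresses = 2 ** 128   # ~3.4 x 10**38

print(f"IPv4: {ipv4_addresses:,} addresses")
print(f"IPv6: {ipv6_addresses:,} addresses")
print(f"IPv6 space is {ipv6_addresses // ipv4_addresses:,} times larger")
```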