In a world where there’s a smartphone app for everything, one company — Amazon.com Inc. — has long been the host for an outsized share of online software and computing services.
Microsoft CEO Satya Nadella [pictured above] wants to change that.
Nadella has poured billions of dollars into building new data centers around the world, hoping to position Microsoft as the leading alternative to Amazon in selling online computing power — housed in remote centers or “clouds” — to internet startups and big corporations, as well as consumers.
As evidence the investment is paying off, Microsoft Corp. reported Tuesday that its Azure cloud-computing business more than doubled in sales last quarter, compared with a year earlier. That growth, combined with increases in revenue from Windows software licenses and other key segments, helped offset a big decline in revenue from the Nokia smartphone business that Microsoft largely shut down last
To read more see the full post at: As PCs Decline, Microsoft Betting Its Future on the Cloud - CIO Today
As part of an expanded collaboration between IBM and CSC, Big Blue will offer its mainframe capabilities and cloud infrastructure to CSC and its clients. IBM will offer its Cloud Managed Services for z Systems to CSC customers making the transition to the cloud. The agreement includes mainframe hardware, software, monitoring, and governance support associated with the service, known as IBM Cloud for z.
For CSC, the agreement means reduced operational costs for clients transitioning to the cloud while at the same time improving its cloud IT infrastructure offerings. The collaboration comes on the heels of CSC’s announcement that it will be merging with Hewlett Packard Enterprise’s Enterprise Service Division, creating a new, pure-play IT services company.
Playing to Their Strengths
“Many clients are looking for ways to make historically fixed cost more variable, to migrate from legacy platforms to modernized applications, and to a cloud-enabled infrastructure,”
To read more see the full post at: IBM and CSC Take Cloud Collaboration to the Mainframe - CIO Today
Nearly two dozen rising fifth- through 11th-graders gathered to build and program robots during the VEX Robotics Summer Camp at Orangeburg-Calhoun Technical College.
“I like programming,” camper Marcus Elmore said. “I’ve made a few games, and I make and sell maps that people can use for their games.”
Elmore has attended a robotics camp before, but that was different because the robots were already assembled and the campers just programmed them. This time, he and fellow campers were given boxes of parts, split into teams and required to assemble and program their creations.
Elmore is looking forward to the camp again next year — he’s even invited a friend. “I want to be a robotics engineer,” he said. “I’ve learned a lot.”
Stephanie Phillips, Project Lead the Way program coordinator and instructor at OCtech, said the campers have done a great
To read more see the full post at: Campers have fun building, programming robots at OCtech - TheTandD.com
KUKA ATX is a new robotics research and development center in Austin focused on industrial robots for factory production.
The German-based company established its newest center of excellence at 11921 North Mopac Expressway.
The center will focus on web, cloud and mobile software platforms to augment KUKA robotics, customer experience and overall productivity.
The Austin location has a team of software engineers, roboticists and product marketing experts. It also serves as “the primary advisers of technical and strategic oversight for KUKA’s U.S.-centric investments,” according to a news release.
KUKA ATX will be holding an event on August 24th to celebrate its new center. It will include tours, demonstrations and a presentation on “The Past, Present and Future of Robotics” by Robin Murphy, director of the Center for Robot-Assisted Search and Rescue (CRASAR) and the Center for Emergency Informatics. It will also include a panel discussion
To read more see the full post at: KUKA Opens Robotics Center in Austin - Silicon Hills News
As enterprises begin the serious work of moving beyond trial phases to truly integrate Internet of Things traffic, three key factors are coming to the fore. As I described in DZone’s Guide to The Internet of Things Volume III, the first is that the future will bring a huge increase in the number of sensors, actuators, and devices broadly categorized as the Internet of Things (IoT). The amount of data generated by these new endpoints is staggering. But more importantly, a very large percentage of these devices will be too small, too cheap, too dumb, and too copious to run the hegemonic IPv6 protocol. But somehow, this traffic must still be incorporated into enterprises’ networks.
Simultaneously, these same enterprises are increasingly moving their main networking and computing tasks to cloud network architectures. In order to maintain or increase control of these resources, many enterprises are rolling out variations of Software
To read more see the full post at: The Abstracted Network for Enterprises and the Internet of Things (Part 1) - DZone News
Verizon Wireless is sending out unusual emails to some of its most voracious users: Please stop. The company confirmed in a statement Friday that it is reaching out to a limited number of customers who use an “extraordinary” amount of data, informing them that they must switch to a tiered plan by Aug. 31 or their accounts will be suspended. If that happens, customers will have 50 days to reactivate their accounts under a tiered plan. If they don’t, their accounts will be terminated. Verizon is taking the action “because our network is a shared resource, and we need to ensure all customers have a great mobile experience with Verizon,” the statement said. The decision seems to be aimed at customers who have held on to an unlimited data plan that the telecom giant no longer offers and are using that plan to the fullest. According to Verizon, not many people are affected
To read more see the full post at: Verizon to big data users: Please stop - The Columbian
Wiseguyreports.Com Adds “Robotics in Healthcare -Market Demand, Growth, Opportunities and analysis of Top Key Player Forecast to 2020” To Its Research Database.
The China Robotics in Healthcare Industry 2016 Market Research Report is a professional and in-depth study on the current state of the Robotics in Healthcare industry.
The report provides a basic overview of the industry, including definitions, classifications, applications and industry chain structure. The Robotics in Healthcare market analysis covers the Chinese market, including development trends, competitive landscape analysis, and the development status of key regions.
Development policies and plans are discussed, and manufacturing processes and bill-of-materials cost structures are analyzed. The report also states import/export consumption, supply and demand figures, costs, prices, revenue and gross margins.
The report focuses on China major leading industry players providing information such as company profiles, product picture
To read more see the full post at: Robotics in Healthcare Industry 2016 Market Research Report - Medgadget (blog)
Judea Pearl and Elias Bareinboim
Bareinboim and Pearl discovered how to estimate the effect of one variable, X, on another, Y, when data come from disparate sources that differ in another variable, Z.
As the field of “big data” has emerged as a tool for solving all sorts of scientific and societal questions, one of the main challenges that remains is whether, and how, multiple sets of data from various sources could be combined to determine cause-and-effect relationships in new and untested situations. Now, computer scientists from UCLA and Purdue University have devised a theoretical solution to that problem.
Their research, which was published this month in the Proceedings of the National Academy of Sciences, could help improve scientists’ ability to understand health care, economics, the environment and other areas of study, and to glean much more pertinent insight from data.
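The kind of combination described above can be illustrated with the transport formula from Bareinboim and Pearl's transportability work: the target population's causal effect is obtained by reweighting the source population's z-specific experimental results by the target's own distribution of Z, i.e. P*(y | do(x)) = Σz P(y | do(x), z) P*(z). A minimal numeric sketch, assuming Z satisfies the conditions for the formula to apply; all probabilities here are invented for illustration, not taken from the study:

```python
# Transport formula sketch: estimate the effect of X on Y in a *target*
# population by combining (a) z-specific experimental results measured in a
# *source* population with (b) the target's own distribution of Z.
# All numbers are illustrative.

# P(Y=1 | do(X=1), Z=z), measured experimentally in the source domain
p_y_do_x_given_z = {"z0": 0.30, "z1": 0.70}

# P*(Z=z), observed (no experiment needed) in the target domain
p_z_target = {"z0": 0.80, "z1": 0.20}

# P*(Y=1 | do(X=1)) = sum over z of P(Y=1 | do(X=1), z) * P*(z)
effect_in_target = sum(
    p_y_do_x_given_z[z] * p_z_target[z] for z in p_z_target
)
print(round(effect_in_target, 3))  # 0.30*0.80 + 0.70*0.20 = 0.38
```

Note that the experiment is never rerun in the target population: only its observational distribution of Z is needed, which is what makes fusing disparate data sources possible.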
The study’s authors are Judea Pearl, a distinguished
To read more see the full post at: Solving big data's 'fusion' problem - UCLA Newsroom
Written by Pratik Kanjilal | Published: July 23, 2016, 3:49 am. Anything that is logically connected to anything else is generating data, and the more cash-sparse an economy is, the richer these connections are.
Big Data: Does Size Matter?
Author: Timandra Harkness
Publisher: Bloomsbury Sigma
Small Data: The Tiny Clues which Uncover Huge Trends
Author: Martin Lindstrom
Publisher: Hachette India
The short answer to the question posed on the cover of comedian and math proselytiser Timandra Harkness’ book is: of course, size doesn’t matter. It never matters, except in the King Kong and Godzilla movies. In all other things, it’s the approach which matters. Big Data approaches are defined by vastly distributed and parallelised storage and processing strategies. The size of the data set is secondary, but things really begin to buzz when volumes are overclocked.
Harkness, who has a delightfully light touch, points out that the volume of data out there – which
To read more see the full post at: The Volume Game: Book review of Big Data: Does Size Matter? and Small Data - The Indian Express
Moving large datasets around HPC networks such as those of XSEDE is often challenging. While Internet2, the most commonly used backbone, is fast at 100 Gbps, local traffic to campuses often slows to 10 Gbps. At this week’s XSEDE meeting, the DANCES (Developing Applications with Networking Capabilities via End-to-End SDN) project leaders reported the first successful testing of the needed hardware components, as well as the results of a vendor bake-off for switches.
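A quick back-of-the-envelope calculation shows why the slower campus "last mile" dominates these transfers; the 100 TB dataset size below is an invented example, while the link rates come from the excerpt above:

```python
# Back-of-the-envelope: time to move a dataset at backbone vs. campus rates.
# The 100 TB dataset size is an invented example for illustration.

dataset_bits = 100e12 * 8      # 100 TB expressed in bits
backbone_bps = 100e9           # Internet2 backbone: 100 Gbps
campus_bps = 10e9              # typical campus link: 10 Gbps

hours_backbone = dataset_bits / backbone_bps / 3600   # ~2.2 hours
hours_campus = dataset_bits / campus_bps / 3600       # ~22.2 hours
print(round(hours_backbone, 1), round(hours_campus, 1))
```

A transfer that would take a little over two hours end-to-end at backbone speed stretches to nearly a full day at the campus rate, which is why scheduling bandwidth on the end-to-end path matters.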
The NSF-funded DANCES[i] project is intended to investigate and develop the ability to add network bandwidth scheduling, via software-defined networking (SDN) programmability, to selected cyberinfrastructure services and applications, building a software/hardware infrastructure that speeds and smooths high-speed traffic. The recent test involved the transfer of “Big Data” between two XSEDE sites, the Pittsburgh Supercomputing Center (PSC) and the National Institute for Computational Sciences (NICS) at the University of Tennessee.
“If you think about [high-performance computing] users and
To read more see the full post at: DANCES with Big Data: Progress Update from XSEDE Meeting - HPCwire