Work in Transition
It’s likely that work done by humans will increasingly involve innovative thinking, flexibility, creativity, and social skills, the things machines don’t do well. In a recent study on automation from the University of Oxford, researchers tried to quantify how likely jobs are to be computerized by evaluating how much creativity, social intelligence, and dexterity they involve. Choreographers, elementary school teachers, and psychiatric social workers are probably safe, according to that analysis, while telemarketers and tax preparers are more likely to be replaced. Most professions won’t go the way of the telemarketer, but the work involved is likely to migrate toward the tasks humans are uniquely skilled at, with automation taking over tasks that are rules-based and predictable.
Honda Using Experimental New ASIMO for Disaster Response Research
The robot was never intended to be a disaster mitigation robot; it was designed to work in offices, specifically the kind of offices that have not experienced an earthquake, explosion, alien invasion, sharknado, or other messy event. Honda is clearly aware of ASIMO's limitations in tackling these kinds of situations, and that's probably why (as we reported two years ago) the company has been developing a new version of ASIMO that is specifically designed for disasters. At the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) this week, Honda engineers presented a pair of papers on research they're doing with disaster-response humanoid robots.
Big Data Really Freaks This Guy Out
Ceglowski, who comes across a bit like a paranoid Ray Romano, also has an issue with the validity of the findings from big data. “There’s a con going on here,” he said. “On the data side they say, ‘Hey just collect everything. Collect all the data and we have these magical algorithms that will find everything in it for you.’ But on the algorithm side, where I am, they tell us ‘Throw any code you have at it–we have this awesome training data and we have enough of it that you’re sure to surface something interesting.'” The problem, Ceglowski said, is that any big data analysis that involves people has a built-in self-destruct mechanism. “Human beings always ruin everything,” he said.
Microservices: Simple servers, complex security
As Garrett explained, "The attack surface of a microservices app can be much greater [than a traditional monolithic application]." With older apps, "the attack surface is very linear -- traffic hits the load balancer, then the Web (presentation tier), and then the application and data tiers." But with microservices, Garrett noted, the flow is entirely different: "It's generally necessary to expose a large number of different services so that external applications can address them directly, leading to a much greater attack surface." "If you break up your application into smaller services," said Kelsey Hightower, product manager and chief advocate for CoreOS, "you'll need a more robust authentication/authorization solution between each service."
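To make Hightower's point concrete, here is a minimal sketch of service-to-service authentication using a signed, expiring token shared between services. This is an illustrative toy, not anything from the article: the shared secret, service names, and token format are all assumptions, and a real deployment would use a dedicated solution (mutual TLS, OAuth2, or similar) rather than a hand-rolled scheme.

```python
import base64
import hashlib
import hmac
import json
import math
import time

# Hypothetical shared secret; real systems would use per-service keys
# distributed by a secrets manager, not a hard-coded constant.
SECRET = b"demo-shared-secret"

def issue_token(service_name: str, ttl_seconds: int = 60) -> str:
    """Issue a signed token that a calling service presents to a callee."""
    claims = {"svc": service_name, "exp": time.time() + ttl_seconds}
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode()
    sig = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return payload + "." + sig

def verify_token(token: str):
    """Return the calling service's name if the token is valid, else None."""
    payload, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or forged token
    claims = json.loads(base64.urlsafe_b64decode(payload))
    if claims["exp"] < time.time():
        return None  # expired token
    return claims["svc"]
```

With a monolith, one perimeter check covers every internal call; here every service must run `verify_token` (or its equivalent) on every request, which is exactly the added surface Garrett describes.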
Emerging: DataOps and three tips for getting there
"DataOps is a data management method that emphasizes communication, collaboration, integration, automation and measurement of cooperation between data engineers, data scientists and other data professionals." As with any new approach, the pioneers haven't sorted out the language just yet: While Palmer refers to it as a "data management method," Bergh calls it an "analytic development method" that should be overseen by a chief data officer or a chief analytics officer. (The Bergh team refers to DataOps as AnalyticOps.) In either case, the ultimate goal is to accelerate analytics. And, regardless of how businesses decide to practice DataOps, successful programs will require IT expertise in the form of data integration, data quality, data security and data governance, according to Palmer and Bergh.
SAP unveils SaaS analytics platform
"Whether you're in the boardroom or in front of a customer, there is a fit solution that will meet your needs," Smith says. "By bringing these capabilities together on a single platform, there's advantages those users can get — commonalities in collaboration and business process and workflow that can help bring these capabilities together."... "SAP Cloud for Analytics was transformational by allowing real-time updates to our plans, collaboration across the organization from within the app, advanced analytics and one-click visualization for our users," Stephen Hayes, analytics manager, Live Oak Bank, said in a statement today. "The end-user experience was well-received from our leadership team to our analysts."
VMware Value Lies In Modern Data Center Management
Unlike the IBM mainframe, VMware is a software company, one that so far has been able to evolve its product lines rapidly. For example, on Tuesday, Oct. 13, at VMworld in Barcelona, VMware introduced vRealize Automation 7, which gives enterprise IT or a DevOps team the ability to generate a graphical blueprint that can lead to a deployable system. On the blueprint a team can identify different parts of an application spread over many machines, then assign the application-appropriate networking and security. The system described in the blueprint will tend to be on-premises, but parts of it can exist in Amazon Web Services or an OpenStack Kilo cloud.
The missing ingredient for effective problem management
The missing ingredient in a typical implementation is skilled problem managers using a consistent, evidence-based, structured approach to solving problems. By structured, I mean adopting one of the major problem-solving frameworks, such as Kepner and Fourie, and following it consistently, with all problem managers using it the same way, all the time. ... Technical knowledge is useful because it gives problem managers the confidence to challenge subject matter experts, particularly when those experts invoke their deep technical knowledge to suggest that their opinion should be accepted without question. Problem managers should always seek evidence to support assertions, ensure that alternatives are properly assessed, and confirm that the actions proposed are sensible.
Q&A and Book Review of Software Development Metrics
Velocity is another metric that depends on certain assumptions. It depends on (1) a time-boxed process model, and (2) incremental delivery of production-ready features at least to a test environment. Provided these assumptions hold, Velocity is useful for short-term planning and also to accumulate empirical data for burn charts, which in turn can expose emerging delivery risks. So, it's useful for steering in cases when the work is done in a certain way. In my experience, Velocity is a little too slippery to use for tracking improvement. There are three reasons. First, a team's improvement efforts might include changing the length of the time-boxed iterations or shifting away from a time-boxed model altogether.
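Under the assumptions above (time-boxed iterations, incremental delivery), velocity is typically computed as a trailing average of completed story points and used only for short-term forecasting. The sketch below illustrates that arithmetic; the window size and point values are illustrative assumptions, not figures from the book.

```python
import math
from statistics import mean

def forecast_iterations(completed_points: list[float],
                        backlog_points: float,
                        window: int = 3) -> int:
    """Estimate how many more iterations a backlog needs.

    completed_points: story points finished in each past iteration,
                      oldest first.
    backlog_points:   points remaining in the backlog.
    window:           how many recent iterations to average over
                      (a trailing window, since old data goes stale).
    """
    velocity = mean(completed_points[-window:])  # trailing average
    return math.ceil(backlog_points / velocity)  # round up: partial
                                                 # iterations still cost one

# Example: recent iterations averaged ~21.7 points, so a 100-point
# backlog forecasts to 5 more iterations.
remaining = forecast_iterations([18, 22, 20, 24, 21], 100)
```

Note how fragile this is for tracking improvement, which is the author's caution: change the iteration length or abandon time-boxing and the historical numbers stop being comparable.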
Tech Firms Laud Obama's Retreat on Encrypted-Data Law
Battered by Edward Snowden’s revelations that they aided in NSA surveillance, technology companies have leaped at the chance to showcase features such as encryption that help deter hackers. Apple, for example, helped set off the debate by announcing that iPhones would automatically encrypt data stored on them and that Apple couldn’t help the government unlock the information. What companies have rarely mentioned is that the data sought most often by police and American intelligence services -- text messages, e-mails, photos and calling records -- can still be legally obtained with court orders. That’s true no matter how much encryption is used to prevent unauthorized parties from accessing them, as Bloomberg News reported last October.
Quote for the day:
"When you do the common things in life in an uncommon way, you will command the attention of the world." -- George W. Carver