This time around, the major changes are inside: a bump up to an 8th-generation Core processor, some odd adjustments in pricing, and a new color, black, separate the new from the old. There's actually a downgrade of sorts in the GPU compared with the Surface Pro (2017), which is a bit of a disappointment. The Performance section of our review shows the clearest differences among the three generations. We've given the Surface Pro 6 what some would consider an "average" score of 3.5 stars, lower than we've given some other tablet PCs we've reviewed recently. But we're also giving it an Editor's Choice, like those other products. Despite being underwhelmed by the Surface Pro 6's failure to break new ground (or even add USB-C), we will give it this: it delivered a long 8.5 hours of battery life in our tests, an area that has been an Achilles' heel for much of the competition we've reviewed. It is still one of the best-designed Windows tablets you can buy, and its pricing is competitive with similarly configured products.
To focus this new effort, MCS will pursue two approaches for developing and evaluating different machine common sense services. The first approach will create computational models that learn from experience and mimic the core domains of cognition as defined by developmental psychology. This includes the domains of objects (intuitive physics), places (spatial navigation), and agents (intentional actors). Researchers will seek to develop systems that think and learn as humans do in the very early stages of development, leveraging advances in the field of cognitive development to provide empirical and theoretical guidance. “During the first few years of life, humans acquire the fundamental building blocks of intelligence and common sense,” said Gunning. “Developmental psychologists have found ways to map these cognitive capabilities across the developmental stages of a human’s early life, providing researchers with a set of targets and a strategy to mimic for developing a new foundation for machine common sense.”
Changing mindsets is a key enabler of new technologies and one of the ways Gartner recommended that IT executives change the culture of their companies. “Hack your culture to change your culture,” said Kristin Moyer, research vice president and distinguished analyst at Gartner. “By culture hacking, we don’t mean finding a vulnerable point to break into a system. It’s about finding vulnerable points in your culture and turning them into real change that sticks.” Hacking is about taking smaller actions that usually get overlooked, Moyer said. Great hacks also trigger emotional responses, have immediate results and are visible to lots of people at once, she said. Gartner says 46 percent of CIOs identify culture as the largest barrier to realizing the benefits of digital business. Achieving culture change is tied closely to another key capability organizations should strive for – the ability to embrace change and adopt technology in a new way, or what Gartner calls “dynamism.”
The first is improving the ability of individuals to access data. "Today, finding a document could be tedious [and] analyzing data may require writing a script or form," Lazar said. With AI, a user could perform a natural language query -- such as asking the Salesforce.com customer relationship management (CRM) platform to display third quarter projections and how they compare with the second quarter -- and generate a real-time report. Then, asking the platform to share this information with the user's team and get its feedback could launch a collaborative workspace, Lazar said. The second possible benefit is predictive. "The AI engine could anticipate needs or next steps, based on learning of past activities," Lazar said. "So if it knows that every Monday I have a staff call to review project tasks, it may have required information ready at my fingertips before the call. Perhaps it suggests things that I'll need to focus on, such as delays or anomalies from the past week."
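The natural-language reporting flow Lazar describes can be sketched in miniature: a plain-English request is matched to a canned query, and the platform compares quarter-over-quarter figures. Everything below is invented for illustration (the data, the keyword matching, the function names); a real CRM platform would expose its own NLP layer and API.

```python
# Hypothetical sketch of a natural-language query answering a
# quarter-over-quarter projection question. Sample data and the
# naive keyword "intent" matching are invented for illustration.

QUARTERLY_PROJECTIONS = {  # invented sample figures, in dollars
    "Q2": 1_200_000,
    "Q3": 1_350_000,
}

def answer_query(question: str) -> str:
    """Naive keyword matching standing in for a real NLP layer."""
    q = question.lower()
    if "third quarter" in q and "second quarter" in q:
        q2 = QUARTERLY_PROJECTIONS["Q2"]
        q3 = QUARTERLY_PROJECTIONS["Q3"]
        change = (q3 - q2) / q2 * 100  # percent change vs. Q2
        return f"Q3 projection: ${q3:,} ({change:+.1f}% vs. Q2)"
    return "Sorry, I can't answer that yet."

print(answer_query("Show third quarter projections versus the second quarter"))
```

The predictive step Lazar mentions would extend this same idea: rather than waiting for the question, the engine would notice the recurring Monday staff call and precompute the comparison.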
What gives machines -- and process automation -- the edge over humans? In addition to their ability to integrate data, machines, Levav noted, lack biases such as the illusion of validity, which leads people to overestimate their forecasting prowess. Yet, humans are still required in process automation, because only they can decide the important parameters, he added. "You will have a job because machines can't pick the variables that are relevant to a problem," he said. Scott Hartley, partner at venture capital firm Two Culture Capital, shared a similar view regarding the impact of AI on jobs. His take on AI-infused automation and employment takes a cue from Voltaire. Hartley's 2017 book, The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World, cites a statement attributed to the 18th century philosopher to support his view that asking the right questions about data is central to acquiring knowledge. Making AI and machine learning work, Hartley said during a UiPath panel discussion, is "still fundamentally rooted in our ability to create diverse teams and ask questions from a multiplicity of angles."
But on a positive note, more board members see how technology is unlocking new business models and spurring growth. They are convinced of the growing need to focus on speed, agility, innovation, and customer obsession, and see that it requires new approaches to business operations and to IT investment. Technology and cybersecurity have historically been seen as compliance issues under the purview of the board’s audit committee. However, given the increasing capability of technology to affect revenue and the business model, there is a greater recognition of its strategic importance. This has led to an increase in the number of CIOs and other technology experts being appointed to boards. Still, though, the majority of boards lack the technology prowess needed to successfully guide today’s digital era company. What, then, can the CIO do to bridge the gap and develop a great relationship with the board?
We’re at what I call the hangover phase, where a night of cloud-hyped indulgence has led to many self-administered pats on the back, which obscured the reality that transitioning to the cloud is harder than people originally thought. But the effort is still worth it. The budget overruns are no surprise, given that not much cost planning takes place during initial large cloud computing projects. Indeed, these initial projects fail to illustrate the true costs of using a public cloud, and if you look carefully you can see that the private clouds many such initial efforts focus on are just new cages of servers in data centers that cost more than the old cages of servers. Moreover, people costs are always higher than expected, and few enterprises plan to run both cloud and on-premises systems—but the reality is that you need to. What troubled me is that only 48 percent of the mid-sized businesses and only 36 percent of the large enterprises agree that cloud actually improved the business. I suspect that those who do not see the value have yet to complete a project’s successful journey to the cloud. But still, this figure should be higher.
"If someone has programming fundamentals then, from a technical point of view, I think that's enough for them to dive into machine learning," he says. "You're not gonna get very far if you can't program at all, because ultimately you configure machine-learning frameworks through programming. "I think strong math was probably more essential before than it is now. It's certainly helpful to have mathematical knowledge if you want to develop custom layers or if you're really going very, very deep on a problem. But for people starting out, it's not critical." In some respects, it's just as important to have a willingness to seek out new information, says Yangqing Jia, director of engineering at Facebook. "As long as you keep an exploratory mindset there's such an abundance of tools nowadays you'll be able to learn a lot of things yourself, and you have to learn things yourself because the field is growing really fast."
“In microwave communications, an eavesdropper can put an antenna just about anywhere in the broadcast cone and pick up the signal without interfering with the intended receiver,” Mittleman said. “Assuming that the attacker can decode that signal, they can then eavesdrop without being detected. But in terahertz networks, the narrow beams would mean that an eavesdropper would have to place the antenna between the transmitter and receiver. The thought was that there would be no way to do that without blocking some or all of the signal, which would make an eavesdropping attempt easily detectable by the intended receiver.” Mittleman and colleagues from Brown, Rice University and the University at Buffalo set out to test that notion. They set up a direct line-of-sight terahertz data link between a transmitter and receiver, and experimented with devices capable of intercepting the signal. They were able to show several strategies that could steal the signal without being detected — even when the data-carrying beam is very directional, with a cone angle of less than 2 degrees.
There is a massive amount of external TI that organizations can access to improve cyber defense. While cost can be a constraint for expensive commercial threat feeds, there are plenty of low-cost and even free threat feeds available from open source, government, and industry sources. While access to external TI is not an issue, the scale problem lies in managing, maintaining, and making effective use of TI. Some of these challenges include managing multiple threat feeds that come in different formats; ensuring your threat feeds are constantly up to date; and integrating TI into your security operations so that you can use it to improve security. The process of integrating TI into security operations is particularly interesting because it directly leads into another dimension of the network security TI scale problem. While organizations can turn to external TI to make up for the lack of access that a next-generation firewall provides, this same limitation hits you on the other side by hindering your ability to take action based on external TI. It's like a double firewall TI whammy!
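The "different formats" problem above can be sketched concretely: two feeds describe the same indicators in incompatible shapes, and the first integration step is normalizing them into one deduplicated set. The feed formats, field names, and sample indicators below are all invented for illustration; real feeds would typically use standards such as STIX/TAXII.

```python
# Minimal sketch, assuming two hypothetical feeds: one CSV, one JSON.
# We parse each into a common {"indicator", "type"} shape, then merge
# with deduplication and an ingestion timestamp for freshness tracking.
import csv
import io
import json
from datetime import datetime, timezone

def parse_csv_feed(text):
    """Feed A (invented format): 'indicator,type' rows."""
    return [{"indicator": row["indicator"], "type": row["type"]}
            for row in csv.DictReader(io.StringIO(text))]

def parse_json_feed(text):
    """Feed B (invented format): {"iocs": [{"value": ..., "kind": ...}]}."""
    return [{"indicator": ioc["value"], "type": ioc["kind"]}
            for ioc in json.loads(text)["iocs"]]

def merge_feeds(*feeds):
    """Deduplicate on the indicator value; stamp ingestion time."""
    seen = {}
    now = datetime.now(timezone.utc).isoformat()
    for feed in feeds:
        for entry in feed:
            seen.setdefault(entry["indicator"], {**entry, "ingested": now})
    return list(seen.values())

feed_a = "indicator,type\n203.0.113.7,ip\nevil.example,domain\n"
feed_b = '{"iocs": [{"value": "203.0.113.7", "kind": "ip"}]}'
merged = merge_feeds(parse_csv_feed(feed_a), parse_json_feed(feed_b))
print(len(merged))  # 2 unique indicators across both feeds
```

In practice this normalization layer is exactly where the "constantly up to date" and "integrate into security operations" challenges land, since every downstream control consumes the merged set rather than the raw feeds.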
Quote for the day:
“The only thing worse than training your employees and having them leave is not training them and having them stay.” -- Henry Ford