What new design concepts are needed to address these emerging technologies and the risks presented by the ever-connected smart home? It's not about informing everyone that everything can be hacked; it's about creating awareness and helping others understand the risks involved with modern technology. It's about helping builders and designers understand the technological solutions their clients require and how to implement them correctly, so they in turn can educate users on how to maintain their systems safely and securely. Consumers need to be aware of their devices' remote and local environments, and of how their data is collected and stored. They also need to be aware of how their personal devices and appliances can be affected by outages outside their control, such as a DDoS attack on a cloud environment or something as simple as a power outage. Finally, we as a community need to put pressure on manufacturers to produce secure devices with clear plans for patching and mitigating future vulnerabilities. Manufacturers also have to begin working together to secure user data and preserve its integrity in a crowded environment of smart links and physical devices, ultimately preventing unauthorized remote access.
Native tools such as PowerShell and Windows Management Instrumentation (WMI) grant users exceptional rights and privileges to carry out the most basic commands across a network. These “non-malware” or fileless attacks account for more than 50% of successful breaches, the report said, with attackers using existing software, allowed applications and authorised protocols to carry out malicious activities. In this way, attackers are able to gain control of computers without downloading any malicious files and therefore remain unnoticed by malware-detection security systems. ... Finally, PowerShell is used to connect to a command and control server to download a malicious PowerShell script designed to find sensitive data and send it to the attacker, all without downloading any malware. Almost every Carbon Black customer (97%) was targeted by a non-malware attack during each of the past two years, but the report notes that awareness of malicious usage for tools such as PowerShell has never been higher, with 90% of CISOs reporting seeing an attempted attack using PowerShell.
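Because fileless attacks abuse legitimate tools rather than dropping files, defenders typically look at *how* those tools are invoked rather than scanning binaries. The sketch below is a deliberately simplified illustration of that idea: a few regex heuristics over a PowerShell command line, matching markers commonly associated with the tradecraft the excerpt describes (encoded payloads, download cradles, in-memory execution). The patterns and function name are illustrative assumptions, not a real detection product; production detection relies on richer telemetry such as PowerShell script block logging.

```python
import re

# Illustrative heuristics only -- real detection (e.g., via PowerShell script
# block logging) is far more involved. These patterns match common markers of
# the "fileless" tradecraft described above.
SUSPICIOUS_PATTERNS = [
    re.compile(r"-enc(odedcommand)?\s", re.IGNORECASE),         # Base64-encoded payloads
    re.compile(r"downloadstring|downloadfile", re.IGNORECASE),  # download cradles
    re.compile(r"\biex\b|invoke-expression", re.IGNORECASE),    # in-memory execution
    re.compile(r"-noprofile\b", re.IGNORECASE),                 # profile bypass
]

def flag_powershell_command(cmdline: str) -> bool:
    """Return True if a PowerShell command line matches a suspicious pattern."""
    return any(p.search(cmdline) for p in SUSPICIOUS_PATTERNS)
```

For example, `flag_powershell_command("powershell.exe -NoProfile -Enc SQBFAFgA")` flags the encoded, profile-bypassing invocation, while an ordinary `Get-ChildItem` command passes untouched. The point is the approach, not the pattern list: the attacker uses an allowed binary, so the command line and its behavior become the signal.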
One potential value of prescriptive analytics is that you don’t necessarily need a ton of data to reach the best decision or outcome. Prescriptive analytics focuses the question you’re asking, and the decisions you’re trying to reach, to one tangible answer using a smart model of your business that is not dependent on the amount of data (how much or how little) that you have. Predictive techniques and functionalities can be great at identifying a multitude of options through statistical modeling and forecasting, as long as you have the relevant data—but that’s precisely the problem. It’s difficult to process and synthesize numerous options and the nuanced differences among them to determine what you should actually do. How can you be sure that you’re making the best decision? How can you be sure of the impact it will have on your company? Prescriptive analytics can involve hundreds of thousands of tradeoffs associated with a question you might have, and it uses the available data to identify the best decision and impact relative to the goal you’re trying to achieve.
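A toy sketch of that distinction, with every number invented purely for illustration: where predictive analytics would forecast outcomes for each option, prescriptive analytics frames the question as an explicit model of the business and searches the tradeoff space for the single best decision.

```python
from itertools import product

# Hypothetical decision model: choose a unit price and an ad budget to
# maximize expected profit. The demand curve and all figures are invented
# for illustration -- a real model would encode the firm's actual economics.
prices = [10, 12, 15, 18]      # candidate unit prices ($)
ad_budgets = [0, 500, 1000]    # candidate ad spend ($)

def expected_profit(price, ad_budget):
    # Toy demand curve: demand falls with price, rises with ad spend.
    demand = max(0, 1000 - 40 * price + 0.3 * ad_budget)
    unit_cost = 6
    return demand * (price - unit_cost) - ad_budget

# Enumerate every tradeoff and prescribe the single best decision.
best = max(product(prices, ad_budgets), key=lambda d: expected_profit(*d))
```

With only a dozen candidate decisions the search is trivial; the real systems the excerpt describes evaluate hundreds of thousands of tradeoffs with optimization solvers rather than brute force, but the shape of the answer is the same: one concrete decision, not a menu of forecasts.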
With Brexit looming over both small and large businesses operating within the UK, maintaining a competitive price point despite new tariffs will be a tough challenge to overcome. However, with new technological solutions opening up greater levels of efficiency and productivity, this task should not appear as daunting as many believe. Tech companies will continue to innovate solutions for both current and future business issues, finding new ways to improve efficiency within the business environment. Voice recognition technology such as iBOB, for instance, is now freeing up valuable time for business owners and receptionists. It reduces the need for routine phone-based customer service interactions, such as appointment bookings, which makes the remaining human interactions more impactful and allows small businesses to focus their resources on more profitable parts of the business. Employing affordable technological solutions, with the aim of redirecting the recovered work hours toward tasks more closely tied to the bottom line, will allow existing businesses to maintain their competitive position within the market.
“It is not your data or my data; it is the firm’s data, and the value you create for the business is from that data,” Tewary explained. “It is a transformation. It’s changing the people culture aspect, so there’s a lot of education. You know, you have to be an evangelist. You wear multiple hats to show people the value.” For Tewary at Verizon, finding advocates within the company for sharing big data was crucial. “We found champions,” he said. “For example, finance … was a good champion for us, where we used the data and analytics to really actually launch some very critical initiatives for the firm — asset-backed securities. … That created the momentum.” Dobrin agreed with this strategy of using champions within an enterprise to help lead the way for the entire company. “It’s not just a jump to the top of the ladder, because there’s just a lot of work that’s required to do it. You can do that within a business unit.” While the whole enterprise doesn’t need to get there all at the same time, as other areas of the enterprise begin to see the use of big data and how it can change the game, they will be open to the idea, Dobrin explained.
To stay competitive in networking and to avoid being rendered obsolete by history, network equipment vendors have either blazed trails for SDN or found themselves adopting it reluctantly, perhaps looking a little singed in the process. One vendor clearly in the former camp is Juniper Networks. It plunged into the SDN field during the fateful year of 2012, first by purchasing a firm called Contrail, and then by building its technology into an open-source virtual appliance ecosystem unto itself: OpenContrail. As the diagram above depicts, OpenContrail serves as the component that provides the routing logic for distributed operating systems that host Docker containers. ... "It's a big part of operating and automating both a virtual and a physical infrastructure. It orchestrates the VNFs [virtual network functions] and puts together the service chains, all the way to the edge and to the core. Contrail uses vRouter and, in a distributed data center infrastructure, [can] reach into any part of the cloud, string up the required VNFs, stitch together the different pieces of the service, and deliver a custom service to a certain vertical, or a group of end customers. It automates that whole process of customizing the services that can be offered, ultimately, to our service provider customers."
Perhaps there won’t be a “mass market” for consumer goods anymore; just a mass of individuals who are increasingly difficult to categorize, and who reinvent themselves from moment to moment, from platform to platform. People will still want to gather in groups with like-minded people. But they will find them through technology and data and connect with them based on their shared values and interests rather than practical connections, such as living in the same area. Rather than being defined by markers such as gender, age or location, they will express themselves in ways that are more fluid and flexible. In one of the future worlds we modeled at our hack week in Berlin, these groups – or “tribes” – broke down physical borders and formed their own communities, both real and virtual. They started to pool their purchasing power and demand a different relationship with brands. Today, consumer-facing companies try to tailor offers and discounts that will appeal to individual consumers, based on their purchasing data – with varying degrees of skill and success. In the future, will products themselves be individualized?
Instead of using containers to run applications, serverless computing replaces containers with another abstraction layer. Its functions or back-end services are single-purpose programs that consume compute resources without the developer having to provision or manage them. Instead of calling functions in the traditional sense, in serverless a developer calls a working program to provide a service for the program they're building. The Cloud Native Computing Foundation (CNCF) Serverless Working Group defines serverless computing as "building and running applications that do not require server management. It describes a finer-grained deployment model where applications, bundled as one or more functions, are uploaded to a platform and then executed, scaled, and billed in response to the exact demand needed at the moment." Or for another definition: "Serverless architectures refer to applications that significantly depend on third-party services," says Mike Roberts, engineering leader and co-founder of Symphonia, a serverless and cloud architecture consultancy. "By using these ideas, and by moving much behavior to the front end, such architectures remove the need for the traditional 'always on' server system sitting behind an application."
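Concretely, a serverless "function" is often nothing more than a single entry point the platform invokes on demand. The sketch below is a minimal handler in the style popularized by AWS Lambda; the event shape, the `context` argument, and the response format are assumptions for illustration, since each platform defines its own invocation contract.

```python
import json

# A minimal function-as-a-service handler. The platform -- not the developer --
# runs, scales, and bills this per invocation; there is no server process to
# manage. Event/response shapes here are illustrative assumptions.
def handler(event, context=None):
    """One-job function: greet the caller named in the event payload."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Locally you can invoke it directly, e.g. `handler({"name": "dev"})`; deployed, the platform spins up instances only when requests arrive, which is exactly the "executed, scaled, and billed in response to the exact demand" model in the CNCF definition.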
Nearly two months after critical Drupal fixes were released, security firm Malwarebytes says it is still finding dozens of unpatched websites that have been exploited to host cryptocurrency miners or, in other cases, to redirect to malware (see Cryptocurrency Miners Exploit Widespread Drupal Flaw). The problems stem from two critical vulnerabilities in Drupal, both of which allow remote code execution. That's a perfect combination for attackers: a widely used piece of software such as Drupal, plus known vulnerabilities that can be easily and remotely exploited without even needing to trick a victim into taking any action. The first flaw, CVE-2018-7600, was revealed March 28, and the second, CVE-2018-7602, on April 25. The vulnerabilities were so severe that they were dubbed Drupalgeddon 2 and Drupalgeddon 3. Although patches have been available since the vulnerabilities were publicized, attackers are still taking advantage of websites that haven't been upgraded. "Rolling out a CMS is the easy part," writes Jerome Segura, lead malware intelligence analyst with Malwarebytes, in a blog post. "Maintaining it is where most problems occur due to lack of knowledge, fear of breaking something and, of course, costs."
Artificial intelligence (AI) is a branch of computer science that focuses on the theory and development of computer systems that are capable of performing tasks that normally require human intelligence, such as visual perception and decision-making. Machine Learning is a subset of AI that focuses on the practice of using algorithms to parse data, learn from it, and then make a prediction about something. In contrast to a static algorithm, a critical aspect of machine learning is that the machine is “trained” using large amounts of data and algorithms that give the machine the ability to continually learn how to perform a given task. Tools based on machine learning are necessary to supplement the existing set of security tools. These new tools help organizations identify and mitigate the emerging generation of security breaches that are designed to leverage both the legacy and evolving attack surfaces to evade the enterprise’s traditional defenses. When evaluating security tools based on machine learning, there are three key concepts that IT organizations should keep in mind.
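To make the "trained, not hard-coded" distinction concrete, here is a deliberately tiny classifier: a nearest-centroid model that learns its decision rule from labeled examples rather than from rules written by a programmer. The feature values (say, login failures per hour and bytes transferred) and labels are invented for illustration; real security tools train far richer models on far more data.

```python
from statistics import mean

def train(samples):
    """samples: list of (features, label) pairs. Learn one centroid per label."""
    by_label = {}
    for features, label in samples:
        by_label.setdefault(label, []).append(features)
    # The "model" is just the per-label mean of the training examples.
    return {label: [mean(col) for col in zip(*rows)]
            for label, rows in by_label.items()}

def predict(centroids, features):
    """Assign the label whose learned centroid is closest (squared distance)."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(features, c))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Hypothetical telemetry: [login failures/hour, KB transferred].
training_data = [
    ([2, 10], "benign"), ([3, 12], "benign"),
    ([40, 900], "malicious"), ([55, 1100], "malicious"),
]
model = train(training_data)
```

Feed the trained model a new observation, e.g. `predict(model, [50, 1000])`, and it classifies from what it learned; retrain with more examples and the decision boundary moves, which is precisely what a static, hand-written rule cannot do.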
Quote for the day:
"There are plenty of difficult obstacles in your path. Don't allow yourself to become one of them." -- Ralph Marston