Daily Tech Digest - October 19, 2018

McAfee researchers uncover ‘significant’ espionage campaign

The researchers believe the new version could only have been created with access to the original source code, which has been modified to make the malware better able to evade detection. This behaviour is in line with other nation-state operations, which tend to recycle and evolve code, the researchers said. According to the research report, Oceansalt was launched in five attack “waves” adapted to its targets. The first and second waves were spear-phishing based and began with a malicious Korean-language Microsoft Excel document, created in May 2018, that acted as a downloader for the implant. The Excel document contained information that led McAfee researchers to believe the targets were linked to South Korean public infrastructure projects. In all malicious documents, embedded macros were used to contact a download server and write the Oceansalt implant to disk. Once connected, the implant was designed to send the IP address and computer name of the targeted machine, as well as the file path of the implant.

Audits: The Missing Layer in Cybersecurity

When organizations are astute enough to turn to their audit teams for cybersecurity support, auditors must be prepared to deliver value, aligned to the speed of their business. Just as the businesses that auditors support are rapidly transforming, the audit groups must follow suit. This can be challenging, considering many IT auditors received much of their professional training many years ago, when the word cybersecurity did not command the attention it does today, and before transformative technologies such as artificial intelligence, connected Internet of Things devices, and cloud-based platforms were so prevalent and impactful. Here's the good news: There are many more educational and training resources available today than 20 years ago, when I began in IT audit. Despite time and budget constraints, it is incumbent upon auditors to pursue the appropriate training and credentialing to transform their organizations, refresh their skill sets, and obtain the auditing cybersecurity acumen needed to become integral to their organization's cyber programs.

Best new Windows 10 security features: More patching, updating flexibility

The Windows Defender Security Center has been renamed to simply Windows Security Center to better signal that it is the main location for security information. Ransomware protection, first introduced in version 1709, has been simplified to make it easier to add blocked applications from the interface. Click “Allow an app through Controlled folder access.” After the prompt, click the + button and choose “Recently blocked apps” to find the application that the protection has blocked. You can then create an exclusion and add it to the allowed list. Because accurate time is key both to authentication and to obtaining updates, the Windows Time service is now monitored to confirm it is in sync with the proper time. Should the system detect that the time sync service is disabled, you will be prompted to turn the service back on. A new security providers section exposes all the antivirus, firewall and web protection software running on your system. In 1809, Windows 10 requires antivirus software to run as a protected process in order to register.

Cloud Covered – Are You Insured?

We all know we need insurance, but what is the right coverage for me? Well, it really depends on what types of assets you are trying to protect and how your business would be impacted if something happened. If we think about our daily lives, imagine having 20 doors and windows wide open and then locking, or adding video surveillance to, only the one in the backyard (because your neighbor just told you he had been robbed the night before and that the thief broke into his house through the backyard door). Well, that’s a good start; however, 19 doors and windows are still wide open and vulnerable for anybody to access, right? That’s pretty much what happens in IT, and securing only a few “doors” is called “blacklisting”. Let me explain: every server has 65,535 TCP ports (and the same number of UDP ports). Under the blacklisting approach, we close just a few ports based on knowledge of common vulnerabilities. Most of the time we don’t know exactly which ports our applications need in order to work, so we fall back on this approach: blocking a few ports while permitting all the rest.
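To make the blacklist-versus-whitelist contrast concrete, here is a minimal sketch in Python. The port numbers and rule sets are illustrative assumptions, not drawn from the article; real filtering would be done in a firewall, not in application code.

```python
# Illustrative contrast between blacklist (default-allow) and
# whitelist (default-deny) port filtering. All port choices are
# hypothetical examples.

BLOCKED_PORTS = {23, 135, 445}   # blacklist: a few known-risky ports
ALLOWED_PORTS = {443, 8080}      # whitelist: only the ports the app needs

def blacklist_allows(port: int) -> bool:
    """Default-allow: traffic passes unless the port is explicitly blocked."""
    return port not in BLOCKED_PORTS

def whitelist_allows(port: int) -> bool:
    """Default-deny: traffic passes only if the port is explicitly allowed."""
    return port in ALLOWED_PORTS

# An unexpected port (e.g. 31337) slips through the blacklist,
# like one of the 19 doors left wide open, but is stopped by the
# whitelist, which locks every door by default.
print(blacklist_allows(31337))  # True  - still wide open
print(whitelist_allows(31337))  # False - closed by default
```

The asymmetry is the point: a blacklist must enumerate every threat to be safe, while a whitelist only has to enumerate what the application legitimately needs.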

Is Venture Capital investment in AI Realistic or Out of Control?

There are a few reasons why this investment might be rational. Just as the Internet and mobile revolutions of past decades fueled trillions of dollars of investment and productivity growth, AI-related technologies promise the same benefits. So this is all rational: if AI is the truly transformative technology it promises to be, then all these investments will pay off as companies and individuals change their buying behaviors, business processes, and ways of interacting. No doubt AI is already creating so-called “unicorn” startups with valuations over $1 billion. This could be justified if the AI markets are worth trillions. So, what is this money being used for? If you ask the founders of many of these AI companies what their gigantic rounds will be used for, you’ll hear things like geographic expansion, hiring, and expansion of their offerings, products, and services. As we’ve written about before, the difficulty of finding skilled AI talent is pushing salaries and bonuses to ridiculous heights.

20 innovative data centers that give us a glimpse into the future of computing

It is predicted that by 2025 data centers will consume one fifth of the Earth's total power. From cooling to lights to servers, there's no question that data centers eat up a lot of power. Recent news that climate change may be happening faster, and more severely, than initially believed makes traditional data center design, and its massive consumption of power, something that needs to be addressed. ... Project Natick is a Microsoft research endeavor that puts shipping-container-sized pods filled to the brim with servers on the bottom of the ocean. The one active test machine currently in operation is just off the coast of Scotland, where Microsoft plans to leave it for up to five years of study. Project Natick servers require zero human interaction and are designed to remain in place for more than five years without the need for maintenance or repair. These servers can be powered by 100% renewable resources and produce zero emissions. According to Microsoft, "no waste products, whether due to the power generation, computers, or human maintainers are emitted into the environment."

Weighing the pros and cons of data security outsourcing

It’s nearly impossible to run a successful business operation in the current marketplace without taking IT seriously. The problem is that very few small and medium-sized businesses have the knowledge or skillset needed to properly manage each individual aspect of IT in-house. This is especially true when it comes to something like data security. One of the keys to running a successful business is being honest with yourself and recognizing what you don’t know. By identifying the areas where you come up short, you can take steps to compensate and overcome so that your business can thrive. One way you do this is through working with knowledgeable individuals that specialize in the areas where you’re deficient. Data security is a specific area where businesses often lack the internal knowledge and expertise to excel. It’s a particularly challenging aspect of IT that business leaders don’t have the time to master internally, so they go outside the company and outsource.

Review: Artificial Intelligence in 2018

Artificial Intelligence is not a buzzword anymore. As of 2018, it is a well-developed branch of Big Data analytics with multiple applications and active projects. Here is a brief review of the topic. AI is the umbrella term for various approaches to big data analysis, such as machine learning models and deep learning networks. We have recently demystified the terms AI, ML and DL and the differences between them, so feel free to check it out. In short, AI algorithms are data science mathematical models that help improve the outcome of a certain process or automate some routine task. However, the technology has now matured enough to move these data science advancements from the pilot-project phase to production-ready deployment at scale. Below is an overview of various aspects of AI technology adoption across the IT industry in 2018. ... AI algorithms have mostly moved past the pilot-project stage and are currently at various stages of company-wide adoption.

Why CIOs need to find ways to manage the old and the new

Research from Henley Business School and McKinsey shows that to be agile, businesses are choosing not to re-engineer legacy systems, said Manwani: “They either add another interface or do something totally separate.” But this is not a sustainable approach to managing digitisation initiatives, he said. “You can’t keep doing this. Without the engagement of an enterprise architect, businesses will reduce their agility in the medium term.” Just as restructuring of the IT department will never happen on its own, Manwani said: “You should not do major transformation piecemeal.” The enterprise architect's role is to present a coherent plan that can be used as a blueprint to underpin a digital transformation initiative, he added. “When we teach practitioners in the architecture space, it takes some time for them to absorb that they can, and should, engage in strategy development,” he said. “Preparing an architecture target state linked to the strategy is essential. This often requires new capabilities and mindsets.”

Should robots have rights?

California recently passed Senate Bill 1001, which bars companies and people from using bots that intentionally mislead those they are talking to into thinking they are human.  Putting aside the social and legal merits of this law, Bill 1001 implicates a hitherto-abstract, philosophical debate about when a simulation of intelligence crosses the line into sentience and becomes a true AI. Depending upon where you draw the line, this law is either discrimination against another form of sentient life or a timely remedy intended to protect users from exploitation by malevolent actors using human speech simulators. Alan Turing — the father of artificial intelligence but better known to the public for his role in breaking the German naval codes during World War II — foresaw the implications of his theories, which are still foundational to computer science, and was the first to enter this debate. He proposed his eponymous Turing test for artificial intelligence in 1950.

Quote for the day:

"A leader is best when people barely know he exists, when his work is done, his aim fulfilled, they will say: we did it ourselves." -- Laotzu
