Daily Tech Digest - October 22, 2018

Saudi SoftBank relationship and the Tesla miracle: are techs set to boom or crash?
Human nature is a funny thing, and often has little to do with rational decision making. In 2008, the global economy descended into crisis; for a while it felt as if capitalism itself was tottering. The Queen of England famously asked: “Why didn’t anyone see it coming?” Actually, many did. But as a whole, the economists and politicians who warned of a major crisis in the making fell under the media radar or were dismissed as doomsayers. Yet even among these ‘Cassandras’, few anticipated the full extent of the crisis to follow. There may be a good reason for this. A new book, Crisis of Beliefs: Investor Psychology and Financial Fragility, by the economists Nicola Gennaioli and Andrei Shleifer, argues that one of the reasons the 2008 crisis was so severe is that people changed. It seems that human psychology may at least partially explain the crash of 2008: investors’ beliefs about the level of debt or leverage that was sustainable, for example, changed. And that’s the tricky thing about predicting stock markets. Human nature, especially when it is aggregated and subjected to forces such as groupthink, is notoriously difficult to understand, let alone predict.

Firms need stronger metrics and skills to outpace cyber threats

The use of security metrics and the formation of security teams should be viewed as complementary activities, though for many organizations some upskilling will be necessary, Robinson explained. "Foundational skills such as network security, endpoint security and threat awareness still form the bedrock of a strong team,” Robinson said. “But as the cloud and mobility have become ingrained into IT operations, other skills have taken on equal or greater importance.” In order to acquire the security skills organizations require, many are primarily looking to train current employees or expand their use of third-party security expertise. New hires and new partnerships are usually secondary considerations, Robinson explained. When it comes to the use of external resources, 78 percent of companies rely on outside partners for some or all of their security needs. Many firms rely on more than one partner, another indicator of the complexity of cybersecurity, Robinson explained.

FDA Calls for 'Cybersecurity Bill of Materials' for Devices

"Because of the rapidly evolving nature of cyber threats, we're updating our [premarket] guidance to make sure it reflects the current threat landscape so that manufacturers can be in the best position to proactively address cybersecurity concerns when they are designing and developing their devices," says FDA Commissioner Scott Gottlieb, M.D. "This is part of the total product lifecycle approach to device safety, in which manufacturers must adequately address device cybersecurity from the design phase through the device's time on the market to help ensure patients are protected from cybersecurity threats." The draft guidance provides updated recommendations on cybersecurity considerations for device design, labeling and documentation that should be included in premarket submissions for agency approval of medical devices that have cybersecurity risk, FDA notes. The agency will conduct a public workshop for industry stakeholders on Jan. 29-30, 2019, to discuss the newly released draft guidance before it's finalized.

To beat Bloomberg, Symphony is letting banks’ bots talk with each other

“If you can create a whole ecosystem that connects every individual without dropping anyone, you can create a network much greater than what Bloomberg has done,” Gurle said in an interview on the sidelines of Symphony’s Innovate conference in New York recently. “The key is openness.” Gurle compares Symphony to America’s interstate highway system. In that analogy, the banks are cities and towns, and use their own cars to travel on a network Symphony has built. The advantage is that banks can use their own proprietary systems and still interface with other systems. Using Symphony, banks are deploying chatbots that “talk” amongst themselves to make and settle trades. Bots at RBC and AllianceBernstein, for example, can execute trades with each other over the Symphony platform, while BlackRock and BNP Paribas use them to settle mismatched foreign-exchange swaps.

How to make automation part of your microservices security

A modern application stack has four layers: infrastructure, data, networking and application code. At each of these layers, containers and microservices introduce a new way to deliver apps. As a result, container orchestration tools like Kubernetes are central to microservices management. While many security tools that work for standard applications produce effective results when applied to a microservices application, two aspects of microservices require additional attention and protection: application security and container security. Fortunately, there are plenty of advanced automation tools that support the fast and agile requirements of microservices security. Microservices application security is important because it involves multiple services rolled into one app. Those multiple services all work together to deliver a unified experience, and that means it's essential to perform dynamic testing on the services at the application level. In a microservices system, networking occurs between the services, as well as at the instance level.
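One kind of automated application-level check is easy to sketch. The function and the required-header list below are illustrative assumptions, not taken from any particular tool: a CI script could assert that every microservice response carries a baseline set of security headers.

```python
# Hypothetical sketch of an automated microservices security check:
# verify that a service response includes a baseline set of security
# headers. The header list here is an illustrative example.

REQUIRED_HEADERS = {
    "Strict-Transport-Security",
    "X-Content-Type-Options",
    "Content-Security-Policy",
}

def missing_security_headers(response_headers):
    """Return the required security headers absent from a response,
    comparing header names case-insensitively."""
    present = {name.title() for name in response_headers}
    return sorted(h for h in REQUIRED_HEADERS if h not in present)
```

A check like this could run in a pipeline after each deployment, flagging any service whose responses regress on the baseline.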

Samsung Starts Mass Production of Chips Using Its 7nm EUV Process Tech

Samsung’s 7LPP manufacturing technology offers impressive advantages over the company’s 10LPE, specifically for mobile SoCs. Meanwhile, in a bid to make the process attractive to a broad range of potential customers, the foundry offers a comprehensive set of design-enablement tools, interface IP (controllers and PHY), reference flows, and advanced packaging solutions. The final PDK is months away, but many customers may start development of their 7LPP SoCs even with the existing set of solutions. At this point 7LPP is supported by numerous Samsung Advanced Foundry Ecosystem (SAFE) partners, including Ansys, Arm, Cadence, Mentor, SEMCO, Synopsys, and VeriSilicon. Among other things, Samsung and these partners offer interface IP solutions such as HBM2/2E, GDDR6, DDR5, USB 3.1, PCIe 5.0, and 112G SerDes. Therefore, developers of SoCs due in 2021 and onwards, which will rely on PCIe Gen 5 and DDR5, can start designing their chips right now.

9 Principles of Service Design

It’s important to realize services are not tangible goods. An interface is not a service. A product is not a service. Shostack states, “People confuse services with products and with good manners. But a service is not a physical object and cannot be possessed. When we buy the use of a hotel room, we take nothing away with us but the experience of the night’s stay. When we fly, we are transported by an airplane but we don’t own it. Although a consultant’s product may appear as a bound report, what the consumer bought was mental capability and knowledge, not paper and ink. A service is not a servant; it need not be rendered by a person. Even when people are the chosen means of execution, they are only part of the process.” This makes it quite difficult to design for services. Often, the design of a service is overlooked by organizations and decisions related to the service supporting a product are not routinely considered in relation to how they impact the overall design of an experience. This results, most often, in poor service design and a poor experience.

Why Managed Threat Hunting?

Increasingly, threat hunting is a practice that enterprises want to understand and implement. But it is not always feasible to do so in-house, given the demand for resources and skills. That's where managed threat hunting enters, says CrowdStrike's Jennifer Ayers. Ayers, VP, OverWatch and Security Response at CrowdStrike, says the in-house/managed services decision is becoming a common, pragmatic discussion. "Companies want to be able to build out all this stuff, but in reality, if you only have $100, do you want to focus that $100 on building out a threat hunting organization that might only find evil once or twice a year in your particular environment, or do you want to use that funding to shore up your defense and response to those types of attacks?" In an interview on managed threat hunting, Ayers discusses: Her perspective on threat hunting; In-house vs. outsourced threat hunting; and The latest threats and how to defend against them.

Public cloud management tools lacking, research finds

Network engineer Brian Keys took a look at network resiliency and why it's so difficult for enterprises to have a network that's highly available. For one thing, nobody wants to pay for the technology necessary to achieve that goal. Additionally, finding architects with the experience to design a highly available network isn't easy. Still, Keys said, enterprises can take steps to improve their network's reliability. The use of uninterruptible power supplies is a good approach. So are redundant links for branch office connectivity. But knowing which techniques are necessary and which ones are just nice to have requires careful study. "A competent network designer should be able to tell with a high degree of certainty just how resilient the network is and in which ways," Keys said. "Probably the toughest part is to explain to upper management the pros and cons of the new proposal and get their buy-in."

The Hub of All Things: Are you collecting personal data the wrong way?

It’s a radical shift from the way organisations collect and access personal data, but Holtby thinks it is better not only for consumers but for organisations too. He explained: “Most companies treat the personal data of their users in a way that is, at best, hamstrung and at worst completely dysfunctional. “I would argue in the future most companies are going to want to have a pretty clear understanding of who their users are and who their customers are. They want to know as much as they can about those people. At best even the very biggest companies, today, have a very limited understanding of who their users are. “The quintessential ‘I know who my user is’ kind of company, I would argue at the moment, is Google. Many think of Google as being the company that has the most data about its users. If you are being charitable to Google you could say it knows everything it would possibly want to know about its users, but in reality, all they have is Google’s data.

Quote for the day:

"Leaders must be good listeners. It's rule number one, and it's the most powerful thing they can do to build trusted relationships." -- Lee Ellis

Daily Tech Digest - October 21, 2018

Do you have a lot of receipts, business cards, or other printed documents that you want to digitize and store? If so, one tool up to this job is Microsoft's free mobile Office Lens app. With Office Lens for iOS or Android, you can use your device's camera to snap a photo of a note, card, or other document. You can capture the image as a whiteboard, a document, a business card, or a photo. Then you can edit and revise it by cropping it, flipping it, drawing on it, and adding text to it. When you're done, you can store the image as a PDF file, a Word document, a PowerPoint slide, or a OneNote file. You can also save the image to your mobile gallery or to Microsoft OneDrive; in fact, the latest version of OneDrive for iOS directly integrates Office Lens. Let's look at how to use Office Lens to capture your printed documents. First, download and install the Office Lens app on your iPhone, iPad, or Android device. Open the app and give it the necessary permission to access your photos and camera.

Continuous Integration at Intel for the Mesa Graphics Library

Mesa CI is a set of configuration files, a job scheduler and a job implementation that can run on Jenkins. Written mostly in Python, it is driven by the principle that "the most important design consideration for the Mesa CI is to minimize configuration in Jenkins". The Mesa CI can theoretically run on top of any CI infrastructure, not just Jenkins, according to the documentation. It’s currently used for developer testing, release verification, pre-silicon (hardware) testing in simulators for Intel drivers, performance testing and validation of conformance test suites. The typical developer testing turnaround time is 30 minutes even though a commit to the master branch kicks off millions of tests. A custom database provides immediate access to test history, and the system also generates performance trend lines for common benchmarks.

Integrating factor of Big data and Artificial intelligence in Business

Now, companies and businesses have the chance to explore the potential of AI with seemingly inexhaustible data, as opposed to what was previously obtainable, thus unraveling all the intricate aspects of the process. Rather than depending on sample data, experts are now able to utilize unquantifiable amounts of data. This advancement has propelled enterprises to a point where they can deliver content without any form of irrelevant data while offering more suggestive and extrapolative data that is valuable when interpreted with the aid of “analytical sandboxes” or big data “centers of excellence”.  ... The landscape of artificial intelligence has experienced explosive advancement with the accessibility of big data, thus triggering disorderly transformations. The widespread explosion of data, in addition to the advancement in the capacity to store and evaluate staggering amounts of data with efficiency and pace, is directly responsible for propelling the relevance of AI. This transcends the conventional role of analyzing data. More than ever before, AI is becoming an invaluable tool for accurate assessment and decision making.

Why Digital Banking Should Include A Human Component

While effective implementation of digital strategy is critical for banks and credit unions, human interaction cannot be ignored. Technology can be used to augment the human experience and empower both customers and employees. A most common example is customer service; while chatbots and AI can be deployed to address most of the use cases, we must ensure that there are options for humans to intervene when needed, as well as human touchpoints throughout the customer journey to build trust and rapport. Bots need to be trained to learn how to empathize, and understand regional and generational differences. After all, such technology should reflect a brand’s identity and can positively (or negatively) impact customer perception. At the end of the day, technology is just a means to an end. The winning formula is not about more or less shiny new toys – but rather, leveraging appropriate technology to meet the needs of customers. The future of finance is also not about having a pretty user interface or making small incremental changes.

How Close Are We to Kubrick's AI-Controlled Vision of the Future?

HAL learned from observing its environment, watching and analyzing the words, facial expressions and movements of the human astronauts on the spaceship. It was responsible for performing rote functions such as maintaining the spaceship, but as a "thinking" computer, HAL also was capable of responding conversationally to the astronauts, Murphy explained. However, when the mission goes awry and the astronauts decide to shut HAL down, the AI discovers their plot by lip-reading. HAL arrives at a new conclusion that wasn't part of its original programming, deciding to save itself by systematically killing off the people onboard. The prospect of AI doing more harm than good may not be that farfetched. Experts suggest that weaponized AI could play a big part in future global conflicts, and the late physicist Stephen Hawking suggested that humanity might soon find AI to be the biggest threat to our survival.

Global Fintech Warning To Traditional Banks -- The Threat Is 'Real And Growing'

The UK fintech scene has been boosted by the local financial watchdog adopting EU so-called open banking rules early, forcing lenders to open up access for fintechs to the data and accounts of any clients who authorize it. Earlier this year the UK government created a crypto-assets task force, updated fintech regulation and built a UK-Australia so-called fintech bridge to help firms expand internationally. The rise in fintech firms and banking startups was sparked by the 2008 global financial crisis, which caused banks to cut back on spending and withdraw from some markets altogether — leaving a vacuum fintech companies stepped into. By using technology to make finding, registering and lending to new customers quicker and easier, these fintech companies have forced the traditional banking industry, which is famously slow to adapt, to react.

Confessions of a UX Designer

Some trends will stay in our profession. Those are usually theories based on sound foundational principles — the principles of our profession from decades of research and application. They’re like 501 Levis and a solid print t-shirt. They’ll never go out of style. But, most trends will fall by the wayside. Following them will often lead you and your project astray. As a general rule of thumb: Stay off the bandwagon, stray from the crowd and step to the beat of your own music — no matter how measured or far away. And as for those trending concepts that do stick around or seem to hold some validity — make sure you aren’t adopting them just because everyone else is. Don’t build a mobile app for a user base who would benefit more from a desktop application. Don’t adopt an idea based on an article you read citing a weak study or misrepresenting a study. In short, critically and strategically evaluate the concepts you add to your repertoire.

Embracing Conflict to Fuel Digital Innovation

When organizations try to determine the economic value of their data (EvD), a natural conflict arises between 1) keeping all the data because of its potential monetization value versus 2) the potential storage and data management costs, not to mention potential fines and liabilities associated with data security and privacy breaches of that data, which highlights the following conflicts: Maximizing Value – Data assets have considerable potential economic or financial value they can add in terms of new revenue opportunities, process efficiencies, cost reductions, risk mitigation, etc. Monetizing these data sources is the key to unlocking the potential in the big data era; and Minimizing Risk – Many organizations do not fully quantify the costs and risks associated with their corporate data. Denial of access to data, such as we recently saw with the global WannaCry cyberattack, is just one example of the risk inherent in underappreciating reliance on data. Data has both present and future value – and only once that value is fully understood can the risk be mitigated.

Agile Implementation from a Manager's Perspective

What is this strange role of an Agile Coach? On the board of the organization, a group of people appeared who were to carry the torch of agile education. They were to stimulate the agile development of software in teams, remove obstacles, talk to teams and their managers, and encourage them to think differently about the development of software and discourage the use of old methods. They started talking with my people, asking different questions over a cup of coffee or whatnot. They were sitting together with them in open spaces and observing what was happening. What were they talking about with my people? Or even worse, who were they informing, and about what? They sneaked from one meeting to another like a mysterious Agent Smith. What was going on? Why change something that works? I did not see the need to introduce Scrum, as my team was reaching the goals set for them without it. Better is an enemy of good ... do not touch it because you will get burnt.

AI, cybersecurity shape the CIO agenda for 2019 as IT budgets rise

Dynamism—the ability to embrace change and adopt technology—is the biggest predictor of digital transformation success, Mike Harris, executive vice president of research at Gartner, said during the opening keynote address at the Gartner Symposium/ITxpo. However, privacy is a top barrier to becoming dynamic. "If you don't successfully master privacy, your entire digital transformation is at risk," Harris said during the keynote. Businesses are increasingly scaling their digital efforts, the survey found: 33% of CIOs worldwide said they had evolved their digital endeavors to scale, up from 17% the year before. The major driver for scale is increasing consumer engagement through digital channels, the survey found. "The ability to support greater scale is being invested in and developed in three key areas: Volume, scope and agility. All aim at encouraging consumers to interact with the organization," Rowsell-Jones said in the release. 

Quote for the day:

"If you can't swallow your pride, you can't lead. Even the highest mountain had animals that step on it." - Jack Weatherford

Daily Tech Digest - October 20, 2018

Habits, it seems, get in the way of change despite our best intentions. “Habits are triggered without awareness — they are repeated actions that worked in the past in a given context or in a similar experience,” she notes. Wood’s research shows that concentrating on changing unwanted behaviors, and then creating new ones — not focusing on motivation — is the key to making change. She cites various efforts aimed at changing smoking habits in the U.S. from 1952 to 1999. Smoking decreased not when smokers were made aware of the health risks, but when buying and smoking cigarettes was made more difficult and less rewarding. Thus, higher taxes, smoking bans in public places, and limits on point-of-purchase ads — which add friction to smoking — were a more effective deterrent than warning labels on cigarette packages and public service advertising about smoking’s negative effects. A similar strategy of changing the context is possible in the workplace: Make old actions more difficult; make new, desired actions easier and more rewarding.

7 Ways A Collaboration System Could Wreck Your IT Security

Before an IT group blithely answers the call for a collaboration system – by which we mean groupware applications such as Slack, Microsoft Teams, and Webex Teams – it's important to consider the security risks these systems may bring. That's because the same traits that make these, and similar, applications so useful for team communications also make them vulnerable to a number of different security issues. From their flexibility for working with third-party applications, to the ease with which team members can sign in and share data, low transactional friction can easily translate to low barriers for hackers to clear. When selecting and deploying collaboration tools, an IT staff should be on the lookout for a number of first-line issues and be prepared to deal with them in system architecture, add-ons, or deployment. The key is to make sure that the benefits of collaboration outweigh the risks that can enter the enterprise alongside the software.

Apache Kafka: Ten Best Practices to Optimize Your Deployment

A running Apache ZooKeeper cluster is a key dependency for running Kafka. But when using ZooKeeper alongside Kafka, there are some important best practices to keep in mind. The number of ZooKeeper nodes should be maxed at five. One node is suitable for a dev environment, and three nodes are enough for most production Kafka clusters. While a large Kafka deployment may call for five ZooKeeper nodes to reduce latency, the load placed on nodes must be taken into consideration. With seven or more nodes synced and handling requests, the load becomes immense and performance might take a noticeable hit. Also note that recent versions of Kafka place a much lower load on Zookeeper than earlier versions, which used Zookeeper to store consumer offsets. Finally, as is true with Kafka’s hardware needs, provide ZooKeeper with the strongest network bandwidth possible. Using the best disks, storing logs separately, isolating the ZooKeeper process, and disabling swaps will also reduce latency.
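The sizing guidance above follows from ZooKeeper's majority-quorum rule: an ensemble of n nodes tolerates the loss of ⌊(n−1)/2⌋ nodes, so growing past five nodes adds write-acknowledgement load faster than it adds resilience. A minimal sketch (the function name is illustrative):

```python
def zookeeper_fault_tolerance(nodes: int) -> int:
    """Node failures a ZooKeeper ensemble can survive while a
    majority quorum remains available."""
    if nodes < 1:
        raise ValueError("ensemble needs at least one node")
    return (nodes - 1) // 2

# A 3-node ensemble tolerates 1 failure and a 5-node ensemble
# tolerates 2; going from 5 to 7 gains only 1 more tolerated
# failure, while every write must still be acknowledged by a
# majority (4 of 7 nodes).
```

This is also why even-sized ensembles are avoided: four nodes tolerate no more failures than three, but add a fourth machine to coordinate.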

The Evolution of Mobile Malware

Mobile malware isn’t just an opportunistic tactic for cybercriminals. Kaspersky Lab is also seeing its use as part of targeted, prolonged campaigns that can affect many victims. One of the most notable discoveries this year was Skygofree. It is one of the most advanced mobile implants that Kaspersky Lab has ever seen. It has been active since 2014, and was designed for targeted cyber-surveillance. It is spread through web pages, mimicking leading mobile network operators. This was high-end mobile malware that is very difficult to identify and block, and the developers behind Skygofree have clearly used this to their advantage: creating and evolving an implant that can spy extensively on targets without arousing suspicion. ... In recent times, rooting malware has been the biggest threat to Android users. These Trojans are difficult to detect, boast an array of capabilities, and have been very popular among cybercriminals. Once an attacker has root access, the door is open to do almost anything.

What is the CMO's Technology Strategy for 2019 and Beyond?

Even the CMOs that don’t have the technological background are becoming more tech savvy. Integrate CMO Vaughan said he considers himself and his colleague marketers technology investors, trying to manage a portfolio of tech to provide efficiency, effectiveness and unique capabilities for the company. “We view technology as an enabler of our strategy and an important part of advancing our marketing capabilities,” Vaughan said. “We have tried to be very disciplined about not buying tech for tech sake, which is not always easy to do today with so many options. We start with the strategy, what we are trying to accomplish and build a roadmap, including ROI and an adoption plan and model for each technology we evaluate.” Vaughan said CMOs should know what is available and at their disposal to differentiate and accelerate their strategy. “This does not mean you have to be a technology expert,” he said.

Privacy, Data, and the Consumer: What US Thinks About Sharing Data

Preventing data from being lost or stolen is the most obvious “table stake” for consumers. Just as important is the question of whether marketers should have it in the first place. This links clearly to the likes of GDPR in Europe, where the bar has been raised for all organizations around justification of the data they hold. But if we have the right data, for the right reasons, if we keep it safe and if we can make it more transparent how we’re using that data to provide a more respectful, personalized, fairer and rewarding service to the consumer, the trust will grow. Equally, we need to trust the consumer, again by providing transparent access to the data we hold, clarity around how we use it and the ability for them to control their data. Overall, the research shows that while consumers are rightly concerned about data privacy, they are also aware that data is an essential part of today’s economy, with 57% on average, globally, agreeing or strongly agreeing. Factor in the neutrals and around two-thirds of consumers are accepting or neutral around data use in today’s data-driven, data-enabled world.

NHS standards framework aims to set the bar for quality and efficiency

Although most of the standards in the framework aren’t necessarily new, they are “intended to be a clear articulation of what matters the most in our standards agenda, and is accompanied by a renewed commitment to their implementation,” said NHS Digital CEO Sarah Wilkinson in the framework’s foreword. Speaking at the UK Health Show on 25 September, Wilkinson said the potential for use of data in the NHS is huge, but the health service needs to get to grips with standards to reap the benefits.  Most of the standards in the framework, which is currently in beta form and out for consultation, are based on international ones; however, some are specialised for the NHS. This includes using the NHS number as a primary identifier – a standard which has been in place for a long time but has had mixed results in uptake. The framework said the standard “is live now and should be adhered to in full immediately”.

Open Banking has arrived, whether you like it or not

Australia has introduced Open Banking rules that will force the banks to share data with trusted Third-Party Providers (TPPs) by June 2019; Mexico has introduced a Fintech Law; South Korea and Singapore have enforced rules around financial data sharing between banks and third parties; and the USA has seen several banks innovating around open financial structures, although there is no law enforcing them to do this, yet. What intrigues me about the market movements is that some large financial players are taking a lead in this space, such as Citibank and Deutsche Bank’s open API markets, whilst some are resisting the change. I have heard several reports in the UK that the large banks have made data sharing incredibly difficult for the customer, by making the permissioning process very onerous and time-consuming. Equally, the implementation of European rules under PSD2 has seen several Fintech firms cry foul, as each bank creates its own interpretation, and therefore API interface, of the law.

How Data Changed the World

Running a city is always a challenging task. With Big Data, however, come new opportunities alongside new challenges. Instead of having to rely on surveys and manually tracking how people move throughout an area, cities can instead rely on sensor-derived data, providing far greater resolution and a pool of data to draw from orders of magnitude larger than ever before available. Many of these advances may seem a bit mundane at first; developing improved traffic routes, for example, is unlikely to garner many headlines. However, these changes lead to concrete improvements, saving travelers time and improving overall quality of life. Furthermore, Big Data-derived improvements can inform city planners when deciding which direction their cities will take in the future. Before launching large and expensive projects, city managers will be able to look at information gleaned from Big Data to determine what the long-term effects will be, potentially changing cities in fundamental ways.

Give REST a Rest with RSocket

An often-cited reason to use REST is that it’s easy to debug because it’s “human readable”. Not being easy to read is a tooling issue. JSON text is only human readable because there are tools that allow you to read it – otherwise it’s just bytes on a wire. Furthermore, half the time the data being sent around is either compressed or encrypted — both of which aren’t human readable. Besides, how much of this can a person “debug” by reading? If you have a service that averages a tiny 10 requests per second with a 1 kilobyte JSON payload, that is equivalent to roughly 860 megabytes of data a day, or 250 copies of War and Peace every day. There is no one who can read that, so you’re just wasting money. Then there is the case where you need to send binary data around, or you want to use a binary format instead of JSON. To do that, you must Base64 encode the data. This means that you essentially serialize the data twice — again, not an efficient way to use modern hardware.
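Both claims are easy to check with a little arithmetic. In this sketch (the function name is illustrative), the daily-volume figure follows from rate × payload × seconds per day, and the Base64 penalty from the encoding's 4-output-bytes-per-3-input-bytes expansion:

```python
import base64

def daily_volume_bytes(requests_per_second: float, payload_bytes: int) -> float:
    """Bytes transferred per day at a steady request rate."""
    return requests_per_second * payload_bytes * 86_400  # seconds per day

# 10 requests/second of 1 KB JSON is 864,000,000 bytes per day,
# in the same ballpark as the ~860 MB figure above.
volume = daily_volume_bytes(10, 1_000)

# Base64 encodes every 3 input bytes as 4 output characters,
# so binary payloads grow by roughly a third:
raw = bytes(1024)                    # 1 KiB of binary data
encoded = base64.b64encode(raw)      # 1368 bytes
overhead = len(encoded) / len(raw)   # ~1.34
```

The roughly 33% Base64 expansion is on top of the cost of serializing the payload in the first place, which is the "serialize twice" inefficiency the author describes.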

Quote for the day:

"Managers maintain an efficient status quo while leaders attack the status quo to create something new." -- Orrin Woodward

Daily Tech Digest - October 19, 2018

McAfee researchers uncover ‘significant’ espionage campaign

The researchers believe the new version could only have been created by having access to the original source code, which has been modified to make the malware better able to avoid detection. This behaviour is in line with other nation state operations, which tend to recycle and evolve code, the researchers said. According to the research report, Oceansalt was launched in five attack “waves” adapted to its targets. The first and second waves were spear phishing-based and began with a malicious Korean-language Microsoft Excel document created in May 2018 that acted as a downloader for the implant. The Excel document contained information that led McAfee researchers to believe targets were linked to South Korean public infrastructure projects. In all malicious documents, embedded macros were used to contact a download server and write the Oceansalt implant to disk. Once connected, the implant was designed to send the IP address and computer name of the targeted machine, as well as the file path of the implant.

Audits: The Missing Layer in Cybersecurity

When organizations are astute enough to turn to their audit teams for cybersecurity support, auditors must be prepared to deliver value, aligned to the speed of their business. Just as the businesses that auditors support are rapidly transforming, the audit groups must follow suit. This can be challenging, considering many IT auditors received much of their professional training many years ago, when the word cybersecurity did not command the attention it does today, and before transformative technologies such as artificial intelligence, connected Internet of Things devices, and cloud-based platforms were so prevalent and impactful. Here's the good news: There are many more educational and training resources available today than 20 years ago, when I began in IT audit. Despite time and budget constraints, it is incumbent upon auditors to pursue the appropriate training and credentialing to transform their organizations, refresh their skill sets, and obtain the auditing cybersecurity acumen needed to become integral to their organization's cyber programs.

Best new Windows 10 security features: More patching, updating flexibility

The Windows Defender Security Center has been renamed to simply Windows Security Center to better identify that it’s the main location for security information. Ransomware protection, first introduced in 1709, has been simplified to make it easier to add blocked applications to the interface. Click “Allow an app” through “Controlled folder access.” After the prompt, click the + button and choose “Recently blocked apps” to find the application that has been blocked by the protection. You can then build in an exclusion and add it to the allowed list. Because time syncing is key both to authentication and to obtaining updates, the Windows Time service is now monitored for being in sync with the proper time. Should the system sense that the time sync service is disabled, you will get a prompt to turn the service back on. A new security providers section exposes all the antivirus, firewall and web protection software that is running on your system. In 1809, Windows 10 requires antivirus software to run as a protected process in order to register.

Cloud Covered – Are You Insured?

We all know we need insurance, but what is the right coverage for me? Well, it really depends on what type of assets you are trying to protect and how your business would be impacted if something happened. If we think about our daily lives, imagine having 20 doors/windows wide open and then just locking or adding video surveillance to the one in the backyard (because your neighbor just told you he had been robbed the night before and that the thief broke into his house through the backyard door). Well, that’s a good start, however there are still 19 doors and windows wide open and vulnerable for anybody to access, right? Well, that’s pretty much what happens in IT, and only securing a few “doors” is called “black-listing”. Let me explain: every server has 65,535 TCP ports (and the same number of UDP ports). If we consider the black-listing approach, we may just close a few ports based on knowledge of common vulnerabilities. Most of the time, we don’t know which ports our apps need to work on, therefore we follow this approach and just block a few ports while permitting the rest of them.
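The gap between the two approaches can be sketched in a few lines of Python (the specific port numbers are illustrative, not a recommendation):

```python
ALL_PORTS = set(range(1, 65536))  # every possible TCP port number

# Black-listing: close only the ports with well-known vulnerabilities
# and leave everything else reachable.
BLACKLIST = {23, 135, 139, 445}  # e.g. telnet, RPC, NetBIOS, SMB
open_after_blacklist = ALL_PORTS - BLACKLIST

# White-listing: close everything, then open only what the app needs.
WHITELIST = {22, 443}  # e.g. SSH and HTTPS only
open_after_whitelist = WHITELIST

print(len(open_after_blacklist))  # 65531 ports still reachable
print(len(open_after_whitelist))  # 2 ports reachable
```

Blocking a handful of known-bad ports still leaves tens of thousands of "doors" open, which is exactly the author's point about black-listing.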

Is Venture Capital investment in AI Realistic or Out of Control?

There are a few reasons why this investment might be rational. Just as the Internet and mobile revolutions in the past decades fueled trillions of dollars of investment and productivity growth, AI-related technologies are promising the same benefits. So this is all rational: if AI is the truly transformative technology it promises to be, then all these investments will pay off as companies and individuals change their buying behaviors, business processes, and ways of interacting. No doubt AI is already creating so-called “unicorn” startups with over $1 billion in valuation. This could be justified if the AI markets are worth trillions. So, what is this money being used for? If you ask the founders of many of these AI companies what their gigantic rounds will be used for, you’ll hear things like geographic expansion, hiring, and expansion of their offerings, products, and services. As we’ve written about before, the difficulty in finding skilled AI talent is pushing salaries and bonuses to ridiculous heights.

20 innovative data centers that give us a glimpse into the future of computing

It is predicted that by 2025 data centers will consume one fifth of the Earth's total power. From cooling to lights to servers, there's no question that data centers eat up a lot of power. Recent news that climate change may be happening faster--and more severely--than initially believed makes traditional data center design, and its massive consumption of power, something that needs to be addressed. ... Project Natick is a Microsoft research endeavor that puts shipping container-sized pods filled to the brim with servers on the bottom of the ocean. The one active test machine currently in operation is just off the coast of Scotland, where Microsoft plans to leave it for up to five years for study. Project Natick servers require zero human interaction and are designed to remain in place for more than five years without the need for maintenance or repair. These servers can be powered by 100% renewable resources and emit zero emissions. According to Microsoft, "no waste products, whether due to the power generation, computers, or human maintainers are emitted into the environment."

Weighing the pros and cons of data security outsourcing

It’s nearly impossible to run a successful business operation in the current marketplace without taking IT seriously. The problem is that very few small and medium-sized businesses have the knowledge or skillset needed to properly manage each individual aspect of IT in-house. This is especially true when it comes to something like data security. One of the keys to running a successful business is being honest with yourself and recognizing what you don’t know. By identifying the areas where you come up short, you can take steps to compensate and overcome so that your business can thrive. One way you do this is through working with knowledgeable individuals that specialize in the areas where you’re deficient. Data security is a specific area where businesses often lack the internal knowledge and expertise to excel. It’s a particularly challenging aspect of IT that business leaders don’t have the time to master internally, so they go outside the company and outsource.

Review: Artificial Intelligence in 2018

Artificial Intelligence is not a buzzword anymore. As of 2018, it is a well-developed branch of Big Data analytics with multiple applications and active projects. Here is a brief review of the topic. AI is the umbrella term for various approaches to big data analysis, like machine learning models and deep learning networks. We have recently demystified the terms AI, ML and DL and the differences between them, so feel free to check that out. In short, AI algorithms are various data science mathematical models that help improve the outcome of a certain process or automate some routine task. However, the technology has now matured enough to move these data science advancements from the pilot-project phase to the stage of production-ready deployment at scale. Below is an overview of various aspects of AI technology adoption across the IT industry in 2018. ... AI algorithms have mostly surpassed the stage of pilot projects and are currently at various stages of company-wide adoption.

Why CIOs need to find ways to manage the old and the new

Research from Henley Business School and McKinsey shows that to be agile, businesses are choosing not to re-engineer legacy systems, said Manwani: “They either add another interface or do something totally separate.” But this is not a sustainable approach to managing digitisation initiatives, he said. “You can’t keep doing this. Without the engagement of an enterprise architect, businesses will reduce their agility in the medium term.” Just as restructuring of the IT department will never happen on its own, Manwani said: “You should not do major transformation piecemeal.” The enterprise architect's role is to present a coherent plan that can be used as a blueprint to underpin a digital transformation initiative, he added. “When we teach practitioners in the architecture space, it takes some time for them to absorb that they can, and should, engage in strategy development,” he said. “Preparing an architecture target state linked to the strategy is essential. This often requires new capabilities and mindsets.”

Should robots have rights?

California recently passed Senate Bill 1001, which bars companies and people from using bots that intentionally mislead those they are talking to into thinking they are human.  Putting aside the social and legal merits of this law, Bill 1001 implicates a hitherto-abstract, philosophical debate about when a simulation of intelligence crosses the line into sentience and becomes a true AI. Depending upon where you draw the line, this law is either discrimination against another form of sentient life or a timely remedy intended to protect users from exploitation by malevolent actors using human speech simulators. Alan Turing — the father of artificial intelligence but better known to the public for his role in breaking the German naval codes during World War II — foresaw the implications of his theories, which are still foundational to computer science, and was the first to enter this debate. He proposed his eponymous Turing test for artificial intelligence in 1950.

Quote for the day:

"A leader is best when people barely know he exists; when his work is done, his aim fulfilled, they will say: we did it ourselves." -- Lao Tzu

Daily Tech Digest - October 18, 2018

How Financial Institutions Can Put Risk Management Back in the Driver’s Seat

The benefits of putting the business clearly in charge of risk management — and holding it accountable — are significant. First, holding the first line accountable for risk management aligns the interests of internal revenue generators with those of the overall firm. When first-line salespeople do their own risk generation “driving,” they gain an understanding of their firm’s position and reputation that they otherwise might not get. They are thus less likely to try to on-board a questionable client or put together loan proposals that may be rejected. In general, the business needs to be clearly accountable for managing the risks it takes in pursuit of its objectives. A system of first-line front-seat drivers also encourages people to keep their eye out for risks wherever they pop up, rather than relying on the oversight specialist — the backseat driver — to point them out. This improves performance by allowing the business to spot some risks sooner, manage them more nimbly, and react more quickly when things do go wrong.

Too many business executives view the analytics transformation too narrowly: they tend to view it as centering on tools or being driven by the hiring of "big data" analysts or machine learning expertise. Almost without exception, every successful analytics transformation that I've seen or experienced has started at the top - this transformation is as much, perhaps even more, driven by a cultural change as it is by your hiring of new resources and technical expertise. Even today, however, too many business leaders and executives tend to be intimidated by analytics and math. My guidance to you (and you know who you are) is to get educated - fast. That does not mean that you have to get the equivalent of a master's degree in operations research or machine learning. It does mean seeking out an expert who can frame these capabilities in the language and vernacular of a senior executive. This is the most perilous part of the journey because too many senior executives delegate this transformation to lower levels of the organization, where it gets lost in a sea of other priorities.

The Future of the Cloud Depends on Magnetic Tape

Although the century-old technology has disappeared from most people’s daily view, magnetic tape lives on as the preferred medium for safely archiving critical cloud data in case, say, a software bug deletes thousands of Gmail messages, or a natural disaster wipes out some hard drives. The world’s electronic financial, health, and scientific records, collected on state-of-the-art cloud servers belonging to Amazon.com, Microsoft, Google, and others, are also typically recorded on tape around the same time they are created. Usually the companies keep one copy of each tape on-site, in a massive vault, and send a second copy to somebody like Iron Mountain. Unfortunately for the big tech companies, the number of tape manufacturers has shrunk over the past three years from six to just two—Sony Corp. and Fujifilm Holdings Corp.—and each seems to think that’s still one too many. The Japanese companies have said the tape business is a mere rounding error as far as they’re concerned, but each has spent millions of dollars arguing before the U.S. International Trade Commission to try to ban the other from importing tapes to America.

Solving the cloud infrastructure misconfiguration problem

The threats to cloud infrastructure are automated, so automated remediation is a requirement to effectively manage misconfiguration risk. His advice to CISOs is to set up a team that includes developers who understand cloud APIs and can automate every repetitive aspect of cloud security, starting with cloud configuration. “In order to be effective, the CISO needs to view their security team as an internal tool vendor in the cloud ecosystem. Development teams need support from security to move quickly, but also require good guard rails and feedback for how to do cloud securely,” he opines. “This security automation team led by the CISO needs to work closely with development teams to establish known-good configuration baselines using a whitelist approach that conforms with compliance and security policy. Once you have a known-good baseline, you can automate the remediation process for misconfiguration without running the risk of false positives leading to bad changes that can cause system downtime events.”
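The whitelist-baseline idea described above can be sketched as a small reconciliation loop. The configuration keys and values here are hypothetical stand-ins for whatever a real cloud provider's API would return:

```python
# Known-good baseline agreed between security and development teams
# (the whitelist approach: anything that differs is drift).
BASELINE = {
    "storage_bucket_public": False,
    "ssh_open_to_world": False,
    "encryption_at_rest": True,
}

def find_drift(observed: dict) -> dict:
    """Return the baseline values for every setting that has drifted."""
    return {k: BASELINE[k] for k in BASELINE if observed.get(k) != BASELINE[k]}

def remediate(observed: dict) -> dict:
    """Reset any drifted setting back to its known-good baseline value."""
    fixed = dict(observed)
    fixed.update(find_drift(observed))
    return fixed

# A misconfigured resource, e.g. a bucket accidentally made public:
observed = {
    "storage_bucket_public": True,
    "ssh_open_to_world": False,
    "encryption_at_rest": True,
}
print(find_drift(observed))  # {'storage_bucket_public': False}
print(remediate(observed) == BASELINE)  # True
```

Because remediation only ever restores values already approved in the baseline, the false-positive risk the CISO warns about (bad automated changes causing downtime) is bounded by the quality of the baseline itself.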

Quantum Computing: Why You Should Pay Attention

As databases continue to grow in size, this improvement can make it more feasible to handle the large volumes of data expected to come online in the coming years and decades as we reach physical limits in storage device latencies. Another practical advantage lies in our understanding of the world. Simulating quantum effects is notoriously difficult using the computers we rely on today, as the very fundamentals of quantum mechanics are vastly at odds with today’s devices. Using quantum computers, simulating these effects will be far simpler, allowing us to better unravel the mysteries of quantum mechanics. Even when quantum computing becomes common, it’s difficult to envision it completely replacing traditional computer devices. The types of applications at which quantum computers excel don’t seem to have much practical use for typical computer users. Furthermore, it will take some time for quantum computers to become smaller and more affordable, and there may be barriers preventing its widespread use.
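If the database speed-up the excerpt alludes to is Grover's quantum search (an assumption; the excerpt does not name the algorithm), the scaling difference is easy to tabulate: classical unstructured search examines on the order of N entries, while Grover's algorithm needs only on the order of the square root of N queries:

```python
import math

# Classical unstructured search: O(N) lookups.
# Grover's quantum search: O(sqrt(N)) queries.
for n in (10**6, 10**9, 10**12):
    print(f"{n:>15,} entries -> ~{math.isqrt(n):>9,} quantum queries")
```

For a trillion-entry database that is the difference between a trillion lookups and roughly a million queries, which is why quantum search is often cited as a practical motivation despite the hardware still being immature.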

Authentication Bypass in libSSH Leaves Servers Vulnerable

“Careful reading of code for the affected libSSH library indicated that it was possible to bypass authentication by presenting to the server an SSH2_MSG_USERAUTH_SUCCESS message in place of the SSH2_MSG_USERAUTH_REQUEST message which the server would expect to initiate authentication. The SSH2_MSG_USERAUTH_SUCCESS handler is intended only for communication from the server to the client,” an advisory by Peter Winter-Smith of NCC Group, who discovered the bug, says. In other words, an attacker can connect to any vulnerable server, without authentication, just by sending one message to the server. ... “Not all libSSH servers will necessarily be vulnerable to the authentication bypass; since the authentication bypass sets the internal libSSH state machine to authenticated without ever giving any registered authentication callbacks an opportunity to execute, servers developed using libSSH which maintain additional custom session state may fail to function correctly if a user is authenticated without this state being created,” the advisory says.

Learn Why Doctors Look To Data To Increase Patient Engagement

We’re positively swimming in data. But all that noise stands a good chance of confusing or distracting patients from their ultimate goal of ongoing good health if doctors and patients don’t come to the table together with a plan and a common understanding of which data points are meaningful in context and which are not.  There’s no doubt anymore: Big data is going to revolutionize the way we administer health care throughout the world and help us achieve financial savings. But as doctors look to leverage modern tools for interacting with and sharing patient health data, there are several factors to remember and several key advantages worth checking out. Here’s a rundown. Regrettably, we still lack a cure for many chronic diseases. Therefore, doctors and their patients must instead “manage” these conditions. It’s possible to live a full and active life while undergoing treatment for severe diseases and conditions, but only with the right levels of vigilance and engagement. Patients with chronic illnesses must maintain their motivation, their attention to treatment and medication schedules and their general knowledgeability about their condition.

Wärtsilä Opens World's First International Maritime Cyber Centre

“Cyber is such a critical topic to all players in marine. Taking stewardship in something as important as this, shows that Wärtsilä is committed to transform and digitalize the marine industry. This is the next step in our Smart Marine vision and supports our Oceanic Awakening and Sea20 initiatives,” says Marco Ryan, Chief Digital Officer at Wärtsilä. “There are three main drivers for the maritime industry to collaborate in improving our cyber resiliency: the vast attack surface that the maritime industry offers to cyber criminals; the inclusion of maritime into the critical national infrastructure of nation states and the pending cyber security regulation by the International Maritime Organisation in 2021,” says Mark Milford, Vice President, Cyber Security at Wärtsilä. The MCERT is an international cyber intelligence and incident support platform enhancing cyber resilience for the entire maritime ecosystem. It provides international intelligence feeds, advice and support, including real-time assistance to members on cyber attacks and incidents, and a Cyber Security Reporting Portal (CSRP) for its members.

Arm’s Neoverse will be the infrastructure for a trillion intelligent devices

Arm’s Neoverse intellectual property will take advantage of high-end manufacturing equipment in chip factories. The Ares platform will debut in 2019 with 30 percent per-generation performance improvements, Henry said. The designs will be flexible for customer purposes, and security will be a crucial part of the platform, he added. Ares will be built on seven-nanometer circuitry in the newest chip factories. Follow-up chips include the Zeus at seven nanometers and the Poseidon at five nanometers. The Arm Neoverse will include advanced processor designs as well as solutions and support for hardware, software, tools, and services. The company announced the platform at the Arm TechCon event in San Jose, California. “Arm has been more successful in infrastructure than many knew. This makes sense as many networking and storage systems have Arm-based chips inside, albeit with smaller cores,” said Patrick Moorhead, analyst at Moor Insights & Strategy, in an email.

What Innovative CEOs and Leaders Need to Know about AI

The authors view AI as “performing tasks, not entire jobs.” Of the 152 AI projects, 71 involved the automation of digital and physical tasks, 57 were using algorithms to identify patterns for business intelligence and analytics, and 24 were for engaging employees and customers through machine learning, intelligent agents, and chatbots. In the Harvard Business Review article, a 2017 Deloitte survey of 250 executives who were familiar with their companies’ AI initiatives revealed that 51 percent responded that the primary goal was to improve existing products. 47 percent identified integrating AI with existing processes and systems as a major obstacle. ... Early adopters of AI in the enterprise are reporting benefits: 83 percent indicated their companies have already achieved “moderate (53 percent) or substantial (30 percent)” economic benefits. 58 percent of respondents are using in-house resources rather than outside expertise to implement AI, and 58 percent are using AI software from vendors.

Quote for the day:

"Leadership is a potent combination of strategy and character. But if you must be without one, be without the strategy." -- Norman Schwarzkopf

Daily Tech Digest - October 17, 2018

Microsoft Surface Pro 6
This time around, the major changes are inside: a bump up in the processor to an 8th-generation Core chip, some weird adjustments in pricing, and a new color — black — separate the new from the old. There's actually a downgrade of sorts in the GPU compared to the Surface Pro (2017), which is a bit of a disappointment. The Performance section of our review shows the clearest differences among the three generations. We've given the Surface Pro 6 what some would consider an "average" score of 3.5 stars, a lower score than we've given some other tablet PCs we've reviewed recently. But we're also giving it an Editor's Choice, like those other products. Despite being underwhelmed by the Surface Pro 6's failure to break new ground (or even add USB-C), we will give it this: it has a nice, long 8.5 hours of battery life in our tests, which has been an Achilles heel for much of the competition we've reviewed. It is still one of the best-designed Windows tablets you can buy, and its pricing is competitive with similarly configured products.

AI Common Sense Reasoning

To focus this new effort, MCS will pursue two approaches for developing and evaluating different machine common sense services. The first approach will create computational models that learn from experience and mimic the core domains of cognition as defined by developmental psychology. This includes the domains of objects (intuitive physics), places (spatial navigation), and agents (intentional actors). Researchers will seek to develop systems that think and learn as humans do in the very early stages of development, leveraging advances in the field of cognitive development to provide empirical and theoretical guidance. “During the first few years of life, humans acquire the fundamental building blocks of intelligence and common sense,” said Gunning. “Developmental psychologists have found ways to map these cognitive capabilities across the developmental stages of a human’s early life, providing researchers with a set of targets and a strategy to mimic for developing a new foundation for machine common sense.”

Digital business projects in a quagmire? Hack your culture!

Changing mindsets is a key enabler of new technologies and one of the ways Gartner recommended that IT executives change the culture of their companies. “Hack your culture to change your culture,” said Kristin Moyer, research vice president and distinguished analyst at Gartner. “By culture hacking, we don’t mean finding a vulnerable point to break into a system. It’s about finding vulnerable points in your culture and turning them into real change that sticks.” Hacking is about doing smaller actions that usually get overlooked, Moyer said. Great hacks also trigger emotional responses, have immediate results and are visible to lots of people at once, she said. Gartner says culture is identified by 46 percent of CIOs as the largest barrier to getting the benefits of digital business. Achieving culture change is tied closely to another key direction organizations should strive for – the ability to embrace change and adopt technology in a new way, or what Gartner calls “dynamism.”

AI is fueling smarter collaboration

The first is improving the ability of individuals to access data. "Today, finding a document could be tedious [and] analyzing data may require writing a script or form," Lazar said. With AI, a user could perform a natural language query -- such as asking the Salesforce.com customer relationship management (CRM) platform to display third quarter projections and how they compare with the second quarter -- and generate a real-time report. Then, asking the platform to share this information with the user's team and get its feedback could launch a collaborative workspace, Lazar said. The second possible benefit is predictive. "The AI engine could anticipate needs or next steps, based on learning of past activities," Lazar said. "So if it knows that every Monday I have a staff call to review project tasks, it may have required information ready at my fingertips before the call. Perhaps it suggests things that I'll need to focus on, such as delays or anomalies from the past week."

Automation and employment debate takes a new turn

What gives machines -- and process automation -- the edge over humans? In addition to their ability to integrate data, machines, Levav noted, lack biases such as the illusion of validity, which leads people to overestimate their forecasting prowess. Yet, humans are still required in process automation, because only they can decide the important parameters, he added. "You will have a job because machines can't pick the variables that are relevant to a problem," he said. Scott Hartley, partner at venture capital firm Two Culture Capital, shared a similar view regarding the impact of AI on jobs. His take on AI-infused automation and employment takes a cue from Voltaire. Hartley's 2017 book, The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World, cites a statement attributed to the 18th century philosopher to support his view that asking the right questions about data is central to acquiring knowledge. Making AI and machine learning work, Hartley said during a UiPath panel discussion, is "still fundamentally rooted in our ability to create diverse teams and ask questions from a multiplicity of angles."

Strengthening the CIO – Corporate Board Relationship

But on a positive note, more board members see how technology is unlocking new business models and spurring growth. They are convinced of the growing need to focus on speed, agility, innovation, and customer obsession, and see that it requires new approaches to business operations and to IT investment. Technology and cybersecurity have historically been seen as compliance issues under the purview of the board’s audit committee. However, given the increasing capability of technology to affect revenue and the business model, there is a greater recognition of its strategic importance. This has led to an increase in the number of CIOs and other technology experts being appointed to boards. Still, though, the majority of boards lack the technology prowess needed to successfully guide today’s digital era company.  What, then, can the CIO do to bridge the gap and develop a great relationship with the board?

Steel yourself for the cloud hangover

We’re at what I call the hangover phase, where a night of cloud-hyped indulgence has led to many self-administered pats on the back, which obscured the reality that transitioning to the cloud is harder than people originally thought. But the effort is still worth it. The budget overruns are no surprise, given that not much cost planning takes place during initial large cloud computing projects. Indeed, these initial projects fail to illustrate the true costs of using a public cloud, and if you look carefully you can see that the private clouds many such initial efforts focus on are just new cages of servers in data centers that cost more than the old cages of servers. Moreover, people costs are always higher than expected, and few enterprises plan to run both cloud and on-premises systems—but the reality is that you need to. What troubled me is that only 48 percent of the mid-sized businesses and only 36 percent of the large enterprises agree that cloud actually improved the business. I suspect that those who do not see the value have yet to complete a project’s successful journey to the cloud. But still, this figure should be higher.

Five steps for getting started in machine learning: Top data scientists share their tips

"If someone has programming fundamentals then, from a technical point of view, I think that's enough for them to dive into machine learning," he says. "You're not gonna get very far if you can't program at all, because that's ultimately how you configure the machine-learning frameworks — through programming. "I think strong math was probably more essential before than it is now. It's certainly helpful to have mathematical knowledge if you want to develop custom layers or if you're really going very, very deep on a problem. But for people starting out, it's not critical." In some respects, it's just as important to have a willingness to seek out new information, says Yangqing Jia, director of engineering at Facebook. "As long as you keep an exploratory mindset there's such an abundance of tools nowadays you'll be able to learn a lot of things yourself, and you have to learn things yourself because the field is growing really fast."
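To illustrate that programming fundamentals really are enough to get started, here is a complete, if toy, machine-learning classifier: a 1-nearest-neighbour model in plain Python, no framework required. The data points are invented for the example:

```python
def predict(train, label_of, point):
    """Return the label of the training point closest to `point`
    (1-nearest-neighbour, using squared Euclidean distance)."""
    nearest = min(train,
                  key=lambda t: sum((a - b) ** 2 for a, b in zip(t, point)))
    return label_of[nearest]

# A tiny labelled training set: two clusters in 2-D space.
train = [(1.0, 1.0), (1.2, 0.8), (8.0, 9.0), (9.0, 8.5)]
label_of = {(1.0, 1.0): "small", (1.2, 0.8): "small",
            (8.0, 9.0): "large", (9.0, 8.5): "large"}

print(predict(train, label_of, (1.1, 0.9)))  # small
print(predict(train, label_of, (8.5, 9.2)))  # large
```

A real project would reach for one of the frameworks the panelists mention, but the core idea — fit to labelled examples, then classify new points — fits in a dozen lines.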

Researchers expose security vulnerabilities in terahertz data links

“In microwave communications, an eavesdropper can put an antenna just about anywhere in the broadcast cone and pick up the signal without interfering with the intended receiver,” Mittleman said. “Assuming that the attacker can decode that signal, they can then eavesdrop without being detected. But in terahertz networks, the narrow beams would mean that an eavesdropper would have to place the antenna between the transmitter and receiver. The thought was that there would be no way to do that without blocking some or all of the signal, which would make an eavesdropping attempt easily detectable by the intended receiver.” Mittleman and colleagues from Brown, Rice University and the University at Buffalo set out to test that notion. They set up a direct line-of-sight terahertz data link between a transmitter and receiver, and experimented with devices capable of intercepting the signal. They were able to show several strategies that could steal the signal without being detected — even when the data-carrying beam is very directional, with a cone angle of less than 2 degrees.

The Three Dimensions of the Threat Intelligence Scale Problem

There is a massive amount of external TI that organizations can access to improve cyber defense. While cost can be a constraint for expensive commercial threat feeds, there are plenty of lower-cost and even free threat feeds available from open source, government, and industry sources. While access to external TI is not an issue, the scale problem lies in managing, maintaining, and making effective use of TI. Some of these challenges include: managing multiple threat feeds that come in different formats; ensuring your threat feeds are constantly up to date; and integrating TI into your security operations so that you can use it to improve security. The process of integrating TI into security operations is particularly interesting because it leads directly into another dimension of the network security TI scale problem. While organizations can turn to external TI to make up for the lack of access that a next-generation firewall provides, this same limitation hits you on the other side by hindering your ability to take action based on external TI. It's like a double firewall TI whammy!
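The "multiple formats" problem can be sketched with a small feed normalizer. The feed layouts and field names below are hypothetical simplifications of what real feeds (CSV dumps, JSON APIs, STIX bundles, and so on) actually contain:

```python
import csv
import io
import json

def from_csv(text):
    """Hypothetical feed A: CSV with `indicator` and `type` columns."""
    return [{"indicator": row["indicator"], "type": row["type"]}
            for row in csv.DictReader(io.StringIO(text))]

def from_json(text):
    """Hypothetical feed B: JSON list using different field names."""
    return [{"indicator": item["ioc"], "type": item["kind"]}
            for item in json.loads(text)]

feed_a = "indicator,type\n203.0.113.7,ip\nevil.example,domain\n"
feed_b = '[{"ioc": "198.51.100.2", "kind": "ip"}]'

# Merge both feeds into one de-duplicated, normalized indicator set.
merged = {(e["indicator"], e["type"])
          for e in from_csv(feed_a) + from_json(feed_b)}
print(sorted(merged))
```

Each new feed only needs its own small parser into the common shape; everything downstream — deduplication, aging out stale indicators, pushing blocks to enforcement points — can then operate on one format.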

Quote for the day:

“The only thing worse than training your employees and having them leave is not training them and having them stay.” -- Henry Ford