When Apple revoked Facebook's enterprise certificate, the revocation also broke all of Facebook's other internal employee apps. The side effect was likely unintentional, since a single certificate signed all of the apps, but it still had a far greater impact than any regulator has managed on a privacy-related issue. The two companies, however, were working to restore Facebook's ability to use internal apps. Could Apple take this further and use its power in the mobile OS market to bring Facebook in line with evolving privacy wisdom? It's an idea floated in a column by Kevin Roose in the New York Times on Thursday. Apple could boot all of Facebook's apps - Instagram and WhatsApp included - with a few digital certificate revocations, a power no regulator has. There are all kinds of obvious problems with this, of course. A multi-billion-dollar tech company shouldn't be taking up the slack for governments that are failing to protect consumers' privacy rights. Apple can take a strong privacy stance because it has little stake in the personal-data trade. That could change, of course, depending on how Apple's business interests shift.
Organizations first need to define the right balance between stability and innovation, starting with an evaluation of their environment from a regulatory and compliance perspective in order to determine where the most critical audit risks lie. Because every sector is different, the requirements can range from securing the crown jewels (personal data, intellectual property, etc.) and responding to regulatory and governing bodies to ensuring the high availability of customer-facing applications. This risk assessment must then feed into an overall technology strategy meant to address the most critical issues, starting with basic IT controls and extending to the enhancement of monitoring and reporting. The organization's objective should be to enable an approach where traditional auditing of its systems (Audit of IT) is no longer the norm. ... Once all of these IT controls have been addressed and strong, effective processes are in place, organizations can focus on innovating and making more intelligent use of technology, like advanced analytics or control automation, for audit and insights.
Within the past two years we’ve seen digital transformation dramatically impact businesses around the world. In the realm of enterprise automation, we saw artificial intelligence emerge as an impactful technology promising significant automation gains – in 2017 AI gained traction in the IT department, and during 2018 we saw its entry into boardrooms. In 2019, we will see AI combined with mobility and content understanding change entire workflows and processes within organizations, offices and factory floors. This requires transformation not only in IT strategy, but also in skills, recruitment and corporate attitudes. One of the most impactful technology trends we will see in 2019 is a broad enterprise transition to self-service models. ... Corporate users are now eager to let robots take over. Once the panic about robots taking over their jobs has subsided, business users will initiate opportunities for digital workers to handle mundane, repetitive, manual tasks so they can focus on more cognitive and strategic work.
According to Gartner, the buyers’ ranking was significantly impacted by consolidation in the smartphone and PC markets. The big Chinese smartphone OEMs, in particular, have increased their market dominance by taking out or purchasing competitors. This also meant that the top OEMs’ spending increased, with their share reaching 40.2% of the total semiconductor market in 2018, up from 39.4% in 2017. This trend is expected to continue, which will make it more difficult for semiconductor vendors to maintain high margins. ... Memory prices also impacted the market. Despite being high for the past two years, the dynamic random-access memory (DRAM) average selling price (ASP) is now declining. However, the impact is limited, as OEMs will increase their memory content when the ASP drops and also invest in premium models. Gartner predicts that memory chips' share of total semiconductor market revenue will be 33% in 2019 and 34% in 2020, higher than their 31% share in 2017.
The need to improve industry response to cybersecurity risk is well established. Each new incident report increases awareness of the risks faced. The nature of the risks continues to evolve as new vulnerabilities and threats are discovered. Beginning with the Stuxnet attack in 2010, a steady stream of incidents has shown that industrial systems are vulnerable to both general and targeted attacks. While they may accept the need to better protect their critical systems, many asset owners struggle to understand what type of guidance information is available and how industry standards can help them formulate their response. The number and variety of available standards and related sources, along with the complexity of the topic, add to the confusion. An effective response to the threat must address all phases of the life cycle, from conception and selection through operations and support. Established standards reflect this need and provide requirements for all involved, from suppliers and integrators to asset owners and support providers.
Does this disruptive and revolutionary change in IT have an equivalent in the way that power is distributed and managed? Power is, after all, not so dissimilar to the flow of bits: it is a flow of electrons, which can be stored, generated and consumed, is distributed over a network, and is switchable and routable, and therefore manageable, from a distance. A follow-on question is: Even if power can be managed in this way, when is it safe, economic and useful to do so, given the many extra complexities and expenses involved in managing power? The answer to both questions is not quite as binary or as definitive as at the IT level, where the impact of software-defined approaches has been immediate, significant and clearly disruptive. While the application of more intelligence and automation is clearly useful in almost all markets, the opportunities for innovation in power distribution are much less clear. Data centers are a stand-out example: Large facilities present a great opportunity for suppliers, because of the size and growth of the market, vast over-provisioning, high costs, and inflexibility in power distribution and use.
The Bangladesh Bank heist also highlighted operational weaknesses at the New York Fed, a 2016 investigation by Reuters found (see: Report: New York Fed Fumbled Cyber-Heist Response). Notably, attackers timed their attack to occur on the evening of Feb. 4, 2016, a Thursday, which was the day before the weekend begins in Bangladesh. They also used malware that suppressed printouts of the messages of concern that New York Fed officials sent after they saw suspicious transactions. When Bangladesh Bank officials spotted the activity on Saturday, Feb. 6, they attempted to contact the Fed via email, sending a message that read: "Our system has been hacked. Please stop all payment (debit) instructions immediately," according to the report. But the New York Fed reportedly didn't receive the message until the start of its workday on Monday morning, and it didn't inform Bangladesh Bank that it had alerted correspondent banks to the fraud until Monday evening, New York time.
It just seems that as the number of data sources grows, connectivity becomes harder, with the user having to know how to get at all the data he or she needs. Going back to IoT for a second, consider if you walk into the office in the morning and there are 1,000 more devices on the network than there were yesterday. What happens if you want access to the data on those new devices? Is the user expected to just know how to connect? Are we not just pushing more and more complexity onto the user? Can we not simplify it? Could it not be more dynamic? Or is it that we will continue to see software like BI tools lengthening their lists of connectors to data sources, release by release? It seems to me that we need to hide the connectors to all these data sources and connect to data at a higher level of abstraction. There are multiple ways of doing that. For example, you could use data virtualisation. Alternatively, we could look for advances in the data connectivity area itself – something that does not seem to have changed much in a couple of decades. If we look at the latter, i.e. smarter connectivity, then what are we asking for?
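One way to picture "connecting to data at a higher level of abstraction" is a catalog that maps logical dataset names to connectors, so users ask for data by name instead of knowing each source's connection details. The sketch below is a minimal, hypothetical illustration (all class and method names are invented for this example, not from any product); a real data-virtualisation layer would wrap JDBC, REST, MQTT and similar protocols behind the same interface:

```python
# Hypothetical sketch: a catalog that hides per-source connectors
# behind logical dataset names, so new devices can register themselves
# and users never deal with raw connection details.
from abc import ABC, abstractmethod


class Connector(ABC):
    """One concrete connector per physical source (database, device, file)."""

    @abstractmethod
    def fetch(self, query: str) -> list[dict]:
        """Return rows matching the query."""


class InMemoryConnector(Connector):
    """Stand-in for a real source; production versions would wrap JDBC, REST, MQTT, etc."""

    def __init__(self, rows: list[dict]):
        self.rows = rows

    def fetch(self, query: str) -> list[dict]:
        # Naive substring match over row values, just for illustration.
        return [r for r in self.rows if query in str(r.values())]


class DataCatalog:
    """The abstraction layer: users address logical names, never raw connections."""

    def __init__(self):
        self._sources: dict[str, Connector] = {}

    def register(self, name: str, connector: Connector) -> None:
        # A newly discovered device (or its gateway) registers here dynamically,
        # so "1,000 new devices overnight" needs no per-user configuration.
        self._sources[name] = connector

    def query(self, name: str, query: str) -> list[dict]:
        return self._sources[name].fetch(query)


catalog = DataCatalog()
catalog.register("sensor-42", InMemoryConnector([{"temp": "21C"}, {"temp": "25C"}]))
print(catalog.query("sensor-42", "21C"))  # [{'temp': '21C'}]
```

The point of the sketch is the indirection: BI tools would query the catalog, and adding a source means registering one connector rather than lengthening every tool's connector list.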
We have seen organizations jump to acquire the latest cutting-edge technology in the hopes of capitalizing on data opportunities. But often the efforts are, again, myopic, driven by a single department, and not connected to the organization’s broader strategy. As a direct result of failing to coordinate with other parts of the business, organizations often miss the opportunity to harness existing data to develop a deeper understanding of their customers and insights related to their business. In fact, the Katzenbach Center’s 2018 Global Culture Survey reveals that the biggest challenge to culture evolution is having certain areas of an organization more on board with an initiative than others. The solution is instead to bring in members of diverse teams throughout the organization to share the current state of how things actually work and to make holistic decisions about how to move forward to best capitalize on data. This way, every department feels heard and is part of the process.
In the event of "no deal," there also won't be an "adequacy" agreement in place. Such an agreement would mean that the U.K.'s laws are seen by the EU as being good enough to comply with European law. "Companies and organizations operating within countries with adequacy agreements enjoy uninterrupted flow of personal data with the EU," Denham says. Unfortunately, no withdrawal agreement also means no adequacy agreement, at least not right away. "An assessment of adequacy can only take place once the U.K. has left the EU," she says. "These assessments and negotiations have usually taken many months," she says. "Until an adequacy decision is in place, businesses will need a specific legal transfer arrangement in place for transfers of personal data from the EEA to the U.K., such as standard contractual clauses." ... EU data protection authorities can impose fines of up to €20 million ($23 million) or 4 percent of an organization's annual global revenue - whichever is greater - on any organization found to have violated GDPR. Regulators can also revoke an organization's ability to process individuals' personal data.
Quote for the day:
"The quality of a leader is reflected in the standards they set for themselves." -- Ray Kroc