Daily Tech Digest - June 15, 2019

This is likely the No. 1 thing affecting your job performance


To improve your expertise, you must first identify gaps in your knowledge. You aren’t likely to be motivated to learn new things–nor can you be strategic about learning–if you’re not aware of what you do and don’t know. Without a good map of the existing state of your knowledge, you’ll bump into crucial new knowledge only by chance. ... The ability to know what you know and what you don’t know is called metacognition—that is, the process of thinking about your thinking. Your cognitive brain has a sophisticated ability to assess what you do and don’t know. You use several sources of information to make this judgment. Research by Roddy Roediger and Kathleen McDermott identified two significant sources of your judgments about whether you know something: memory and familiarity. If I ask you whether you’ve heard of Stephen Hawking, you start by trying to pull information about him from your memory. If you recall explicitly that he was a famous physicist or that he worked on black holes and had ALS, then you can judge that you’ve heard of him.



Fintech CEOs bullish on blockchain tech, give thumbs down on applications


While cryptocurrency received something of a reprieve, financial services executives this week expressed doubts about the current applications for blockchain and other distributed ledger technology. “There’s too much hype around blockchain,” said Rishi Khosla, CEO and co-founder of U.K.-based challenger bank OakNorth. “For the practicality of what’s actually been delivered so far, it is way overrated. I do believe that blockchain has a place in lending, especially when you think about sort of the whole ‘perfecting security’ process. It just requires so much changing of the plumbing.” Still, some nodded favorably toward the technology’s potential impact on the industry. Securities and Exchange Commission Commissioner Robert Jackson said blockchain technology can both shorten the time and lower the expense of clearing and settling trades. He also pointed to potential use cases in auditing, smart contracts, and tracking and dealing with fraud.


Blockchain: A Boon for Cyber Security


Blockchain technology has impacted the cyber security industry in a few ways. HYPR Corp is a New York-based company that provides enterprises with decentralised authentication solutions, which enable consumers and employees to securely and seamlessly access mobile, Web and Internet of Things (IoT) applications. It uses blockchain technology to decentralise credentials and biometric data to facilitate risk-based authentication, and it invested US$10 million in this platform in 2018. NuCypher is another blockchain security company, which applies proxy re-encryption to distributed blockchain systems. It also offers an access control platform and uses public-key encryption to securely transfer data and enforce access requirements. Blockchain is one of the biggest tech buzzwords of the last few years, and the technology is being marketed as a cure for everything, including cyber security. Japan's Ministry of Internal Affairs and Communications implemented a blockchain-based system for processing government tenders in March 2018.
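
To make the public-key idea above concrete, here is a minimal sketch of encrypting data for a single recipient with the Python cryptography library. It is a generic RSA/OAEP example, not NuCypher's proxy re-encryption scheme or HYPR's platform; the payload and key size are illustrative assumptions.

```python
# Generic public-key encryption sketch (RSA/OAEP) -- NOT NuCypher's proxy
# re-encryption, just an illustration of transferring data so that only the
# holder of the matching private key can read it.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Recipient generates a key pair and publishes the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

# Sender encrypts with the recipient's public key; decryption requires the
# private key, which is how access to the data is enforced.
ciphertext = public_key.encrypt(b"credential payload", oaep)
plaintext = private_key.decrypt(ciphertext, oaep)
assert plaintext == b"credential payload"
```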



Sensory Overload: Filtering Out Cybersecurity's Noise

A good security process is extremely valuable. Regardless of the task at hand, process brings order to the chaos and minimizes the redundancy, inefficiency, and human error resulting from lack of process. On the other hand, a bad security process can have exactly the opposite effect. Processes should help and improve the security function. In order to do so, they need to be precise, accurate, and efficient. If they aren't, they should be improved by filtering out the noise and boiling them down to their essence. It's far too easy to get distracted by every new security fad that comes our way. Once in a while, an item du jour becomes something that needs to be on our radar. But most of the time, fads come and go and seldom improve our security posture. Worse, they can pull us away from the important activities that do. Many of us don't know exactly what logs and event data we will or will not need when crunch time comes. As a result, we collect everything we can get our hands on. We fill up our available storage, shortening retention and impeding performance, although we may never need 80% of what we're collecting.
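
As a rough illustration of filtering event data down to its essence before it fills up storage, here is a toy sketch; the field names, noise sources, and severity threshold are assumptions for illustration, not taken from the article or any particular SIEM product.

```python
# Toy sketch of "filtering out the noise" before events reach long-term
# retention. All names and thresholds below are illustrative assumptions.
NOISY_SOURCES = {"debug-probe", "heartbeat"}   # assumed low-value sources
MIN_SEVERITY = 4                               # assumed 0-10 severity scale

def worth_retaining(event: dict) -> bool:
    """Keep only events likely to matter at crunch time."""
    if event.get("source") in NOISY_SOURCES:
        return False
    return event.get("severity", 0) >= MIN_SEVERITY

events = [
    {"source": "heartbeat", "severity": 1, "msg": "ping"},
    {"source": "auth", "severity": 7, "msg": "repeated failed logins"},
]
retained = [e for e in events if worth_retaining(e)]
print(retained)  # only the auth event survives the filter
```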


The Next Big Privacy Hurdle? Teaching AI To Forget


The lack of debate on what data collection and analysis will mean for kids coming of age in an AI-driven world leaves us to imagine its implications for the future. Mistakes, accidents, teachable moments—this is how children learn in the physical world. But in the digital world, when every click, view, interaction, engagement, and purchase is recorded, collected, shared, and analyzed through the AI behemoth, can algorithms recognize a mistake and understand remorse? Or will bad behavior be compounded by algorithms that are nudging our every action and decision for their own purposes? What makes this even more serious is that the massive amount of data we’re feeding these algorithms has enabled them to make decisions experientially or intuitively like humans. This is a huge break from the past, in which computers would simply execute human-written instructions. Now, advanced AI systems can analyze the data they’ve internalized in order to arrive at a solution that humans may not even be able to understand—meaning that many AI systems have become “black boxes,” even to the developers who built them, and it may be impossible to reason about how an algorithm arrived at a particular decision.


How To Choose The Right Approach To Change Management

Our analysis shows that when you aggregate all the stages in the most popular OCM change models into a 10-stage process, none of them really covers all the bases. In fact, the analysis shows that if you choose one of these models you are likely to miss around 40 per cent of the steps suggested by other models. The analysis also shows that the biggest gap in popular change models is in ‘Assessing the Opportunity or Problem Motivating the Change’ – arguably the most critical step in OCM. ... So where do we turn when there is no real evidence to support popular change management models? Lewin did build an evidence base for a different approach to OCM. Rather than a planned approach to change, Lewin argues for a more emergent approach. He suggests that groups or organisations are in a continual process of adaptation – there is no freezing or unfreezing. So, what are the critical success factors for creating an organisational culture that can purposefully adapt to changing environments whilst maintaining current operations?



In the drive to improve customer experience, Marketing needs to develop a single customer view, which will allow extremely targeted marketing. It does not help if copious social and historic shopping data is collated and used to build a customer persona if the customer's mobile number or email address was captured incorrectly. Likewise, duplicate records and "decayed" (out of date) data create annoyances both for the customer and for the marketing department. Much research has gone into why data is inaccurate, and the same answer is always found: it is due to human error. While human error can create the initial quality issue, for instance, when customer information is being loaded by one of the company's employees, benign neglect is also a contributor. Periodic reviews of whether customer contact details have changed are required, as well as scrupulous attention to returned emails and failed SMS messaging experienced during a marketing campaign. It is interesting to note that "Inadequate senior management support" is given as a challenge by 21% of the respondents.
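
Two of the data-quality problems mentioned here, badly captured contact details and duplicate records, can be caught with simple automated checks. The sketch below is a hedged illustration; the regex, field names, and matching rule are simplifying assumptions rather than anything from the research cited.

```python
# Hedged sketch of two basic data-quality checks: a rough syntactic email
# check and a naive de-duplication of customer records by email address.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def is_valid_email(value: str) -> bool:
    """Very rough check -- a real pipeline would also verify deliverability."""
    return bool(EMAIL_RE.match(value or ""))

def dedupe(customers: list[dict]) -> list[dict]:
    """Keep the first record seen per lower-cased email address."""
    seen, unique = set(), []
    for c in customers:
        key = (c.get("email") or "").strip().lower()
        if key and key not in seen:
            seen.add(key)
            unique.append(c)
    return unique

records = [
    {"name": "A. Smith", "email": "a.smith@example.com"},
    {"name": "Alice Smith", "email": "A.Smith@example.com"},  # duplicate
    {"name": "B. Jones", "email": "b.jones@example"},         # decayed/invalid
]
clean = [r for r in dedupe(records) if is_valid_email(r["email"])]
print(clean)  # only the first Smith record remains
```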


How Do We Think About Transactions in (Cloud) Messaging Systems?

The baseline that we need to come from is that everything's interconnected with everything else, and users are going to expect to connect with their data and to collaborate with other users on any set of data in real time across the globe. Messaging systems were introduced as a way of providing some element of reliable message passing over longer distances. Consider the scenario where you're transferring money from one account to another. There isn't the possibility, nor is there the desire, for any bank to lock records inside the databases of any other bank around the planet. So messaging was introduced as a temporary place that's not in your database or in my database. And then we can move the money around through these highly reliable pipes. And each step of the journey can be a transaction: from my database to an outgoing queue, from my outgoing queue to an intermediary queue, from one intermediary queue to another intermediary queue, from there to your incoming queue, and from your incoming queue to your database. As long as each one of those steps was reliable and transactional, the whole process could be guaranteed to be safe from a business perspective.
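
The first hop described above, "from my database to an outgoing queue", is often implemented as a transactional outbox: the debit and the outgoing message are written in one local transaction, so the hop is atomic. The sketch below illustrates that single hop with SQLite; the outbox-table modelling, table names, and payload are illustrative assumptions, not details from the article.

```python
# Minimal sketch of one transactional hop: debit the account and record the
# outgoing transfer message in the SAME local transaction. Table and column
# names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect("bank.db")
conn.execute("CREATE TABLE IF NOT EXISTS accounts (id TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("CREATE TABLE IF NOT EXISTS outbox (msg_id INTEGER PRIMARY KEY AUTOINCREMENT, payload TEXT, sent INTEGER DEFAULT 0)")
conn.execute("INSERT OR IGNORE INTO accounts VALUES ('alice', 1000)")
conn.commit()

def debit_and_enqueue(account_id: str, amount: int, payload: str) -> None:
    """Debit the account and stage the outgoing message atomically."""
    with conn:  # sqlite3 wraps this block in a transaction: commit or roll back together
        cur = conn.execute(
            "UPDATE accounts SET balance = balance - ? WHERE id = ? AND balance >= ?",
            (amount, account_id, amount),
        )
        if cur.rowcount == 0:
            raise ValueError("insufficient funds or unknown account")
        conn.execute("INSERT INTO outbox (payload) VALUES (?)", (payload,))

debit_and_enqueue("alice", 250, '{"to": "bob", "amount": 250}')
# A separate relay would later read unsent outbox rows, publish them to the
# intermediary queue, and mark them sent -- the next transactional hop.
```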


Identity Is Not The New Cybersecurity Perimeter -- It's The Very Core

The phrase “identity is the new perimeter” suggests that security perimeters are still effective in a cloud-native world -- and they most certainly are not. I often like to say, “If identity is the new perimeter, then Bob in accounting is the new Port 80.” In this new cloud-first world, all a hacker needs to do is get one person in an organization to click a link and it's game over. With the compromised employee’s credentials in hand, they can walk right through your defenses undetected and rob you blind. For true security in the cloud, identity needs to move to the very core of a company’s cybersecurity apparatus. That’s because when there is no more perimeter, only identity can serve as the primary control for security. As advocates of zero trust security (myself included) advise, “Don’t trust, verify.” How do you do it? Making the transition to a security model that places identity at the center involves a cultural shift that spans a company’s people, processes and technology. Here are key insights on how to get started, based on 15 years of experience helping companies turn the corner on identity-based security.
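
As a minimal illustration of “don’t trust, verify”, the sketch below checks a signed identity token on every request before any work is done, rather than trusting anything that made it past a network boundary. PyJWT, the shared secret, and the claim names are assumptions for illustration, not from the article.

```python
# "Don't trust, verify" sketch: every request carries a signed identity token
# that is verified (signature + expiry) before the request is handled.
# The secret and claim names are illustrative assumptions.
import datetime
import jwt  # pip install PyJWT

SECRET = "demo-secret-not-for-production"

def issue_token(user: str) -> str:
    exp = datetime.datetime.utcnow() + datetime.timedelta(minutes=15)
    return jwt.encode({"sub": user, "exp": exp}, SECRET, algorithm="HS256")

def handle_request(token: str, action: str) -> str:
    try:
        claims = jwt.decode(token, SECRET, algorithms=["HS256"])  # verifies signature and expiry
    except jwt.PyJWTError:
        return "403: identity not verified"
    return f"ok: {claims['sub']} may attempt '{action}' (subject to authorization checks)"

print(handle_request(issue_token("bob-in-accounting"), "read-invoices"))
print(handle_request("tampered.token.value", "read-invoices"))
```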



Developing and Managing Change Strategies with Enterprise Architecture

The reality of most enterprises with IT portfolios consisting of more than 100 IT applications is that some combination of the replacement options is technically feasible and, given the right approach, perhaps even cost-effective. And by using LeanIX, Enterprise Architects and their stakeholders can leverage collaborative mechanisms and live data to quickly evaluate technologies to see which mixture of SQL Server 2008/2008 R2 alternatives matches specific business strategies, and then govern the transformation projects thereafter. By linking Business Capabilities to applications, and linking those applications to technology components like SQL Server, Enterprise Architects can review Business Capability maps within LeanIX Reports, such as the Application Matrix, to align improvements with essential organizational processes. In particular, alongside a series of configurable views like “Technology Risk” and “Lifecycle”, an Application Matrix Report shows Business Capabilities and their supporting technologies across geographical user groups, helping Enterprise Architects base decisions on overlapping business needs.



Quote for the day:


"A leadership disposition guides you to take the path of most resistance and turn it into the path of least resistance." -- Dov Seidman

