Daily Tech Digest - April 13, 2021

19 Realistic Habits To Improve Software Development

When you finish writing a fragment of code and see that it works, take some time to reread it and see if you can improve it. Imagine that you are going to show it to someone else who will evaluate your code. Would you leave it the same? One of the best code refactoring techniques is the red/green process used in Agile test-driven development. To use this technique, your code must be covered with tests. If something breaks while refactoring, the test will not pass, and you will be aware that something is wrong with your refactor. ... Plan a time interval without distractions or interruptions. Interruptions will make your mind lose track of what it is developing, and you will have to start again when you resume the activity, which will cost you extra work time and make you more prone to making mistakes. It works to leave only the IDE open and a browser with a maximum of two tabs. ... Don’t try to write clever code that only you understand. Write code that someone else can read and understand. It doesn’t matter if your code has a few more lines if they’re necessary to make it easier to understand. Remember that in a few months, you or someone else on your team may have to modify the code, and if it is not easy to understand, it will not be easy to modify.
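
A minimal sketch of that red/green safety net (the function and test names below are hypothetical): a unit test pins the expected behavior down before you refactor, so a failing run immediately flags a broken refactor.

```python
import unittest

def total_price(items, discount=0.0):
    """Sum item prices and apply a fractional discount (e.g. 0.1 == 10%)."""
    subtotal = sum(price for _, price in items)
    return round(subtotal * (1 - discount), 2)

class TotalPriceTest(unittest.TestCase):
    def test_discount_applied(self):
        items = [("book", 20.0), ("pen", 5.0)]
        # Green: this expectation must keep passing after any refactor.
        self.assertEqual(total_price(items, discount=0.1), 22.5)

if __name__ == "__main__":
    # Run the tests, refactor total_price, then run them again:
    # a failing ("red") result signals the refactor changed behavior.
    unittest.main()
```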


Clear & Present Danger: Data Hoarding Undermines Better Security

Even though there is overlap between the users of big companies' services and the customers of small businesses, the big companies aren't sharing their data. As a result, customers who use smaller businesses are left to fend for themselves. A few companies are trying to change that. Deduce (disclosure, another company I've consulted for) created a data collective through which companies can share information about users' security-related behavior and logins. In exchange for sharing data with the platform, companies get access to Deduce's repository of identity data from over 150,000 websites. They can use this shared data to better detect suspicious activity and alert their users, just like Microsoft and Google do using their own data. In a different approach to helping businesses identify suspicious users, LexisNexis created unique identifiers for their clients' customers. Using these identifiers, their clients can share trust scores that indicate if a particular user is suspicious. If a suspicious user attempts to log in to a website, the site can block that user to keep themselves and their legitimate users safer.
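
As a purely illustrative sketch of that idea (the threshold, score range, and function names are hypothetical and not Deduce's or LexisNexis's actual APIs), a login flow might consult a shared risk score before letting a user in:

```python
# Hypothetical threshold and lookup; a real service would expose its own API.
SUSPICIOUS_THRESHOLD = 0.7

def fetch_shared_trust_score(user_id: str) -> float:
    """Placeholder for a call to a shared identity-data collective.
    Returns a risk score in [0, 1], where higher means more suspicious."""
    return 0.2  # stubbed value for illustration

def handle_login(user_id: str, password_ok: bool) -> str:
    if not password_ok:
        return "denied"
    risk = fetch_shared_trust_score(user_id)
    if risk >= SUSPICIOUS_THRESHOLD:
        # Block or step up authentication and alert the user,
        # mirroring what large providers do with their own data.
        return "challenge"   # e.g. require a second factor
    return "allowed"

print(handle_login("alice@example.com", password_ok=True))
```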


Optimizing the CIO and CFO Relationship

“CIOs are more likely to be pioneers and/or integrators, while CFOs are more likely to be guardians and drivers,” according to consultancy Deloitte in a description of different corporate personality types. “Pioneers are novelty-seeking, they like having a variety of possibilities, generating new ideas….On the other hand, the guardian personality values structure and loyalty, are much more methodical, detail-oriented, and perhaps a little more risk-averse.” ... “CFOs understand that they have to change and expand their skills,” said Mastanuono. “The modern CFO understands technology and how it can transform the business. He or she also needs to understand the future of what finance will look like, and be a transformer of people, processes, and systems. The CFO must move from being a reactive to a proactive collaborator so the end business can be positioned to have the right systems and data at the right time. Breaking down silos and developing empathy and cross-functional collaboration are requirements, and the CFO-CIO relationship is a critical piece.” ... If CFOs and CIOs can develop a common approach to IT investments that looks at strategic risks as well as benefits, it creates common ground for project discussions and evaluations.


How to address post-pandemic infrastructure pain points

Managing workforce transformation is already challenging enough for employees who need to access on-premises resources. It becomes even more difficult if these employees work in regulated sectors, as medical and financial organizations need to track their employees’ identities, access requests, and usage to an even greater degree. Moreover, because there’s no one set of global standards, IT teams will need to account for many different compliance frameworks that vary based on where an employee is sitting, what information they’re accessing, and what sector they’re working in. On top of that, as businesses build new infrastructures that can accommodate and monitor permanently remote workers, they must be mindful of how certain regulations affect what personally identifiable information they can record about their own employees. GDPR, CCPA, and other privacy laws predate the pandemic, but like workforce transformation, they’ve become even starker and more commonplace challenges now. Different jurisdictions will have different mandates, and your IT teams will need to account for them all.
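
As a loose illustration of that burden (the policy table below is hypothetical and vastly simplified, not legal guidance), an IT team might filter what it records about an employee based on the applicable privacy regime:

```python
# Hypothetical, simplified policy table; real GDPR/CCPA obligations are far
# more nuanced and require legal review.
ALLOWED_PII_BY_REGIME = {
    "GDPR": {"employee_id", "login_time"},            # no device or location logging
    "CCPA": {"employee_id", "login_time", "device"},
}

def filter_log_record(record: dict, regime: str) -> dict:
    """Keep only the fields a given privacy regime permits us to record."""
    allowed = ALLOWED_PII_BY_REGIME.get(regime, set())
    return {k: v for k, v in record.items() if k in allowed}

raw = {"employee_id": "e42", "login_time": "2021-04-13T09:00",
       "device": "laptop-7", "home_address": "..."}
print(filter_log_record(raw, "GDPR"))   # only employee_id and login_time survive
```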


12 steps towards a secure project management framework

Cyber security is a tech-heavy domain, and project/program management is essential to delivering successful projects. However, cyber security requires a few tweaks to regular management practices, as it comes with a different set of requirements. A cyber security management program is complex in nature and entails systematic processes. It deals with all aspects of a company’s operations, from mapping and recruiting skilled security professionals to vendor risk management. It involves protecting and securing computer systems, networks, and data from theft or damage, thereby ensuring business continuity. A project manager usually has to oversee many one-time and recurring cyber security tasks while handling usual responsibilities and priorities. A good project management framework will ensure that projects are delivered smoothly, within budget, and in the agreed timeframe. For any project management program to be successful, it’s important to define roles and responsibilities, a detailed plan of action, and milestones to be achieved. While most standard project management practices hold true in cyber security programs, there are a few cyber security-specific aspects that need to be taken care of with absolute diligence and strict adherence.


Information Relativity

Relativity was introduced at the beginning of the last century when Einstein proved that reality is fundamentally different depending on your frame of reference, a distortion of the spacetime continuum. The concept has led to the discovery of black holes, gravitational lenses, time dilation, and all kinds of other fantastic things. Relativity is not at all what one would expect based on our regular day-to-day lives that operate according to classical laws of physics. It changes what it means to observe and to be an observer—it means that how we experience the world differs, not just how we interpret it. There are circumstances where the world I experience is inconsistent with yours. It turns out that communication works in this same peculiar way. Information is distorted depending on the location of the observer. Mark Burgess calls this “information relativity”: messages can take multiple paths and interfere with one another, information can be reversed in its order as it travels along one path, and the speed of communication along one path can differ from the speed along another.
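
A toy sketch of that path dependence (the latencies below are made up): the same two messages, delivered over paths with different delays, are observed in a different order by each receiver.

```python
# Two messages sent at known times; each observer receives them over a
# different path with different (hypothetical) latencies.
messages = [("m1", 0.0), ("m2", 1.0)]          # (id, send_time)
path_latency = {"observer_A": {"m1": 5.0, "m2": 1.0},
                "observer_B": {"m1": 1.0, "m2": 5.0}}

for observer, latencies in path_latency.items():
    # Sort by arrival time = send time + path latency for this observer.
    arrivals = sorted(messages, key=lambda m: m[1] + latencies[m[0]])
    print(observer, "sees", [msg_id for msg_id, _ in arrivals])
# observer_A sees ['m2', 'm1']
# observer_B sees ['m1', 'm2']
```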


The Role of EiPaaS in Enterprise Architecture: Part 1

When discussing enterprise architecture, a diagram of the IT landscape comes to mind because that is the standard approach to defining an architecture. However, during our work with a number of enterprise architecture teams worldwide, we discovered that enterprise architecture has a larger strategic scope than what typical IT diagrams capture. Fundamentally, enterprise architecture converts business strategy into a value generation outcome by creating a foundation to execute various IT initiatives and processes. It is about gaining a long-term view for the organization, including the integration and standardization of various elements involved in the business. ... At the initial stages, an enterprise architecture will define the systems and subsystems required for each organization’s function. It starts with purchasing core systems, such as human resource management (HRM), customer relationship management (CRM) and/or enterprise resource planning (ERP) based on the business domain of the organization. In addition, subsystems will be built around the core systems by in-house or outsourced development teams. Systems and subsystems that belong to each function operate independently with limited or no information exchange.


Nvidia announces Morpheus, an AI-powered app framework for cybersecurity

Morpheus essentially enables compute nodes in networks to serve as cyberdefense sensors — Nvidia says its newly announced BlueField-3 data processing units can be specifically configured for this purpose. With Morpheus, organizations can analyze packets without information replication, leveraging real-time telemetry and policy enforcement, as well as data processing at the edge. Thanks to AI, Morpheus can ostensibly analyze more security data than conventional cybersecurity app frameworks without sacrificing cost or performance. Developers can create their own Morpheus skills using deep learning models, and Nvidia says “leading” hardware, software, and cybersecurity solutions providers are working to optimize and integrate datacenter security offerings with Morpheus, including Aria Cybersecurity Solutions, Cloudflare, F5, Fortinet, Guardicore, Canonical, Red Hat, and VMware. Morpheus is also optimized to run on a number of Nvidia-certified systems from Atos, Dell, Gigabyte, H3C, HPE, Inspur, Lenovo, QCT, and Supermicro. Businesses are increasingly placing their faith in defensive AI like Morpheus to combat the growing number of cyberthreats.
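
The sketch below is not the Morpheus API; it is only a generic, hypothetical illustration of the underlying idea: scoring packet-level telemetry with a model and enforcing policy on high-risk traffic.

```python
# Generic illustration only -- not the Morpheus API. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class PacketFeatures:
    bytes_sent: int
    dest_port: int
    entropy: float   # payload entropy, a common signal for suspicious traffic

def score(pkt: PacketFeatures) -> float:
    """Stand-in for a trained model's anomaly score in [0, 1]."""
    return min(1.0, pkt.entropy / 8.0 + (0.3 if pkt.dest_port not in (80, 443) else 0.0))

def telemetry_stream():
    # Stand-in for packets observed at the edge without copying them elsewhere.
    yield PacketFeatures(bytes_sent=900, dest_port=443, entropy=3.1)
    yield PacketFeatures(bytes_sent=64_000, dest_port=4444, entropy=7.8)

for pkt in telemetry_stream():
    if score(pkt) > 0.9:
        print("alert:", pkt)   # policy enforcement would happen here
```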


Automation will accelerate decentralization and digital transformation

As the vaccinated population grows, doors reopen, and more people come together again, the reality we find ourselves in will not be the one left behind in 2019. Many long for a return to in-person experiences, but at the same time, have grown accustomed to the flexibilities of a decentralized, digital-first world. As we emerge from lockdown, hitting "rewind" will not satisfy customer and employee needs. Instead, companies must create hybrid experiences that integrate both digital and in-person modalities. In addition, the growing expectations of stakeholders have created unprecedented demand for IT innovation and a greater sense of urgency in the post-pandemic world. Even as more offline activities resume, 2020's rapid digitalization will have a large and lasting impact on both customer and employee experiences. For example, analysis of global research from Salesforce shows customers anticipate engaging online with companies just as much in 2021 as they did in 2020. That customers expect to maintain this substantial departure from their 2019 patterns suggests that the swing to digital at the height of the pandemic wasn't purely due to unavailability of in-person channels.


How data poisoning attacks corrupt machine learning models

The main problem with data poisoning is that it's not easy to fix. Models are retrained with newly collected data at certain intervals, depending on their intended use and their owner's preference. Since poisoning usually happens over time, and over some number of training cycles, it can be hard to tell when prediction accuracy starts to shift. Reverting the poisoning effects would require a time-consuming historical analysis of inputs for the affected class to identify all the bad data samples and remove them. Then a version of the model from before the attack started would need to be retrained. When dealing with large quantities of data and a large number of attacks, however, retraining in such a way is simply not feasible and the models never get fixed, according to F-Secure's Patel. "There's this whole notion in academia right now that I think is really cool and not yet practical, but we'll get there, that's called machine unlearning," Hyrum Anderson, principal architect for Trustworthy Machine Learning at Microsoft, tells CSO. "For GPT-3 [a language prediction model developed by OpenAI], the cost was $16 million or something to train the model once.
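
As a rough sketch of how such drift might be noticed (the accuracy history and threshold below are invented), one can compare a model's recent accuracy against an earlier baseline across retraining cycles:

```python
# Hypothetical accuracy history across retraining cycles; a sustained drop can
# hint that poisoned samples have crept into the training data.
accuracy_by_cycle = [0.94, 0.93, 0.94, 0.90, 0.86, 0.81]
WINDOW, DROP_THRESHOLD = 3, 0.05

def poisoning_suspected(history, window=WINDOW, threshold=DROP_THRESHOLD):
    """Flag when recent mean accuracy falls well below the earlier baseline."""
    if len(history) < 2 * window:
        return False
    baseline = sum(history[:window]) / window
    recent = sum(history[-window:]) / window
    return baseline - recent > threshold

print(poisoning_suspected(accuracy_by_cycle))  # True -> trigger historical input review
```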



Quote for the day:

"It's not about how smart you are--it's about capturing minds." -- Richie Norton

Daily Tech Digest - April 12, 2021

Coding interviews are terrible. Can we make them better?

A typical coding interview will involve presenting a candidate with a technical problem, which they'll have to solve in real time and in front of the interviewing panel. While these typically vary from one company to another, one common format is whiteboard coding, whereby a candidate might be asked to provide a solution to a problem involving a binary tree. It was a binary tree task that drew the ire of Howell in his now-famous tweet. These are a fairly typical part of technical interviews, designed to assess a candidate's ability to solve a programming problem and show their thinking 'out loud'. Still, most programmers say this isn't representative of anything they'd have to do in their day-to-day job, and say it's an outdated means of assessing candidates that doesn't reflect their skill level. "These little challenges don't show the greater skill sets, which for me are the ability to construct large programs," says Howell. "It's not about small algorithms. It's about the design of larger systems, and that's way more important." Howell also sees traditional coding interviews as being reflective of an industry that focuses too much on building at speed. "It's partly because the software industry moves so fast," he says.
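
For readers unfamiliar with the format, a whiteboard binary tree task is typically something like the following (a common, generic example, not the specific problem Howell was given):

```python
# A typical whiteboard-style binary tree task: compute the maximum depth.
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def max_depth(node):
    """Return the number of nodes on the longest root-to-leaf path."""
    if node is None:
        return 0
    return 1 + max(max_depth(node.left), max_depth(node.right))

tree = Node(1, Node(2, Node(4)), Node(3))
print(max_depth(tree))  # 3
```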


How Augmented Reality Strengthens Biotech Manufacturing

Factories where engineers or scientists use smart glasses to obtain virtual guidance, where operators work with remote vendors to detect equipment failures in real time, or where interactive training sessions are planned by directors located on another continent are already here. “The barriers to adoption are decreasing as the AR industry becomes more robust,” notes Stracquatanio. Probably the biggest advantage of AR is that it enables seeing the production process virtually, without the need to be there. “It’s a game-changer for the industry. Individuals can have eyes and ears on site at a moment’s notice to address an emerging issue, or to host routine remote collaboration sessions,” Stracquatanio highlights. AR can also increase control over the manufacturing process. Pharma and biotech companies cannot afford mistakes during the production phase. A small oversight might lead to serious consequences such as having to start from scratch, which can be very expensive and time-consuming. A recent example is that of Johnson & Johnson’s manufacturing partner Emergent BioSolutions, whose workers erroneously mixed ingredients from two different Covid-19 vaccines; this led to wasting around 15 million vaccine doses.


Fileless Malware, Endpoint Attacks on the Rise

Cybercriminals are increasingly leveraging fileless malware, cryptominers and encrypted attacks, targeting users both at remote locations as well as corporate assets behind the traditional network perimeter. These were among the findings of WatchGuard Technologies’ Internet Security Report for Q4 2020, which found fileless malware and cryptominer attack rates grew by nearly 900% and 25%, respectively, while unique ransomware payloads plummeted by 48% in 2020 compared to 2019. The report also found botnet malware targeting IoT devices and routers became a top strain, among them the Linux.Generic virus (also known as “The Moon”), malware which is part of a network of servers that directly targets IoT devices and consumer-grade network devices, like routers, to exploit any open vulnerabilities. Total network attack detections grew by 5% in Q4, reaching their highest level in more than two years, while total unique network attack signatures showed steady growth as well, with a 4% increase compared with the third quarter of 2020. “We believe the increase in endpoint attacks between 2019 and 2020 is largely due to the widespread rise of remote work in response to the global pandemic,” Corey Nachreiner, WatchGuard CTO, explained.


Could social media networks pave the way towards stronger authentication?

Passwords are still the most common form of user authentication, “protecting” accounts, devices and systems, but alone, they don’t provide strong security. Not only that, they don’t offer the best user experience. Many passwords don’t even meet the minimum criteria of being unique and complex. People reuse passwords across accounts because they simply can’t keep track of all the logins they have. They choose passwords that are easy to remember to ease the burden, but that makes them easy to guess too. In fact, our research shows that people reuse their passwords across an average of ten personal accounts, while ‘123456’ still topped the list for the most common password in 2020. Even when they have chosen well, their unique and complex password can still fall victim to a modern phishing attack. After all, even an exemplary password can’t protect an account if the holder has been tricked into providing the information. From a user experience perspective, you have the stress and strain of choosing a unique, complex password each time that also meets the criteria demanded by the platform or service provider.


Nation-state cyber attacks double in three years

“Cyber crime economies are shaping the character of nation-state conflicts,” said McGuire. “There is also a ‘second generation’ of cyber weaponry in development that draws upon enhanced capabilities in computing power, AI [artificial intelligence] and cyber/physical integrations. One such example is ‘Boomerang’ malware, which is ‘captured’ malware that can be turned inward to operate against its owners. “Nation states are also developing weaponised chatbots to deliver more persuasive phishing messages, react to new events and send messages via social media sites. In the future, we can also expect to see the use of deepfakes on the digital battlefield, drone swarms capable of disrupting communications or engaging in surveillance, and quantum computing devices with the ability to break almost any encrypted system.” To ease rising tensions and prevent nation states from being drawn into more hostile cyber attacks, 70% of the expert panel said they thought some kind of international treaty would ultimately be necessary – this is by no means a new idea – but just 15% of them thought a cyber convention would be agreed on this decade, 37% said it was more likely to come in the 2030s, and 30% said it would probably never happen.


Quantum computer based on shuttling ions is built by Honeywell

Trapped-ion qubits were used to implement the first quantum logic gates in 1995, and the proposal for a quantum charge-coupled device (QCCD) – a type of quantum computer with actions controlled by shuttling the ions around – was first made in 2002 by researchers led by David Wineland of the US National Institute of Standards and Technology, who went on to win the 2012 Nobel Prize for Physics for his work. Quantum gates have subsequently been demonstrated in multiple platforms, from Rydberg atoms to defects in diamond. The quantum computing technology first used by IT giants, however, was solid-state qubits. In these, the qubits are superconducting circuits, which can be mounted directly on to a chip. These rapidly surpassed the benchmarks set by trapped ions, and are used in record-breaking machines from IBM and Google: “Working with trapped ions, I would be asked by people, ‘Why aren’t you working with superconducting qubits? Isn’t that race pretty much already settled?’,” says Winfried Hensinger of the UK’s University of Sussex. Recently, however, the progress made using superconducting circuits appears to be slowing as quantum computers integrate more and more qubits.


How MPC can solve blockchain’s data privacy conundrum

MPC, or multi-party computation, solves for confidentiality by utilizing a network of computation nodes that compute directly on encrypted data while maintaining zero knowledge about the data. For example, an employer may want to find out the average age of their employees. For privacy reasons, these employees may not be willing to share their ages, so through secret sharing, the employees can contribute their ages without any individual age being attributable to them. The possibilities this technology enables are endless, and one must only think of the benefits such technology could bring to industries such as banking and insurance. While MPC solves for privacy, blockchain itself can protect individual data against data breaches via the decentralization of sensitive information. Alone, blockchain lacks the infrastructure required to ensure data remains private. ... Not only is the pairing of MPC technology and blockchain a better solution for safeguarding consumer data than those currently in existence, it is one of the most viable solutions that effectively deals with the monumental problem of data security.
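
A minimal sketch of that secret-sharing example (additive sharing over a prime field, with made-up ages): each employee splits their age into random shares, each compute node sees only one share per person, and yet the nodes' partial sums combine into the exact average.

```python
import random

PRIME = 2_147_483_647  # arithmetic is done modulo a large prime

def share(secret, n_parties):
    """Split a secret into n additive shares that sum to it modulo PRIME."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

ages = [29, 41, 35]                       # each employee's private input
all_shares = [share(a, 3) for a in ages]  # each employee sends one share per node

# Each compute node only ever sees one meaningless-looking share per employee...
node_sums = [sum(col) % PRIME for col in zip(*all_shares)]
# ...yet combining the nodes' partial sums recovers only the aggregate.
total = sum(node_sums) % PRIME
print("average age:", total / len(ages))   # 35.0, individual ages stay hidden
```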


How Do Large Firms Train ML Models At Scale?

GPipe is a distributed machine learning library that uses synchronous stochastic gradient descent together with pipeline parallelism to train any DNN containing multiple sequential layers. GPipe partitions a model across various accelerators and splits each mini-batch of training examples into even smaller micro-batches. Hence, GPipe’s accelerators can operate in parallel and maximise the scalability of the training process. It allows easy deployment of more accelerators to train large models and further scale the performance without tuning hyperparameters.
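
A toy sketch of the micro-batching idea (not GPipe’s actual API; the stage functions and sizes are invented): a mini-batch is split into micro-batches so that sequential stages placed on different accelerators can overlap their work.

```python
# Toy pipeline-parallelism sketch: split a mini-batch into micro-batches and
# stream them through two sequential "stages".
def split_into_microbatches(batch, micro_size):
    return [batch[i:i + micro_size] for i in range(0, len(batch), micro_size)]

def stage1(micro):   # imagine this running on accelerator 0
    return [x * 2 for x in micro]

def stage2(micro):   # imagine this running on accelerator 1
    return [x + 1 for x in micro]

mini_batch = list(range(8))
outputs = []
for micro in split_into_microbatches(mini_batch, micro_size=2):
    # While stage2 handles this micro-batch, stage1 could already start the
    # next one -- that overlap is what keeps both accelerators busy.
    outputs.extend(stage2(stage1(micro)))
print(outputs)   # [1, 3, 5, 7, 9, 11, 13, 15]
```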


Data validates future of work looks quite different than pre-pandemic

Both private and professional lives are slowly readopting former practices, such as eating inside a restaurant. As we cautiously return to normal, road warriors are ready to get back on the road, but we're also excited to keep some of the improved healthcare, restaurant and retail experiences we've discovered over the last year. Respondents cited the top four things they missed while working remotely: spontaneous interactions with colleagues they wouldn't have talked to otherwise; simply being around other people; exposure to a diversity of perspectives and ideas; and productivity. Qualtrics discovered that 51% of respondents reported improved productivity during the pandemic lockdown, and respondents were twice as likely to say their well-being improved as to say it declined. Managers concur: 55% said their direct reports have been more productive. Generationally, 54% of millennials said they're more productive, as did 53% of Gen Z, 48% of Gen X and 34% of boomers. Productivity has improved due to flexible schedules (31%), no commute (26%), more control over workspace (24%), ability to focus with fewer work interruptions (24%) and more privacy and personal space (23%).


The benefits of cyber threat intelligence

All of this saves time and helps them be more effective at mitigating threats and reducing risks. CTI allows the SOC to see beyond the perimeter, so they are aware of threats before they hit their infrastructure. That gives the SOC time to prepare and tweak defenses, such as deploying specific monitoring rules or knowing what to be on the lookout for. And when dealing with incidents or alerts, having this additional context allows them to place the individual alert, or the set of alerts they are dealing with, in the wider context of who is behind it, what their aims are, what typical next steps would be, or maybe even what must have gone before for this to occur. All of that makes it easier to determine how to respond. And when dealing with multiple alerts or incidents, as SOCs do, having this context allows you to prioritize, separating the wheat from the chaff as it were. And that’s critical as many SOCs are resource-strained, and so knowing which items to focus on can help with making the most effective use of limited resources.
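
A small sketch of that prioritization step (the feed entry, scores, and field names are hypothetical): enriching alerts with threat-intelligence context and sorting the queue by the combined score.

```python
# Hypothetical CTI enrichment: alerts matching known threat-actor context get
# pushed to the top of the SOC queue.
threat_intel = {"198.51.100.7": {"actor": "FIN-X", "severity": 9}}  # sample feed entry

alerts = [
    {"id": 1, "src_ip": "203.0.113.5", "base_score": 4},
    {"id": 2, "src_ip": "198.51.100.7", "base_score": 3},
]

def prioritize(alerts, intel):
    for alert in alerts:
        context = intel.get(alert["src_ip"])
        # Known-bad context outweighs the raw alert score.
        alert["priority"] = alert["base_score"] + (context["severity"] if context else 0)
        alert["context"] = context
    return sorted(alerts, key=lambda a: a["priority"], reverse=True)

for a in prioritize(alerts, threat_intel):
    print(a["id"], a["priority"], a["context"])
# Alert 2 (known actor, priority 12) is handled before alert 1 (priority 4).
```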



Quote for the day:

"It's good to trust others but, not to do so is much better." -- Benito Mussolini

Daily Tech Digest - April 11, 2021

One-stop machine learning platform turns health care data into insights

To turn reams of data into useful predictions, Cardea walks users through a pipeline, with choices and safeguards at each step. They are first greeted by a data assembler, which ingests the information they provide. Cardea is built to work with Fast Healthcare Interoperability Resources (FHIR), the current industry standard for electronic health care records. Hospitals vary in exactly how they use FHIR, so Cardea has been built to "adapt to different conditions and different datasets seamlessly," says Veeramachaneni. If there are discrepancies within the data, Cardea's data auditor points them out, so that they can be fixed or dismissed. Next, Cardea asks the user what they want to find out. Perhaps they would like to estimate how long a patient might stay in the hospital. Even seemingly small questions like this one are crucial when it comes to day-to-day hospital operations — especially now, as health care facilities manage their resources during the Covid-19 pandemic, says Alnegheimish. Users can choose between different models, and the software system then uses the dataset and models to learn patterns from previous patients, and to predict what could happen in this case, helping stakeholders plan ahead.
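
The helpers below are hypothetical stand-ins, not Cardea’s actual interfaces; they only illustrate the assemble, audit, and predict flow the article describes.

```python
# Generic illustration of the assemble -> audit -> predict flow; these are
# made-up helpers, not Cardea's API.
records = [
    {"age": 64, "admission_type": "emergency", "length_of_stay": 7},
    {"age": 41, "admission_type": "elective",  "length_of_stay": None},  # discrepancy
]

def audit(records):
    """Report records with missing values so they can be fixed or dismissed."""
    return [i for i, r in enumerate(records) if None in r.values()]

def predict_length_of_stay(record):
    """Stand-in for a model trained on patterns from previous patients."""
    return 5 + (2 if record["admission_type"] == "emergency" else 0)

print("records needing review:", audit(records))                     # [1]
print("predicted stay (days):", predict_length_of_stay(records[0]))  # 7
```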


8 Ways Digital Banking Will Evolve Over the Next 5 Years

The initial shift toward digital financial services saw an ad hoc response from regulators. As new technologies come into play and tech giants like Google and Apple become increasingly disruptive in the financial industry, these transformations will force policymakers to identify emerging threat vectors and comprehensively address risk. In contrast to today’s mostly national systems of oversight, a global approach may be necessary to ensure stability in the sector, and we may see the rise of new licensing and supervisory bodies. The future of digital banking appears bright, but the unprecedented pace of innovation and shifts in consumer expectations demand a new level of agility and forward-thinking. Even as financial institutions attempt to differentiate themselves from competitors, co-innovation will become an integral part of success. People and technology will both play critical roles in these developments. Tech capabilities and digital services must be extremely resilient, constantly available at the time of customer need. Human capital, however, will be as crucial as any other asset. Leaders will have to know how to upskill, reskill and retain their talent to promote innovation. 


A new era of innovation: Moore’s Law is not dead and AI is ready to explode

We sometimes use artificial intelligence and machine intelligence interchangeably. This notion comes from our collaborations with author David Moschella. Interestingly, in his book “Seeing Digital,” Moschella says “there’s nothing artificial” about this: There’s nothing artificial about machine intelligence just like there’s nothing artificial about the strength of a tractor. It’s a nuance, but precise language can often bring clarity. We hear a lot about machine learning and deep learning and think of them as subsets of AI. Machine learning applies algorithms and code to data to get “smarter” – make better models, for example, that can lead to augmented intelligence and better decisions by humans, or machines. These models improve as they get more data and iterate over time. Deep learning is a more advanced type of machine learning that uses more complex math. The right side of the chart above shows the two broad elements of AI. The point we want to make here is that much of the activity in AI today is focused on building and training models. And this is mostly happening in the cloud. But we think AI inference will bring the most exciting innovations in the coming years.


Rethinking Ecommerce as Commerce at Home

Ecommerce is all grown up. It’s time to break away from the early-internet paradigm where online shopping was a new, “electronic” form of shopping. Today, almost all commerce involves varying degrees of digital elements (discovery, price comparison, personalization, selection, ordering, payment, delivery, etc.). The defining factor is not whether commerce is digital; rather, it is the optimal location for a retailer to meet a consumer’s needs. Shopping happens on a spectrum between home and the store. As such, ecommerce is better understood as commerce at home, and Amazon was the early winner. Great retailers focus on convenience or the experiential. In the new paradigm, certain retail truths persist. For example, all great retailers have focused primarily on either convenience retail or experiential retail. To be clear, any retail can be a great experience, but the priority matters. Amazon focuses ruthlessly on convenience. The outcome is a great customer experience. To drive growth, Amazon has prioritized speed and selection over consultation and curation. Amazon’s focus on convenience has yielded an (incredibly) high-volume, low-margin retail business.


These are the AI risks we should be focusing on

AI may never reach the nightmare sci-fi scenarios of Skynet or the Terminator, but that doesn’t mean we can shy away from facing the real social risks today’s AI poses. By working with stakeholder groups, researchers and industry leaders can establish procedures for identifying and mitigating potential risks without overly hampering innovation. After all, AI itself is neither inherently good nor bad. There are many real potential benefits that it can unlock for society — we just need to be thoughtful and responsible in how we develop and deploy it. For example, we should strive for greater diversity within the data science and AI professions, including taking steps to consult with domain experts from relevant fields like social science and economics when developing certain technologies. The potential risks of AI extend beyond the purely technical; so too must the efforts to mitigate those risks. We must also collaborate to establish norms and shared practices around AI like GPT-3 and deepfake models, such as standardized impact assessments or external review periods.


India Inc. must consider Digital Ethics framework for responsible digitalisation

An accelerated pace of digital transition, consumption of goods and services via app-based interface, and proliferation of data bring numerous risks such as biased decision-making processes being transferred to machines or algorithms at the development stage by humans, a Deloitte statement said on Friday. "These biases can be a threat to the reputation and trust towards stakeholders, as well as cause operational risks," it said. Partner, Deloitte India, Vishal Jain, said the pandemic compelled businesses and consumers to embrace digital technologies like artificial intelligence, big data, cloud, IoT and more in a big way. "However, the need of the hour is to relook at the business operations layered on digital touchpoints with the lens of ethics, given biases might arise in the due course, owing to a faster response time to an issue," he said. Societal pressure to do "the right thing" now needs a careful consideration of the trade-offs involved in the responsible usage of technology, Jain said, adding, its interplay becomes vital to managing data privacy rights while actively adopting customer analytics for personalised service.


How to Be a Better Leader By Building a Better Tribe

All of our journeys are exquisitely different, yet come with a unique set of challenges that can blur our leadership lens if not properly focused. This can become a snowball of personal detriment. Therefore, your mental, physical, and emotional health is just as important as (if not more important than) your professional and economic health—they are interrelated. Identify a therapist, wellness clinician, spiritual leader, life coach, physical trainer and/or anyone who can support your becoming an even greater version of yourself. Let's call this person the "healer". Make time for physical activity, healthy food choices and spending time with loved ones. Ensure that the same investment you make in your team members, you also make in yourself. It is up to you to create your rituals for personal success. What will they entail? ... Similarly to curating a list of your tribal elders, remember that you are also an elder to a younger leader in your collective. We all were afforded a different set of societal privileges based on constructs of race/ethnicity, gender, sexual orientation, cognitive and physical abilities, etc. I think it’s important to utilize some of these privileges to be an ally/co-conspirator to someone who may not have the same position in society.


What is an enterprise architect? Everything you need to know about the role

The role of EA is closely connected to solutions architect, but tends to be broader in outlook. While EAs focus on the enterprise-level design of the entire IT environment, solution architects find spot solutions to specific business problems. EAs also work closely with business analysts, who analyse organisational processes, think about how technology might help, and then make sure tech requirements are implemented successfully. Looking upwards, EAs tend to work very closely with chief information officers (CIOs). While the CIO focuses on understanding the wider business strategy, the EA works to ensure that the technology that the organisation buys will help it to meet its business goals, whether that's improvements in productivity, gains in operational efficiency or developing fresh customer experiences, while also working with others – like the security team – to ensure everything remains secure. Nationwide CIO Gary Delooze is a former EA who says a really good enterprise architect will bring the business and IT teams together to create a technology roadmap.


How Blockchain Can Simplify Partnerships

To appreciate the ways in which blockchains can support complex collaborations, consider the task of shipping perishable goods across borders — a feat that requires effective coordination among suppliers, buyers, carriers, customs, and inspectors, among others. When one party passes the cargo to another, a flood of information is transferred with it. Each party keeps their own records and tends to communicate with one partner at a time, which often leads to inconsistent knowledge across participants, shipping delays, and even counterfeit documentation or products. If, say, the buyer expects the goods to be constantly cooled throughout the shipping process and temperatures exceed agreed thresholds, a dispute is likely to occur among the buyer, the supplier, and the carrier, which can devolve into lengthy wrangling. The carrier may haggle over liability to lower the compensation, arguing that customs, which delayed the transportation, or the inspectors, who improperly handled the cargo, are the ones to blame. The buyer will ask the supplier for a remedy, who in turn needs to negotiate with the carrier. And so on. Problems like these can manifest in any collaboration that requires cumbersome information sharing among partners and may involve disputes in the process.
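
A simplified, smart-contract-style sketch of how a shared record can short-circuit that wrangling (the threshold, readings, and settlement rule below are hypothetical):

```python
# All parties read the same shipment telemetry, so a breach of the agreed
# temperature threshold is established from shared evidence, not negotiation.
AGREED_MAX_TEMP_C = 8.0

shipment_log = [            # a single record every participant can read
    {"time": "T1", "temp_c": 6.5, "signed_by": "carrier"},
    {"time": "T2", "temp_c": 9.2, "signed_by": "carrier"},
]

def settle(log, max_temp):
    """Flag any shared reading that breaches the agreed threshold."""
    breaches = [entry for entry in log if entry["temp_c"] > max_temp]
    return {"breach": bool(breaches), "evidence": breaches}

print(settle(shipment_log, AGREED_MAX_TEMP_C))
# {'breach': True, 'evidence': [{'time': 'T2', 'temp_c': 9.2, 'signed_by': 'carrier'}]}
```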


Practical Points from the DGPO: An Introduction to Information Risk Management

Individuals are starting to pay attention to organizational vulnerabilities that compound the risks associated with managing, protecting, and enabling access to information, ranging from poor data quality and insufficient methods of protecting against data breaches to an inability to auditably demonstrate compliance with numerous laws and regulations, in addition to customer concerns about ethical and responsible corporate use of personal data. And as organizations expand their data management footprints across increasingly complex hybrid multicloud environments, there has never been a greater need for systemic information risk management. ... In general, “risk” affects the way that a business operates in a number of ways. At the most fundamental level, it inhibits quality excellence. However, exposure to risks not only has an effect on project objectives, but it also poses threats of quantifiable damage, injury, loss, liability, or other negative occurrence that may be avoided through preemptive action. Using the Wikipedia definition as a start, we can define information risk as “the potential for loss of value due to issues associated with managing information.”



Quote for the day:

"The actions of a responsible executive are contagious." -- Joe D. Batton