Daily Tech Digest - November 25, 2022

Ripe For Disruption: Artificial Intelligence Advances Deeper Into Healthcare

The challenges and changes needed to advance AI go well beyond technology considerations. “With data and AI entering in healthcare, we are dealing with an in-depth cultural change, that will not happen overnight,” according to Pierron-Perlès and her co-authors. “Many organizations are developing their own acculturation initiatives to develop the data and AI literacy of their resources in formats that are appealing. AI goes far beyond technical considerations.” There has been great concern about too much AI de-humanizing healthcare. But, once carefully considered and planned, it may prove to augment human care. “People, including providers, imagine AI will be cold and calculating without consideration for patients,” says Garg. “Actually, AI-powered automation for healthcare operations frees clinicians and others from the menial, manual tasks that prevent them from focusing all their attention on patient care. While other AI-based products can predict events, the most impactful are incorporated into workflows in order to resolve issues and drive action by frontline users.”


Extinguishing IT Team Burnout Through Mindfulness and Unstructured Time

Mindfulness is fundamentally about awareness. For it to grow, begin by observing your state of mind, especially when you find yourself in a stressful situation. Instead of fighting emotions, observe your mental state as negative ones arise. Think about how you’d conduct a deep root cause analysis on an incident and apply that same rigor to yourself. The key to mindfulness is paying attention to your reaction to events without judgment. This can unlock a new way of thinking because it accepts your reaction, while still enabling you to do what is required for the job. This contrasts with being stuck behind frustration or avoiding new work as it rolls in. ... Mindfulness is an individual pursuit, while creativity is an enterprise pursuit, and providing space for employees to be creative is another key to preventing burnout. But there are other benefits as well. There is a direct correlation between creativity and productivity. Teams that spend all their time working on specific processes and problems struggle to develop creative solutions that could move a company forward.


Overcoming the Four Biggest Barriers to Machine Learning Adoption

Some businesses hit the first hurdles of adopting AI and ML before they even begin. Machine learning is a vast field that pervades most aspects of AI. It paves the way for a wide range of potential applications, from advanced data analytics and computer vision to Natural Language Processing (NLP) and Intelligent Process Automation (IPA). A general rule of thumb for selecting a suitable ML use case is to “follow the money,” in addition to the usual recommendations on framing the business goals – what companies expect machine learning to do for their business, such as improving products or services, improving operational efficiency, and mitigating risk. ... The biggest obstacle to deploying AI-related technologies is corporate culture. Top management is often reluctant to take investment risks, and employees worry about losing their jobs. To secure stakeholder and employee buy-in, businesses should start with small-scale ML use cases that demand realistic investments, achieve quick wins, and persuade executives. By providing workshops, corporate training, and other incentives, they can promote innovation and digital literacy.


Fixing Metadata’s Bad Definition

A bad definition has practical implications. It makes misunderstandings much more likely, which can infect important processes such as data governance and data modeling. Thinking about this became an annoying itch that I couldn’t scratch. What follows is my thought process working toward a better understanding of metadata and its role in today’s data landscape. The problem starts with language. Our lexicon hasn’t kept up with modern data’s complexity and nuance. There are three main issues with our current discourse about metadata: Vague language - We talk about data in terms of “data” or “metadata”. But one category encompasses the other, which makes it very difficult to differentiate between them. These broad, self-referencing terms leave the door open to being interpreted differently by different people. A gap in data taxonomy - We don’t have a name for the category of data that metadata describes, which creates a gap at the top of our data taxonomy. We need to fill it with a name for the data that metadata refers to. Metadata is contextual - The same data set can be both metadata and not metadata depending on the context. So we need to treat metadata as a role that data can play rather than a fixed category.
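The “metadata is a role, not a fixed category” point above can be made concrete with a small sketch. This is a hypothetical illustration (all names invented, not from the article): the same catalog records play the metadata role when they describe another dataset, and are ordinary data when queried in their own right.

```python
# The same records act as metadata relative to the `sales` table,
# but as ordinary data when we analyze the catalog itself.

sales = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": 80.5},
]

# A catalog describing the sales table's columns.
catalog = [
    {"table": "sales", "column": "order_id", "type": "int"},
    {"table": "sales", "column": "amount", "type": "float"},
]

def describes(candidate, dataset_name):
    """The 'metadata' role: records that refer to another named dataset."""
    return [row for row in candidate if row.get("table") == dataset_name]

# Context 1: the catalog plays the metadata role for `sales`.
sales_metadata = describes(catalog, "sales")
assert len(sales_metadata) == 2

# Context 2: the very same catalog is plain data when we query it
# directly, e.g. counting how many float columns we track overall.
float_columns = sum(1 for row in catalog if row["type"] == "float")
assert float_columns == 1
```

Nothing about the records themselves changes between the two contexts; only their relationship to another dataset does, which is the article’s argument for treating “metadata” as a role.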


Addressing Privacy Challenges in Retail Media Networks

The top reason that consumers cite for mistrusting how companies handle their data is a lack of transparency. Customers know at this point that companies are collecting their data. And many of these customers won’t mind that you’re doing it, as long as you’re upfront about your intentions and give them a clear choice about whether they consent to have their data collected and shared. What’s more, recent privacy laws have increased the need for companies to shore up data security or face the consequences. In the European Union, there’s the General Data Protection Regulation (GDPR). In the U.S., laws vary by state, but California currently has the most restrictive policies thanks to the California Consumer Privacy Act (CCPA). Companies that have run afoul of these laws have incurred fines as big as $800 million. Clearly, online retailers that already have — or are considering implementing — a retail media network should take notice and reduce their reliance on third-party data sources that may cause trouble from a compliance standpoint.


For Gaming Companies, Cybersecurity Has Become a Major Value Proposition

Like any other vertical industry, games companies are tasked with protecting their organizations from all manner of cybersecurity threats to their business. Many of them are large enterprises with the same concerns for the protection of internal systems, financial platforms, and employee endpoints as any other firm. "Gaming companies have the same responsibility as any other organization to protect customer privacy and preserve shareholder value. While not specifically regulated like hospitals or critical infrastructure, they must comply with laws like GDPR and CCPA," explains Craig Burland, CISO for Inversion6, a managed security service provider and fractional CISO firm. "Threats to gaming companies also follow similar trends seen in other segments of the economy — intellectual property (IP) theft, credential theft, and ransomware." IP issues are heightened for these firms, like many in the broader entertainment category, as content leaks for highly anticipated new games or updates can give a brand a black eye at best, and at worst hit them more directly in the financials.


Driving value from data lake and warehouse modernisation

To achieve this, Data Lakes and Data Warehouses need to grow alongside the business requirements in order to be kept efficient and up to date. Go Reply is a leading Google Cloud Platform service integrator (SI) that is helping companies spanning multiple sectors along this vital journey. Part of the Reply Group, Go Reply is a Google Cloud Premier Partner focusing on areas including Cloud Strategy and Migration; Big Data; Machine Learning; and Compliance. With Data Modernisation capabilities in the GCP environment constantly evolving, businesses can become overwhelmed and unsure not only of the next steps, but more importantly which next steps are right for them, particularly if they don’t have in-house Google expertise. Companies often need to utilise both Data Lakes and Data Warehouses simultaneously, so guidance on how to do this, as well as on driving value from both kinds of storage, is vital. The Go Reply leadership team advises that Google Cloud Platform, its hyperscale cloud of choice for these workloads, brings technology for Data Lake and Data Warehouse efficiency, along with security superior to other market offerings.


Three tech trends on the verge of a breakthrough in 2023

The second big trend is around virtual reality, augmented reality and the metaverse. Big tech has been spending big here, and there are some suggestions that the basic technology is reaching a tipping point, even if the broader metaverse business models are, at best, still in flux. Headset technologies are starting to coalesce and the software is getting easier to use. But the biggest issue is that consumer interest and trust is still low, if only because the science fiction writers got there long ago with their dystopian view of a headset future. Building that consumer trust and explaining why people might want to engage is just as high a priority as the technology itself. One technology trend that's perhaps closer, even though we can't see it, is ambient computing. The concept has been around for decades: the idea is that we don't need to carry tech with us because the intelligence is built into the world around us, from smart speakers to smart homes. Ambient computing is designed to vanish into the environment around us – which is perhaps why it's a trend that has remained invisible to many, at least until now.


CIOs beware: IT teams are changing

The role of IT is shifting to be more strategy-oriented, innovative, and proactive. No longer can days be spent responding to issues – instead, issues must be addressed before they impact employees, and solutions should be developed to ensure they don’t return. What does this look like? Rather than waiting for an employee to flag an issue within their system – such as recurring issues with connectivity, slow computer start time, etc. – IT can identify potential threats to workflows before they happen. They plug the holes, then they establish a strategy and framework to avoid the problem entirely in the future. In short, IT plays a critical role in successful workplace flow in both a proactive and reactive way. For those looking to start a career in IT, the onus falls on them to make suggestions and changes that look holistically at the organization and how employees interact within it. IT teams are making themselves strategic assets by thinking through how to make things more efficient and cost-effective in the long term.


A Comprehensive List of Agile Methodologies and How They Work

Extreme Programming (or XP) offers some of the best buffers against unexpected changes or late-stage customer demands. Within sprints and from the start of the business process development, feedback gathering takes place. It’s this feedback that informs everything. This means the entire team becomes accustomed to a culture of pivoting on real-world client demands and outcomes that would otherwise threaten to derail a project and seriously inflate production lead times. Any organization with a client-based focus will understand the tightrope that can exist between external demands and internal resources. Continuously orienting those resources based on external demands as they appear is the single most efficient way to achieve harmony. This is something that XP does organically once integrated into your development culture. ... Trimming the fat from the development process is what this method is all about. If something doesn’t add immediate value, or tasks within tasks seem to be piling up, the laser focus of Lean Development steps in.



Quote for the day:

"Confident and courageous leaders have no problems pointing out their own weaknesses and ignorance. " -- Thom S. Rainer

Daily Tech Digest - November 24, 2022

The Future Of The Metaverse Is MultiChain - But Its Users Must Be Unchained

It’s an extremely unrealistic scenario that will probably never happen. Every time a shiny new blockchain platform for the metaverse arrives, loaded with promise, it leads to the arrival of new flaws and problems that must be overcome. New chains are made to solve these new issues, and yet more problems arise. And so the cycle of innovation goes on and on. These new blockchains do offer some interesting solutions and benefits, hence they become fertile breeding grounds for numerous developers to experiment with new metaverse projects and decentralized applications. That fuels greater interest in the metaverse, prodding users to further explore the world of Web3 and the multitude of other blockchains that support it. Different chains have different strengths and play host to different metaverse worlds, and users soon understand the need to be able to move across these networks. Mobility across chains is a must, as it is the only thing that enables true metaverse freedom. At present, so-called blockchain bridges are the go-to mechanism for moving digital assets across chains. 


3 Reasons the Cloud is Critical for Ensuring Patient-Centered Care

One of the keys to delivering patient-centered care is gaining a full understanding of the individual and actively engaging them in their outcomes. But for this to become a reality, patients and their providers need access to medical records and open lines of communication matched to patient preferences. The cloud enables centralized access to patient records, giving healthcare providers a full view of the tests, diagnoses, treatments and other patient information. The addition of artificial intelligence (AI) in cloud-based solutions is enhancing care by analyzing vast amounts of patient data and offering insights that aid in clinical decision-making, which can result in faster time-to-treatment and better outcomes. ... In the past, connecting with physicians outside of appointments often turned into a game of phone tag and resulted in multiple voicemails. But with the cloud supporting patient portals, video chat and text messaging, patients and their providers have more ways to communicate. They can quickly exchange information or ask and answer questions electronically at their convenience, and refer back to those messages should they forget something. 


Nearly Half Of CIOs Say Digital Transformations Are Incomplete

In the fully digital era, when employees have the flexibility to work from anywhere, eliminating data silos will be key to improving organization-wide collaboration. Part of the challenge is completely digitizing an organization’s documents and workflows. The other piece is creating a central repository for digital data – one that’s universally accessible to and shared by multiple teams within an organization. According to a recent study by S&P Global Market Intelligence, only 17% of those surveyed said that their organizations had a single source of truth where everyone could access the knowledge they created. If businesses want to realize the benefits that come with fully digital processes, there’s still more work to do. Investing in full digitization, including a unified document system, will make the most of an organization’s data while significantly reducing costs and enhancing collaboration, improving interactions between team members wherever they may be located.


Coding is Dead, Long Live Programming!

No-code platforms usually target industry-specific functions, with their primary audience being non-technical users who wish to optimise their business operations. Low-code platforms, on the other hand, typically target developers who are operating under a time constraint. They are used to deliver applications quickly and conserve resources which can be better utilised to create something that has more impact on the organisation’s bottom line. Since no-code platforms usually target line-of-business users, who look to create applications that can be used to expedite their business operations, they provide a host of benefits. Not only do these platforms open up the creation of code-based solutions to non-coders, but the feature-rich nature of modern no-code environments also allows developers to solve problems unique to their line of work. In addition to this, no-code platforms can also be used to automate workflows, thereby saving more resources. As with any emerging market, there are challenges associated with no-code and low-code platforms.


How can business prepare for changes in data legislation?

The role of the Data Protection Officer (DPO) is also likely to change, as rules for smaller organisations in particular are loosened while the government tries to create some advantages of Brexit for smaller businesses – advantages which economic data suggests are somewhat thin on the ground right now. The panel all expressed concerns about how DPOs will potentially be replaced with Senior Responsible Individuals (SRIs), who will have the seniority but not necessarily the in-depth knowledge necessary for the role. Patrick Burgess, co-founder and technical director of MSP Nutborne Ltd, commented: "Already in the non-enterprise world you often find people are nominated as DPO and they aren't necessarily trained. That person needs to be supported at the highest level or it really is just a box ticking exercise. You have to give people the right powers, responsibilities and training and not get cross when they tell you what you don't want to hear." None of these issues are necessarily going to be resolved by swapping out DPOs for SRIs, although SRIs are, as Kon pointed out, theoretically harder to fire if they sit at board level.


DeveloperWeek Enterprise: Sorting Out Big Data to Empower AI

“In many cases, big data is a big data swamp,” he said in his presentation, “The Big Data Delusion - How to Identify the Right Data to Power AI Systems.” The problem, he said, comes from traditional analytical systems and approaches being applied to outsized amounts of data. For example, an unnamed fintech company that was a customer of EastBanc had huge datasets of its customer data, transactional data, and behavioral data that was cleaned by one team then transferred to another team that enhanced the data. While such an approach may be sufficient, Shilo said it can also slow things down. The fintech company, he said, wanted a way to use its data to predict which of its customers would be receptive to contact. The trouble was it seemed to be a herculean task under traditional processes. “Their current team looked at the task and estimated the effort would take four, five months to complete,” Shilo said. “That’s a lot of time.”


Trends in data centre sustainability

Finch is perplexed by the sudden attention paid to data centre sustainability, given that it’s always been embedded in the DNA of Kao Data, which operates three sites in outer London. Says Finch: “We have always tried to do things in a sustainable manner. It’s as if everybody’s just woken up and smelled the coffee. Sustainability has always been in the foundations of Kao Data at its very core.” Sustainability goes hand in hand with reliability, he says. You are not such a hostage to yoyoing energy prices if you use renewables. Also, sustainability reduces both short-term capital expenditure and long-term operational expenditure. Finch says: “Really you end up ticking all the boxes, and it makes complete economic sense to do things in a sustainable manner.” Data centres need to have backup diesel generators in the event of a power network outage. Many data centres handle critical Government infrastructure, such as that of the NHS. One data centre handles Government communications with the nuclear submarine fleet – and infrastructure doesn’t come more critical than that.


Microsoft: Popular IoT SDKs Leave Critical Infrastructure Wide Open to Cyberattack

It took some digging to identify that the Boa servers were the ultimate culprit in the Indian energy-sector attacks, the researchers said. First they noticed that the servers were running on the IP addresses on the list of indicators of compromise (IoCs) published by Recorded Future at the time of the release of the initial report last April, and also that the electrical grid attack targeted exposed IoT devices running Boa, they said. Moreover, half of the IP addresses returned suspicious HTTP response headers, which might be associated with the active deployment of the malicious tool that Recorded Future identified was used in the attack, the researchers noted. Further investigation of the headers indicated that more than 10% of all active IP addresses returning the headers were related to critical industries — including the petroleum industry and associated fleet services — with many of the IP addresses assigned to IoT devices with unpatched critical vulnerabilities. This highlighted "an accessible attack vector for malware operators," according to Microsoft.


India drafts new privacy bill for transfer of personal data internationally

"Cross-border interactions are a defining characteristic of today’s interconnected world," according to an explanatory note from the government accompanying the bill. "Recognising this, it has been provided in the bill that personal data may be transferred to certain notified countries and territories." ... “The Central Government may, after an assessment of such factors as it may consider necessary, notify such countries or territories outside India to which a Data Fiduciary may transfer personal data, in accordance with such terms and conditions as may be specified,” according to the draft. A data fiduciary, according to the draft, could be any person or a group of persons who determines the purpose and means of processing personal data. The draft Digital Personal Data Protection Bill, for which the ministry of electronics and information technology has invited feedback from the public via a portal till December 17, also lays out the exemptions and conditions that must be considered when considering the transfer of personal data to other nations.


Examining low-code/no-code popularity across Africa and its range of disruption for CIOs

Dagadu appreciates that he can now use these tools without a developer, even though he initially employed one at considerable cost. “At first we had to employ a web developer who did a lot of work, but it looks horrible using technology like Square Space,” he says. “In Africa, development costs are very high and can only be tackled by companies with a lot of funding. But with the growth of low-code/no-code, more people with bright ideas can bring them to life without the need for expensive developers.” He noted that because these tools are still little known in Africa, people believe that every time they have an idea to implement an application or technology, they have to resort to an application developer. But by coding less or not at all, there’s an easier entry into hard code, according to WenakLabs’ Assani. “It’s a way to be visible quickly, to offer your services to the world without resorting to the skills of a developer. Above all, you learn through experimentation.”



Quote for the day:

"And how does one lead? We lead by doing; we lead by being." -- Bryant McGill

Daily Tech Digest - November 23, 2022

What's coming for cloud computing in 2023

Enterprises often move to multicloud on purpose, but way more often multicloud just happens as enterprises strive to find and leverage best-of-breed cloud services with no plan for what to do with those services after deployment. This leads to too much cost and not enough return of value to the business. Old story. This cloud complexity problem can be solved through the strategic use of technology and better approaches to manage the complexity. Most important is reducing redundancy by using a common layer of technology above the public cloud providers as well as above any legacy or edge-based systems. This layer includes common services, such as a single security system, a single data management system, finops, a single cloud operations system, etc. We’re not attempting to solve every problem within the “walled garden” of each public cloud provider; this technology should exist within a common layer, aka supercloud or metacloud. This strategic cloud trend not only solves the complexity problems by leveraging common services and a common control plane, it also helps get cloud costs under control through a common finops layer that handles cost monitoring, cost governance, and cloud cost optimization.


Best practices for implementing a company-wide risk analysis program

The first step is determining what is critical to protect. Unlike accounting assets (servers, laptops, and so on), in cybersecurity terms this would include things that are typically of broader business value. Often the quickest path is to talk with the leads for different departments. You need to understand what data is critical to the functioning of each group, what information they hold that would be valuable to competitors and what information disclosures would hurt customer relationships. Also assess whether each department handles trade secrets, or holds patents, trademarks, and copyrights. Finally, assess who handles personally identifiable information (PII) and whether the group and its data are subject to regulatory requirements such as GDPR, PCI DSS, CCPA, Sarbanes-Oxley, etc. When making these assessments, keep three factors in mind: what needs to be safe and can’t be stolen, what must remain accessible for continued function of a given department or the organization, and what data/information must be reliable (i.e., that which can’t be altered without your knowledge) for people to do their jobs.
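The three factors above map naturally onto a simple asset inventory. This is a minimal sketch under hypothetical names (the assets, scores, and scoring scheme are invented for illustration, not part of the article), recording confidentiality (can’t be stolen), availability (must stay accessible) and integrity (can’t be silently altered) per asset, plus applicable regulations:

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    department: str
    confidentiality: int  # 1 (low) .. 3 (high): harm if stolen
    availability: int     # harm if inaccessible
    integrity: int        # harm if silently altered
    regulations: list = field(default_factory=list)

    @property
    def risk_score(self) -> int:
        # Simple additive score; a real program would weight each
        # factor by business impact rather than treating them equally.
        return self.confidentiality + self.availability + self.integrity

# Hypothetical inventory gathered from department leads.
assets = [
    Asset("customer PII store", "marketing", 3, 2, 3, ["GDPR", "CCPA"]),
    Asset("build server", "engineering", 1, 3, 2),
]

# Rank assets to decide where protection effort goes first.
ranked = sorted(assets, key=lambda a: a.risk_score, reverse=True)
assert ranked[0].name == "customer PII store"
```

Even a crude ranking like this makes the department interviews actionable: the highest-scoring assets, and those tagged with regulatory regimes, get attention first.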


What Is Data Virtualization?

The process of data virtualization is quite simple. Data is accessed in its original form and source. Unlike typical “extract, transform, and load” (ETL) processes, virtualization doesn’t require data to be moved to a data warehouse or data lake first. Data is aggregated in a single location, known as a virtual data layer. Using this layer, enterprises can develop simple, holistic, and customizable views (also known as dashboards) for accessing and making sense of data. Using these tools, users can also pull real-time reports, manipulate data, and perform advanced data processes such as predictive maintenance. Data is easily accessible via dashboards from anywhere. ... While data is critical to the decision-making process, not just any data will do. The data used must be accurate, up-to-date, and logical. It must also be displayed in a way that all stakeholders can understand, whether a user is a data scientist or a C-level executive. Data virtualization enables stakeholders to access the specific data they need when they need it. Because data isn’t a replicated snapshot from an earlier point in time, it is accurate to the minute.
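The “no ETL copy” idea above can be sketched in a few lines. This is a toy illustration with invented sources (a CRM dictionary and an orders list standing in for real systems): the virtual layer answers queries by reading each source in place and joining at query time, rather than materializing anything into a warehouse first.

```python
# Two independent "sources" that stay where they are.
crm = {"alice": {"region": "EU"}, "bob": {"region": "US"}}
orders = [("alice", 120.0), ("bob", 80.5), ("alice", 40.0)]

class VirtualLayer:
    """Presents one view over several sources without copying them."""

    def __init__(self, crm, orders):
        self.crm = crm
        self.orders = orders

    def spend_by_region(self):
        totals = {}
        for customer, amount in self.orders:        # read in place
            region = self.crm[customer]["region"]   # join at query time
            totals[region] = totals.get(region, 0.0) + amount
        return totals

layer = VirtualLayer(crm, orders)
assert layer.spend_by_region() == {"EU": 160.0, "US": 80.5}
```

Because each query walks the live sources, the result reflects their current state, which is the article’s point about data being “accurate to the minute” rather than a stale replicated snapshot.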


LockBit 3.0 Says It's Holding a Canadian City for Ransom

LockBit operators posted screenshots showing files of different departments and other data as proof of their claim, but Information Security Media Group was unable to immediately contact the municipality and confirm the authenticity of the documents. The attack comes on the heels of a new National Cyber Threat Assessment 2023-2024 by the Canadian Center for Cyber Security. The report, which says ransomware is "the most disruptive form of cybercrime facing Canadians," adds that ransomware benefits significantly from the specialized cybercrime economy and the growing availability of stolen information. "So long as ransomware remains profitable, we will almost certainly continue to see cybercriminals deploying it," the report says. The city of Westmount's official website was not affected by the attack, and the municipality says any updates on the recovery will be communicated on the site. The mayor assured residents that data security is its "top priority" and so "is the protection of our residents' and employees' information."


A brief history of industrial IoT

Most early networking technologies were wired: Connection required cables that physically linked your device to the network. Network bandwidth — the amount of data that can be conveyed in a period of time — for 10BASE-T Ethernet connections, one of the most widely used standards established in the late 1980s and early 1990s, allowed for as much as 10 Megabits of data per second. In contemporary times, wired networks support connections of 1,000 Megabits of data per second (1000BASE-T or 1 Gigabit) or even 10 Gigabits of data per second (10GBASE-T) for modern Ethernet connections. Wireless and cellular networking, which eliminated the need for a cable to each device, was a significant shift for IIoT. Standardized in 1999, 802.11b was one of the first standards supported in products from many manufacturers and was a predecessor to the Wi-Fi 6E standard established in 2020. Modern Wi-Fi devices not only offer speeds anywhere from 50 to 800 times as fast as earlier equipment, but the devices may also perform reliably in much more dense radio environments than their predecessors.
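To put the bandwidth figures above in perspective, a quick calculation shows what each standard means for transferring a 1 GB file (using decimal units, 1 GB = 8,000 megabits, and ignoring protocol overhead):

```python
# Transfer time for a 1 GB file at each Ethernet standard's nominal rate.
file_megabits = 8_000  # 1 GB expressed in megabits (decimal units)

for name, mbps in [("10BASE-T", 10), ("1000BASE-T", 1_000), ("10GBASE-T", 10_000)]:
    seconds = file_megabits / mbps
    print(f"{name}: {seconds:.1f} s")
```

At 10 Mbit/s the transfer takes over 13 minutes; at 10 Gbit/s it drops below a second, which is the scale of improvement the paragraph describes.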


How to Avoid Risks Before Implementing Industrial IoT Solutions

Industrial IoT solutions are often implemented at enterprises with a high proportion of machine manufacturing. For a well-funded company, it is often easier to implement the IoT ecosystem using modern equipment. But for some, it would be too expensive to replace legacy manufacturing systems. Therefore, companies often choose to adapt existing equipment and enhance it with sensors, smart devices, and gateways. However, when choosing to implement IoT technology in an enterprise equipped with old machines, the company has to ensure the protocols are understood by all the devices, connect disparate data stores, and solve all the compatibility issues. According to McKinsey, a company moving to EIoT has to solve compatibility issues for about 50% of all devices. If compatibility issues are not solved appropriately, the solution may not function as intended, or even at all. The wrong algorithm or incorrect integration can lead to hardware malfunctions and equipment damage, overheating, explosion, or system failure.


How remote working impacts security incident reporting

The risks of an impeded reporting process due to remote working are significant. When incidents go unreported, reports are delayed/miscommunicated or follow-up actions/responses are hindered, it can leave vulnerabilities exposed and/or buy attackers time in the system to infiltrate more of the network before the security team can detect and contain threats and malicious activity, Chavoya warns. This can not only exacerbate the severity of incidents and attacks but can also damage both the reputation of a business and its ability to meet certain data protection regulations which stipulate strict rules surrounding disclosure. These could lead to loss of customer confidence and large monetary penalties. It is therefore paramount for security teams to update their reporting policies and processes to account for the security implications of remote working. “The home and hybrid working trend is here to stay, so it is incredibly dangerous for security teams to rely on policies and processes designed for a bygone era when most, if not all, employees were based in a controlled office environment,” says Holyome. 


IT leadership: 5 ways to create a culture of gratitude

Expressing gratitude is an integral part of a healthy culture. I think it starts with a leader maintaining healthy personal humility and respect and empathy for their staff, so that gratitude is coming from a genuine place. Thank-yous should be prompt, specific, and connect the accomplishment to its impact on our mission of educating students. Thanking a team for finishing a project, as in: “Your team successfully implemented this project, which I really appreciate” is more powerful when it adds, “The new UI will help our students better determine what classes they still need to take in order to graduate.” It’s helpful to give customer feedback as well, such as “I talked with an adviser who says this will really help her more accurately advise students.” IT teams always see a steady stream of problem tickets, so hearing how their work is impacting students and faculty, and/or hearing verbatim feedback from delighted users, can be very encouraging. In addition to thanking employees individually, department emails and all-staff meetings and parties should all include recognition and gratitude for recent accomplishments, and a little free food and swag never hurts, either.


5 pitfalls to avoid when partnering with startups

For Bedi, it came as a rude shock when he found out a startup he was working with on a project didn’t have an internal development team and instead relied on a third party for its deliverables. “We had partnered with a startup on a customer onboarding project. A delay of 15 to 20 days is acceptable, but alarm bells ring when there is a significant overrun of timelines. In our case, there was a delay of more than two months,” says Bedi. “Beyond a lack of bandwidth, the brief that the startup receives from the enterprise and passes on to the third party gets lost in translation. It doesn’t help that the startup didn’t read the detailed business requirements document.” Unfortunately, it’s tough to eliminate this risk altogether, Bedi says. “Few IT leaders verify the credentials of a startup to the extent of asking for the CVs of its team members. Even if some do so, some startups resort to ‘body shopping,’” he says, referring to the practice of recruiting workers to contract their services out on a tactical short- to mid-term basis. So, what’s the way out? The best approach is to open a clear line of communication with the startup and ensure transparency.


Implications of Emerging Technology on Cyber-Security

A proper understanding of new technologies is essential; this includes risk assessment and evaluation of the new technology, followed by proper planning for implementation and risk mitigation. Risks are changing much faster than organisations can mitigate them. Unfortunately, there is no silver bullet for cyber-security, but there are three areas that must be carefully planned. First, organisations must ensure they understand the risks of any new technology they install, as this is key to properly securing it; training and education on the new technology is therefore a cornerstone to build on, and not just for technology people but for everyone who works with critical data and new technologies. Second, although ultimate accountability still rests with the organisation’s senior management, the information security team has the responsibility to study the new technology well and evaluate the associated risks. Third, the primary goal is to foster an organisational culture that encourages both risk-based decision making and the adoption of innovation and new technology.



Quote for the day:

"Leadership does not always wear the harness of compromise." -- Woodrow Wilson

Daily Tech Digest - November 22, 2022

Multimodal Biometrics: The New Frontier Against Fraud

This new category includes instances when online criminals directly target consumers, often through a text, call, or email, rather than by obtaining a person’s personal information at the institutional level, a change in tactics in recent years that has significant consequences for both individuals and the companies they do business with. The consumer, Javelin says, has become “the path of least resistance.” Consumers aren’t the only ones affected by this change in approach. It has significantly altered the advice we give our banking and financial services customers, as well. ... Identity verification platforms with multi-modal biometrics and liveness detection offer next-generation levels of security. Even better, platforms now entering the market combine multi-modal biometrics and liveness detection with a frictionless, easy-to-use interface. With some, customers simply look into their phones or laptop cameras and say a phrase to easily and securely access an online account. This is the conversation my colleagues and I are having with our banking and financial institution customers.


The 5 Most Dangerous Cognitive Biases For Startup Founders

Confirmation bias is the tendency to search for information proving your already-established worldview, rather than disproving it. It is crucial to avoid this when constructing your idea or product validation tests or when talking to customers. Don’t try to defend your assumptions and decisions - instead, try to gather unbiased feedback so that you can have a higher confidence level in the results of your tests. Fake confirmation of your ideas might make your life easier, as it gives you a scapegoat for your failure. Yet, in the long run, it’s much better to overcome your ego and succeed than to defend it but ultimately fail. Anchoring bias is the tendency to rely heavily on the first piece of information you have on a topic. It is often used in negotiations as a trick to bring the expectations of the opposing party closer to your desired outcome. In startups, it is very important not to unwittingly play this trick on yourself. For example, if you’ve been offering a service for free, you might feel reluctant to raise the price significantly even if it is the right thing to do for your business.


The rise of metaverse shopping

Even as the metaverse continues to gain popularity, it’s important for retailers to remember that it is still relatively new, she observed. “The reality is there are so many other channels for retailers to engage customers, such as web, mobile, in-store and social, and they need to also focus on strengthening those experiences,” Estes said. Brands should not be trying to match virtual experiences with traditional in-store experiences, Mason noted, as they are very different mediums and have different strengths for connecting with customers. “The key thing to remember is that metaverse experiences are new and opt-in,” he said. “They need to be fun and engaging for the user to find something worthwhile in them. After all, moving to a competing brand’s metaverse experience is just a click or a hand-wave away. It is important for companies to consider how their brand will translate to a new medium.” Brands should consider how their brand representatives will greet consumers. Will they be serious, fun or edgy? What kind of language and voice will be used, and how will their brand avatar present itself visually?


How intelligent automation will change the way we work

As organizations automate their business processes, there are many potential hazards to avoid. “The main one is ignoring your people and underestimating that,” Butterfield said. “Although the outcome is driven by using a technology, everything up to the actual automation of a process is generally very people-focused. A lack of change management will unfortunately cause many issues in the long term. Organizations need to keep their people aligned with their overall goals.” Security, mainly authentication, is also a key concern, Barbin said. “Any automation, API [application programming interface] or other, requires some means to pass access credentials,” he said. “If the systems that automate and contain those credentials are compromised, access to the connected systems could be too.” To help minimize that risk, Barbin suggests using SAML 2.0 and other technologies that take stored passwords out of the systems. Another pitfall is selecting only one technology as the automation tool of choice. Typically organizations need multiple technologies to get the best results, said Maureen Fleming, program vice president for intelligent process automation research at IDC.


How can IT leaders address ‘quiet quitting’?

While this is less likely to be an issue if staff are driven by the organisation’s vision and purpose, as is often the case with tech startups, it is still “important to look at what the expectations are on both sides, what’s reasonable and where compromises could be made”, she says. Klotz also suggests that part of the reason why some IT leaders, among others, have reacted so negatively to the idea of quiet quitting is concern that “paying extra for everything” could hit profit margins, which in turn could put the company out of business, particularly in economically difficult times. But he also points to the dynamic nature of the tech industry, which requires discretionary working at times simply to deliver on projects. “It’s only if you ask people to go above and beyond without compensation that it gets exploitative rather than being part of a healthy functioning relationship,” Klotz says. “But many companies ask employees to do extra almost as part of the job description, which is partly why they provide amazing benefits and such good compensation – people know what they’re getting into and are rewarded for it.”


Applying Enterprise Risk Management to Cyberrisk

Both the reality of cyberthreats and regulatory changes should make it clear to boards, owners and management that there is a need for better management of cybersecurity. Enterprise risk management (ERM) is a tool that management and the board can use to help manage risk across the enterprise, including cyberrisk. The Committee of Sponsoring Organizations of the Treadway Commission (COSO) ERM framework and International Organization for Standardization (ISO) 31000 are two prominent frameworks for ERM. Both frameworks emphasize that for effective ERM, an organization needs oversight from senior management, an organizational structure to support ERM and qualified staff. These and other capabilities that are needed to support ERM are also necessary to support cybersecurity and manage cyberrisk; therefore, the contents of both frameworks are easily and aptly applied to cybersecurity. Organizations can learn about the consequences of ineffective enterprise management of cybersecurity from many examples around the world, including the 2021 ransomware attack on Ireland’s Health Services Executive (HSE).


Why to Rethink and Update Approaches to Payment Security Management

“CISOs are increasingly challenged in their efforts to secure payment security compliance, and in convincing board members and other stakeholders of the importance and significance of securing strategic support and resources,” Hanson explains. The 2022 Payment Security Report points out that CISOs are often using outdated methods to win that support, and that all stakeholders need a change in approach. “Rather than taking a check-the-box approach to compliance, CISOs and other security leaders need to take an out-of-the-box, thinker’s approach that involves implementing frameworks and models,” Hanson says. “This is especially true for those taking the Customized approach to compliance.” MacLeod says there are several key stakeholders in organizations who ensure payment security compliance, from the CEO and CIO across to the CISO and CFO -- and these roles are changing as the payments industry evolves. ... As a result, stakeholders such as the CIO and CISO are playing an increasingly important role in ensuring payment security compliance.


Five defence-in-depth layers to implement for business security success

Businesses have many wonderful applications at their fingertips, with the average user having access to 5-10+ high-value business apps. These contain sensitive resources such as customer information, intellectual property, and financial data, making them a key target for attackers. Unfortunately, 80 per cent of businesses have faced users misusing or abusing these apps in the last year. Simply requiring a login is not enough to keep them safe – the moment a user steps away from their screen while still logged in, all of that valuable data is exposed. The defensive layer: a login only verifies a user’s identity at one point – so effective security controls here will continue to monitor, record, and audit user actions after authentication. Enhancing the visibility available to security teams offers many benefits, including being able to identify the source of a security incident (and therefore respond) much quicker. ... Almost all businesses benefit from using third party tools, but they offer risks too, as integration often requires creating super-user access to clients’ systems. 


The future of IT: decentralization and collaboration

As the role of IT evolves and collaboration increases, IT leaders are increasingly working as partners – rather than technology gatekeepers – with department heads. This collaboration and decentralization of IT across the enterprise gives employees self-sufficiency and autonomy when making technology decisions for their departments. They no longer depend on the IT team for their process automation, tool choices, or technology operations. ... IT personnel must clarify to all employees which applications are allowed on the corporate network. Employees should always inform IT personnel about their use of non-sanctioned applications and devices. If employees are downloading non-sanctioned apps and using non-sanctioned devices to access the corporate network, the IT department may have trouble preventing malware from accessing the network. When employees are open and honest about the devices and applications they use, it is much easier for IT personnel to mitigate rogue downloads and keep the network safe. Also, with social engineering efforts on the rise, IT must teach all other employees about popular attack methods, such as phishing and business email compromise.


Craftleadership: Craft Your Leadership as Developers Craft Code

There are other common practices in software development that apply to management. First, organize your budgeting process as a CI/CD pipeline. Make budget definition something that is easily repeatable and that fits your organization. CI/CD lets you get rid of tedious tasks by putting them in a pipeline, and budgeting is one of the most tedious things I have found I have to do as a manager. Second, master your tools. If MS Excel is the tool used by the managers in your organization, be an Excel master. Third, try to be reactive in your decisions, as in reactive programming. Be asynchronous when making decisions; as much as possible, try to reduce the “commit” phase, that is, the meetings where everyone must be present to say they agree. In my case, I think it is necessary to maintain these meetings where everybody agrees on different things. Yet, in these meetings, I never address an issue that I haven’t had the time to discuss thoroughly with everyone beforehand; this could be through a simple asynchronous email loop where everyone has a chance to give his or her opinion.



Quote for the day:

"Successful leadership requires positive self-regard fused with optimism about a desired outcome." -- Warren Bennis

Daily Tech Digest - November 21, 2022

Achieve Defense-in-Depth in Multi-Cloud Environments

Many organizations are adopting log-based solutions (from endpoint to perimeter security), which is a good first step, but logs can be bypassed or disabled. Even worse, hackers can manipulate logs to give the appearance that “everything is fine” when, in fact, they are moving between users and resources and exfiltrating data. The solution to this problem is to normalize visibility across the locations where your organization’s data lives – from the cloud to on-premises environments and data centers. Because IT and security teams rely on logs, logs themselves are attractive targets for hackers today. Taking a defense-in-depth approach, rather than relying on logs alone, is now critical to ensuring that every single entry point to your organization is secure. Network intelligence plays a huge role in gaining visibility – it is the only way to ensure visibility into all of the data in motion across your entire infrastructure and prevent risks. ... Just as cloud infrastructure management is a shared responsibility within the organization, so must enterprise security, including data security, be a shared responsibility.


A Serverless-First Mindset in an Evolving Landscape

A serverless-first mindset is no doubt beneficial in a number of ways, but some businesses may have reservations about the potential for vendor lock-in, the security offered by the cloud provider, existing sunk costs, and difficulties in debugging and development environments. However, even in the most serverless-averse organisations, this mindset can benefit a select part of the business. Looking at a bank’s operations, for example: while the maintenance of a traditional network infrastructure remains crucial for the uptime of the underlying database, a serverless-first mindset gives employees the flexibility to develop consumer-facing apps and other solutions as consumer demand increases. Agile and serverless strategies typically go hand in hand, and both can encourage quick development, modification and adaptation.


IT talent: The 3 C's for life/work balance

Compensation and benefits are not just lifestyle issues. Although these have virtually nothing to do with how much we enjoy our time at work or how far and fast we advance our careers, they carry a lot of psychological value in our culture because they feed ego and self-esteem. Few people who love their job, have great career prospects, work for a wonderful boss, and have a short commute will move simply for the money. Conversely, many are looking to leave high-paying jobs because their boss is a jerk, the commute is too long, or their skills are outdated. Many candidates initially cite compensation as their top criterion to make a move. Still, I have yet to meet a candidate who would accept a position sight unseen without knowing specific details of the job’s other C's. Big money or great benefits have never made a bad job good. Compensation comes to mind first because it is tangible, measurable, and has psychological power, but underlying its number-one ranking is the assumption that all the other criteria are met. Like everything else, compensation and benefits for a specific role are determined by an ever-changing marketplace.


Extortion Economics: Ransomware's New Business Model

This industrialization of cybercrime has created specialized roles in the RaaS economy. When companies experience a breach, multiple cybercriminals are often involved at different stages of the intrusion. These threat actors can gain access by purchasing RaaS kits off the Dark Web, consisting of customer service support, bundled offers, user reviews, forums, and other features. Ransomware attacks are customized based on target network configurations, even if the ransomware payload is the same. They can take the form of data exfiltration and other impacts. Because of the interconnected nature of the cybercriminal economy, seemingly unrelated intrusions can build upon each other. For example, infostealer malware steals passwords and cookies. These attacks are often viewed as less serious, but cybercriminals can sell these passwords to enable other, more devastating attacks. However, these attacks follow a common template. First comes initial access via malware infection or exploitation of a vulnerability. Then credential theft is used to elevate privileges and move laterally.


7 Microservice Design Patterns To Use

Saga pattern - This microservice design pattern provides transaction management using a sequence of local transactions. A saga guarantees that either all of its operations complete, or the corresponding compensating transactions are run to undo the previously completed work. Furthermore, in a saga, a compensating transaction should be retriable and idempotent; these two principles ensure that transactions can be managed without manual intervention. The pattern is also a way of managing data consistency across microservices in distributed transactions. ... Event Sourcing - Event sourcing defines an approach to handling data operations driven by a sequence of events, each of which is recorded in an append-only store. The app code sends a series of events describing every action that happened on the data to the event store. Typically, the event store publishes these events so consumers can be notified and handle them if required. For instance, consumers could initiate tasks that apply the events’ operations to other systems, or take any other action needed to complete an operation.
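As a concrete illustration of the event-sourcing description above, here is a minimal Python sketch (all names are hypothetical, not taken from any particular framework) of an append-only store that publishes events to subscribers and rebuilds state by replaying history:

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class EventStore:
    """Append-only log of events; current state is derived by replaying them."""
    _events: list = field(default_factory=list)
    _subscribers: list = field(default_factory=list)

    def append(self, event: dict) -> None:
        self._events.append(event)          # events are never mutated or deleted
        for handler in self._subscribers:   # publish to consumers (projections, audits)
            handler(event)

    def subscribe(self, handler: Callable[[dict], None]) -> None:
        self._subscribers.append(handler)

    def replay(self) -> dict:
        """Rebuild account balances from the full event history."""
        balances: dict = {}
        for e in self._events:
            delta = e["amount"] if e["type"] == "deposit" else -e["amount"]
            balances[e["account"]] = balances.get(e["account"], 0) + delta
        return balances

store = EventStore()
audit = []
store.subscribe(audit.append)  # a consumer notified of each published event
store.append({"type": "deposit", "account": "a1", "amount": 100})
store.append({"type": "withdrawal", "account": "a1", "amount": 30})
print(store.replay())  # {'a1': 70}
```

Because the log is append-only, a "withdrawal" is recorded as a new compensating event rather than an edit to the deposit, which is the same intuition behind a saga's compensating transactions.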


Enterprises embrace SD-WAN but miss benefits of integrated approach to security

When asked to list the challenges they faced when taking a do-it-yourself (DIY) approach to SD-WAN, respondents cited difficulties related to hiring and retaining a skilled in-house workforce, keeping up with technology developments and the ability to negotiate favourable terms with technology vendors. “Now that SD-WAN has matured and has been widely adopted, the complexity of deployments has grown, challenging enterprises on multiple fronts and compromising their ability to realise the full benefits of the technology,” said James Eibisch, research director, European infrastructure and telecoms, at IDC, commenting on the study. “Enterprises are increasingly reliant on the resources and expertise of a managed service provider to ensure they deploy SD-WAN in a way best suited to meet their organisations’ objectives. Security approaches like secure access service edge (SASE) that combine the benefits of SD-WAN with zero-trust network access and content filtering features are well poised to dominate the next phase of SD-WAN enhancements as enterprises continue to enable the cloud IT model and a hybrid workforce,” he added.


Quantum computing: Should it be on IT’s strategic roadmap?

Quantum computing is a nascent field. Few companies are planning to purchase quantum computers, but some are starting to use them for competitive advantage. For this reason alone, quantum computing should have a place on IT strategic roadmaps. Financial services institutions like banks and brokerage houses are beginning to experiment with quantum computing as a way to process large volumes of financial transactions more quickly. Quantum computing can also be used for financial risk analysis, and financial services companies are already using it for fraud detection. It can be used to assess worldwide supply chain risks such as weather, strikes and political unrest, with an eye toward eliminating supply chain bottlenecks before they happen. Pharmaceutical companies are experimenting with quantum computing as a way to assess the viability of new drug combinations and their beneficial and adverse effects on humans. The goal is to reduce R&D costs and speed new products to market. They are also seeking to customize drugs to each individual patient’s situation.


Big Tech Layoffs: A Flood of Talent vs the Hiring Crisis

There has been a sea change in the prospects certain big tech players anticipated would continue to buoy their sector. Sachin Gupta, CEO of HackerEarth, says many big tech and social media platforms saw explosive growth when the pandemic changed spending patterns and drove moves to work remotely and conduct more activities online. “What the businesses started thinking was this was going to last forever, which is very natural,” he says. It is very difficult to be in the midst of such a wave, he says, and then predict that it would not continue. The reasons behind the recent layoffs and firings differ, of course. Meta’s troubles include not seeing expected traction -- such as its exploration of the metaverse. Meanwhile, Twitter is in the throes of a regime change that has been acrimonious for at least some of the rank and file of the company, which has seen sweeping layoffs, resignations, and outright firings of personnel new CEO Elon Musk no longer wanted to darken the company’s door -- office doors that Musk abruptly ordered to be shut (temporarily) and locked last week even to remaining employees.


Creating an SRE Practice: Why and How

The most important first step is to adopt the SRE philosophies mentioned in the previous section. The one that will likely have the fastest payoff is to strive to eliminate toil. CI/CD can do this very well, so it is a good starting point. If you don't have a robust monitoring or observability system, that should also be a priority so that firefighting for your team is easier. ... You can't boil the ocean. Everyone will not magically become SREs overnight. What you can do is provide resources to your team (some are listed at the end of this article) and set clear expectations and a clear roadmap for how you will go from your current state to your desired state. A good way to start this process is to consider migrating your legacy monitoring to observability. For most organizations, this involves instrumenting their applications to emit metrics, traces, and logs to a centralized system that can use AI to identify root causes and pinpoint issues faster. The recommended approach to instrument applications is using OpenTelemetry, a CNCF-supported open-source project that ensures you retain ownership of your data and that your team learns transferable skills.
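The article recommends OpenTelemetry for this instrumentation; as a standard-library-only sketch of the underlying idea (the `span` helper and the `events` list are hypothetical stand-ins for a tracing SDK and a centralized backend), the following records timed spans around units of work:

```python
import json
import time
from contextlib import contextmanager

events = []  # stand-in for a centralized telemetry backend

@contextmanager
def span(name: str, **attributes):
    """Record a timed span around a block of work, as tracing libraries do."""
    start = time.monotonic()
    try:
        yield
    finally:
        events.append({
            "name": name,
            "duration_ms": (time.monotonic() - start) * 1000,
            **attributes,  # arbitrary key/value context, like span attributes
        })

with span("checkout", user="u42"):
    time.sleep(0.01)  # the instrumented operation

print(json.dumps(events[0]))
```

In a real migration the `span` context manager would be replaced by the tracing SDK's equivalent, and `events` by an exporter shipping data to the observability platform.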


The Challenge of Cognitive Load in Platform Engineering

You must never forget that you are building products designed to delight their customers - your product development teams. Anything that prevents developers from smoothly using your platform, whether a flaw in API usability or a gap in documentation, is a threat to the successful realisation of the business value of the platform. With this lens of cognitive load theory, delight becomes a means of qualifying the cognitive burden the platform is removing from the development teams and their work to accomplish their tasks. The main focus of the platform team, as described by Kennedy, is "on providing ‘developer delight’ whilst avoiding technical bloat and not falling into the trap of building a platform that doesn’t meet developer needs and is not adopted." She continues by noting the importance of paved paths, also known as Golden Paths: by offering Golden Paths to developers, platform teams can encourage them to use the services and tools that are preferred by the business.



Quote for the day:

"Leadership is familiar, but not well understood." -- Gerald Weinberg

Daily Tech Digest - November 20, 2022

AI experts question tech industry’s ethical commitments

For Wachter, the “cutting costs and saving time” mindset that permeates AI’s development and deployment has led practitioners to focus almost exclusively on correlation, rather than causation, when building their models. “That spirit of making something quick and fast, but not necessarily improving it, also translates into ‘correlation is good enough – it gets the job done’,” she says, adding that the logic of austerity that underpins the technology’s real-world use means that the curiosity to discover the story between the data points is almost entirely absent. “We don’t actually care about the causality between things,” says Wachter. “There is an intellectual decline, if you will, because the tech people don’t really care about the social story between the data points, and social scientists are being left out of that loop.” She adds: “Really understanding how AI works is actually important to make it fairer and more equitable, but it also costs more in resources. There is very little incentive to figure out what is going on [in the models].” Taking the point further, McQuillan describes AI technology as a “correlation machine” that, in essence, produces conspiracy theories.


The Next Generation of Supply Chain Attacks Is Here to Stay

As the vast majority of the workforce has gone digital, organizations' core systems have been moving to the cloud. This accelerated cloud adoption has exponentially increased the use of third-party applications and the connections between systems and services, unleashing an entirely new cybersecurity challenge. Three main factors are driving the rise in app-to-app connectivity: product-led growth (PLG), an era of bottom-up software adoption led by software-as-a-service (SaaS) players like Okta and Slack; DevOps, with dev teams freely generating and embedding API keys; and hyperautomation, where low-code/no-code platforms mean "citizen developers" can integrate and automate processes with the flip of a switch. The vast scope of integrations is now easily accessible to any kind of team, which means time saved and increased productivity. But while this makes an organization's job easier, it blurs visibility into potentially vulnerable app connections, making it extremely difficult for IT and security leaders to have insight into all of the integrations deployed in their environment, which expands the organization's digital supply chain.


Cultivating social emotional learning in the metaverse

Interactions and learning trigger feelings and emotions. There is a need to develop emotional awareness, to pause and notice the emotional signals of the body. The practice of pause – the conscious allotting of space and time to look inwards and notice physical sensations like a ‘racing pulse’, a ‘shaking leg’ or a ‘clammy hand’ is a must for well-being. When things seem to be falling apart, it is useful to breathe. Evidence suggests that, by counting our breaths and centring our breathing, we calm our minds. Whether dealing with difficult conversations with colleagues, family, friends, teachers or students, the ability to regulate emotion and attention is a well-being practice proven to mitigate accompanying anxiety, fear, anger or despair. ... Feeling a pit in one’s stomach or a thumping heart are physical symptoms that often accompany intense emotional responses. At such times, a friend; app; conscious trained practice like counting numbers, breaths or tiles on the floor; time-out or break; or walking can all be good ways to physically distract focus and allow some of the intensity of the emotion to diminish.


How Much Automation Is Too Much?

For many forms of automation, deskilling isn’t a serious problem. Knowledge workers in general, including ops personnel, may face many routine, repeatable tasks in their day-to-day work that don’t require a level of skill that would cause an issue if that skill were lost. All such routine tasks are subject to automation without concern. At the other extreme, organizations may aspire to "lights out" production environments, so fully automated that there’s no reason to keep the lights on, because there are no people on duty. Any organization with such a lights out environment is likely to lose any staff who might be able to fix something if it goes wrong, either via deskilling or attrition. As AI-based automation becomes increasingly sophisticated, therefore, organizations will reach some optimal point where the advantages of automation sufficiently balance any disadvantages. Finding this optimum depends upon the people involved — the skilled workers who must somehow accommodate automation in their day-to-day work. Be sure to listen to the senior-level people who are adept at analogizing. They can solve problems that automation will never be able to solve.
 

Bringing a Product Mindset into DevOps

A product mindset is about delivering things that provide value to our users, within the context of the organisation and their strategy, and doing so sustainably (i.e. balancing the now and the future). For the purpose of this article, I will use product thinking, product mindset and product management very much interchangeably. ... In practice this means achieving product-market fit by balancing what our users need, want and find valuable (desirability), what we need to achieve (and can afford) as an organisation (viability) and what is possible technically, culturally, legally, etc. (feasibility), and doing this without falling into the trap of premature optimisation or closing options too early. To give a tiny, very specific, but quite telling example: for the medical device organisation we chose Bash as the scripting language because the DevOps lead was comfortable with it. Eventually we realised that the client’s engineers had no Bash experience, but as a .Net shop were far more comfortable with Python. Adding a user-centric approach, which is part of a product mindset, at an early stage would have prevented this mistake and the resulting rework.


Solving brain dynamics gives rise to flexible machine-learning models

Last year, MIT researchers announced that they had built “liquid” neural networks, inspired by the brains of small species: a class of flexible, robust machine learning models that learn on the job and can adapt to changing conditions, for real-world safety-critical tasks like driving and flying. The flexibility of these “liquid” neural nets made them a natural fit for our connected world, yielding better decision-making for many tasks involving time-series data, such as brain and heart monitoring, weather forecasting, and stock pricing. But these models become computationally expensive as their number of neurons and synapses increases, and they require clunky computer programs to solve their underlying, complicated math. All of this math, like many physical phenomena, becomes harder to solve with size, meaning lots of small steps must be computed to arrive at a solution. Now, the same team of scientists has discovered a way to alleviate this bottleneck: by solving the differential equation behind the interaction of two neurons through synapses, they unlock a new type of fast and efficient artificial intelligence algorithm.
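The speed-up from a closed-form solution can be illustrated with a toy example (this is a generic leaky-integrator neuron, not the MIT team's actual model): a numerical solver must grind through thousands of small time steps, while the exact solution of the same differential equation is evaluated in one shot.

```python
import math

def euler_leaky(x0, inp, tau, t_end, n_steps):
    """Numerically integrate dx/dt = -x/tau + inp with many small Euler steps."""
    dt = t_end / n_steps
    x = x0
    for _ in range(n_steps):
        x += dt * (-x / tau + inp)
    return x

def closed_form_leaky(x0, inp, tau, t):
    """Exact solution of the same ODE, evaluated without any stepping."""
    x_inf = inp * tau  # steady-state value the neuron relaxes toward
    return x_inf + (x0 - x_inf) * math.exp(-t / tau)

# The stepped solver needs thousands of iterations to approximate
# what the closed form computes with a single exponential.
approx = euler_leaky(x0=0.0, inp=1.0, tau=0.5, t_end=2.0, n_steps=10_000)
exact = closed_form_leaky(x0=0.0, inp=1.0, tau=0.5, t=2.0)
```

The two results agree closely, but the closed form does constant work per query regardless of the time horizon, which is the kind of saving the researchers are after at the scale of whole networks.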


4 Examples of Microservices Architectures Done Right

Microservices are everywhere in today’s increasingly virtual, decentralized world. As of 2021, 85% of organizations with 5,000 or more employees use microservices in some capacity. Even more tellingly, 0% report having no intention of adopting microservices. Clearly, microservices are here to stay, and more and more businesses will adopt them in the coming months and years. This is good news, as microservices are capable of so much. The popularity comes with its own risks, though: some businesses that integrate microservices into their existing workflow will need help figuring out what to do with them. ... Uber would not be able to exist if not for microservices. Though short-lived, its monolithic architecture created insurmountable hurdles to growth. Without microservices, the ride-sharing app couldn’t fix bugs as they occurred, develop or launch new services, or transition to a global marketplace. Its monolithic structure was prohibitively complex, requiring developers to have extensive experience with the system to make even simple changes.


Chief engineering officer: A day in the life

It’s easy to get caught up in the tactical work, so I make sure to reserve time for high-level plans and strategy. This means following tech news and blogs to stay abreast of the latest technology trends, keeping an eye on market news, reading the latest analyst research, meeting up with my peers in our private equity portfolio companies, and more. It’s important for me not just to be on top of the technology, but to understand where we’re taking the business in the future. This kind of thinking is what helped lead to Gartner positioning Boomi as a Leader in the Gartner Magic Quadrant for Enterprise Integration Platform as a Service (EiPaaS) for the eighth consecutive year in 2021. ... If you’re thinking of getting into software engineering, here’s my advice: Just do it. It’s a high-demand career, and there is a persistent shortage of strong talent in the industry. You’ll find almost limitless opportunities once you get started. And there has never been a better time to do so, with many technological advancements that have lowered barriers to entry. As you move into management, though, it’s important to remember that your job is no longer to code – it’s to satisfy customer requirements and meet business goals.


5 unexpected ways to improve your architecture team

A key consideration for large organizations is ensuring that no major component can function with complete autonomy or in a silo. That’s not to say a squad should be incapable of getting anything done without input from other squads. Quite the contrary: for minor decisions or implementations, each squad is empowered to move as quickly as possible. For major decisions with little or no recourse to make changes later, collaboration is key to the "measure twice, cut once" approach to critical decision-making. These enforced "checks and balances" mean that no one chapter can unilaterally stray too far outside our strategic bounds. At a lower level, members of the Delivery Management squad work within other squads but all report to the same manager. Embedding them within other teams helps make every squad's activities as consistent as possible. This allows minimal impact on sprints when, for example, a delivery manager is out on leave, because process alignment is a goal within the Delivery Management function.


Why companies can no longer hide keys under the doormat

CIOs need to be asking their teams questions to assess this potential exposure and understand the risk, as well as putting plans in place to address it. Fortunately, recent breakthroughs have been able to eliminate this encryption gap and maintain full protection for private keys. Leading CPU vendors have added security hardware within their advanced microprocessors that prevents any unauthorized access to code or data during execution, or afterwards in what remains in memory caches. These chips are now in most servers, particularly those used by public cloud vendors, using a technology generally known as confidential computing. This “secure enclave” technology closes the encryption gap and protects private keys, but it has required changes to code and IT processes that can involve a significant amount of technical work. It is specific to a particular cloud provider (meaning it must be altered for use in other clouds) and complicates future changes in code or operational processes. Fortunately, new “go-between” technology eliminates the need for such modifications and potentially offers multi-cloud portability with unlimited scale.


Quote for the day:

"The leadership team is the most important asset of the company and can be its worst liability" -- Med Jones