Daily Tech Digest - November 27, 2022

Business Case – Why Enterprise Architecture Needs to Change – Part I

The solution to moving out of the “stone age” is to use a digital end-to-end approach for Architecture content (whether EA or SA), and provide openness and transparency across EA, project, and reusable component Architectures. Just like any digital approach to any business problem, the use of structured data is key. The best structured-data language for Architecture is arguably ArchiMate, which has a rich notation covering the depth and breadth of Architecture modelling, and a rich set of connectors to link elements. ... Even if the new hire has significant experience in the given industry, the new organisation’s IT platform and processes will likely vary greatly from the person’s past experience. It takes several months or longer for new staff to accumulate enough knowledge about how the business and IT platform work to operate effectively without help from other staff. The cost of this knowledge gap is the new person delivering outcomes more slowly than other staff and consuming other staff’s time unnecessarily by asking questions like ‘what systems do we have?’, ‘what does the business do?’, ‘how does system X work?’ and so on.


Why API security is a fast-growing threat to data-driven enterprises

API security focuses on securing this application layer and addressing what can happen if a malicious hacker interacts with the API directly. API security also involves implementing strategies and procedures to mitigate vulnerabilities and security threats. When sensitive data is transferred through an API, a protected API can guarantee the message’s secrecy by making it available only to apps, users and servers with appropriate permissions. It also ensures content integrity by verifying that the information was not altered in transit. “Any organization looking forward to digital transformation must leverage APIs to decentralize applications and simultaneously provide integrated services. Therefore, API security should be one of the key focus areas,” said Muralidharan Palanisamy, chief solutions officer at AppViewX. On how API security differs from general application security, Palanisamy said that application security is similar to securing the main door, which needs robust controls to prevent intruders, while API security is about securing the windows and the backyard.
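
A minimal sketch of one common way an API can enforce the integrity guarantee described above: an HMAC signature computed over the payload with a shared secret. The secret, payload, and helper names here are illustrative, not any particular vendor's API.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret; in practice this would be provisioned per client
# and stored in a secrets manager, never hard-coded.
SECRET_KEY = b"example-shared-secret"

def sign_payload(payload: dict) -> str:
    """Produce an HMAC-SHA256 signature the client sends alongside the request."""
    body = json.dumps(payload, sort_keys=True).encode("utf-8")
    return hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()

def verify_payload(payload: dict, signature: str) -> bool:
    """Server-side check: recompute the signature and compare in constant time."""
    expected = sign_payload(payload)
    return hmac.compare_digest(expected, signature)

payload = {"account": "12345", "amount": 250}
sig = sign_payload(payload)
assert verify_payload(payload, sig)          # untampered message passes
payload["amount"] = 9999
assert not verify_payload(payload, sig)      # altered message is rejected
```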


Artificial Intelligence Can Enhance Banking Compliance

Technology has changed our society, and banks and other financial institutions have digitalized their operations at a rapid pace as well. However, the financial crime compliance units of these institutions still rely mainly on heavy manual processes. The key reason for compliance units’ cautious approach to AI and automation has been uncertainty about the technology: do regulators approve machine-based decision-making, and is machine learning logic fair in identifying suspicious activities? However, there is a clear need for technology in financial crime compliance. In recent years, Ireland has witnessed a rise in financial crime, with illegal proceeds making their way into the financial system, often from international sources. Last month, data from Banking and Payments Federation Ireland showed that over €12m was transferred illegally through so-called ‘money mule’ accounts in the first six months of the year. Compared with the same period last year, the number of bank accounts linked to the criminal practice in Ireland almost doubled to 3,000 between January and June 2022.


Big tech has not monopolized big A.I. models, but Nvidia dominates A.I. hardware

Interest in A.I. software startups targeting business use cases also remains formidable. While the total amount invested in such companies fell 33% last year as the venture capital market in general pulled back on funding in the face of fast-rising interest rates and recession fears, the total was still expected to reach $41.5 billion by the end of 2022, which is higher than 2020 levels, according to Benaich and Hogarth, who cited Dealroom for their data. And the combined enterprise value of public and private software companies using A.I. in their products now totals $2.3 trillion—which is also down about 26% from 2021—but remains higher than 2020 figures. But while the race to build A.I. software may remain wide open for new entrants, the picture is very different when it comes to the hardware on which these A.I. applications run. Here Nvidia’s graphics processing units completely dominate the field and A.I.-specific chip startups have struggled to make any inroads. The State of AI notes that Nvidia’s annual data center revenue alone—$13 billion—dwarfs the valuation of chip startups such as SambaNova ($5.1 billion), Graphcore ($2.8 billion) and Cerebras ($4 billion). 


Predictive Analytics in Healthcare

Clinicians, healthcare organizations and health insurance companies use predictive analytics to estimate the probability of their patients developing certain medical conditions, such as cardiac problems, diabetes, stroke or COPD. Health insurance companies were early adopters of this technology, and healthcare providers now apply it to identify which patients need interventions to avert conditions and enhance health outcomes. Clinicians also use predictive analytics to identify patients whose conditions are progressing into sepsis. As with many applications of predictive analytics in healthcare, however, the ability to use this technology to predict how a patient’s condition might progress is limited to certain conditions and far from widely deployed. Healthcare organizations also use predictive analytics to identify which hospital inpatients are likely to exceed the average length of stay for their conditions by analyzing patient, clinical and departmental data. This insight allows clinicians to adjust care protocols to keep patients’ treatments and recoveries on track. That in turn helps patients avoid overstays, which not only drive up expenses and divert limited hospital resources, but also may endanger patients by keeping them in environments that could expose them to secondary infections.


How to Set Yourself Up For Success As a New Data Science Consultant With No Experience

The key is to know what you’re good at and focus on it. Going out on your own as a consultant is scary enough — ensure that you’re going to be marketing and using skills that you’re comfortable with. Having confidence that you can successfully produce results using your tools and skills of choice goes a long way toward becoming a successful consultant. Additionally, do some market research to see where your niche could lie. While they say that data scientists should all be generalists in the beginning, I believe that consultants should specialize in niches that complement their skills and their alternative knowledge. For example, I would focus on becoming a data science consultant who specializes in helping companies solve their environmental problems — this would combine my specialized skills (data science) with my alternative knowledge and educational background in environmental science. Companies love working with consultants who have first-hand experience in their sector, so it can’t hurt to play to your strengths, past employment, education, or interests.


The future of employment in IT sector

While companies keep up with the changing economic climate, what’s become undeniable is the war for recruiting good talent, now more than ever. There has been a significant change in employees’ needs and priorities. Top talent is re-evaluating their careers based on aspects like flexibility, career growth and employee value proposition. Companies must therefore invest in ‘Active Sourcing’ to create a rich pipeline, and not only recruit candidates but also train them for the upcoming 4th industrial revolution. Companies need to invest in their people’s skills and holistic development, not forgetting to create a safe, healthy work environment to retain the talent. As dynamic as the sector is, one cannot deny the menace of tech burnout. This blog describes it perfectly: ‘Tech burnout refers to the extreme exhaustion and stress that many employees in the technology sector experience. While burnout has always been an issue in many industries, 68% of tech workers feel more burned out than they did when they worked at an office.’ Technology is the most rapidly evolving industry, with a challenging work environment.


On the Psychology of Architecture and the Architecture of Psychology

Most of our intelligence, however, consists of patterns that we execute efficiently, automatically and quickly. Some of these are natural elements, which are fixed: e.g. a propensity to communicate and use tools, to perform ‘mental travel’ — memory, scenarios, fantasy — all of it based on pattern creation and reinforcement. Some of these elements may even be genetic (like the basic strategies of wait-and-see versus go-for-it you can observe in small children), but most of it is probably learned. All of this is part of Kahneman’s ‘System 1’. We learn by employing our capability for logic and ratio and our copying-and-being-reinforced capability — and while we do a lot more of the latter than the former, culturally, we tend to believe that the reverse is true. Learning by reinforcement also includes learning by doing. Chess grand masters have very effective fast ‘patterns’ in the ‘malleable instinct’ part of their brains, and the difference between grand masters and good amateurs is not their power of logic and ratio — calculating, thinking moves ahead — but their patterns that identify potentially good moves before they start to calculate, and these patterns come from playing a lot of games. You also have to maintain your patterns: it is ‘use it or lose it’.


7 Common Data Quality Problems

Data inconsistencies: This problem occurs when multiple systems store information without using an agreed-upon, standardized method of recording and storing it. Inconsistency is sometimes compounded by data redundancy. ... Fixing this problem requires the data to be homogenized (or standardized) before or as it comes in from various sources, possibly through the use of an ETL data pipeline. Incomplete data: This is generally considered the most common issue impacting Data Quality. Key data columns will be missing information, often causing analytics problems downstream. A good method for solving this is to install a reconciliation framework control. This control would send out alerts (theoretically to the data steward) when data is missing. Orphaned data: This is a form of incomplete data. It occurs when some data is stored in one system but not the other. If a customer’s name is listed in table A but their account is not listed in table B, this would be an “orphan customer.” And if an account is listed in table B but is missing an associated customer, this would be an “orphan account.”
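
As an illustration of the orphaned-data checks described above, here is a minimal pandas sketch; the table and column names are invented stand-ins for "table A" and "table B".

```python
import pandas as pd

# Illustrative stand-ins for "table A" (customers) and "table B" (accounts).
customers = pd.DataFrame({"customer_id": [1, 2, 3], "name": ["Ada", "Ben", "Cy"]})
accounts = pd.DataFrame({"account_id": [10, 11], "customer_id": [1, 4]})

# Orphan customers: present in table A but with no account in table B.
orphan_customers = customers[~customers["customer_id"].isin(accounts["customer_id"])]

# Orphan accounts: present in table B but with no matching customer in table A.
orphan_accounts = accounts[~accounts["customer_id"].isin(customers["customer_id"])]

print(orphan_customers)  # Ben (2) and Cy (3) have no accounts
print(orphan_accounts)   # account 11 references missing customer 4
```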


Building IT Infrastructure Resilience For A Digitally Transformed Enterprise

At a minimum, resiliency means having stable operations, consistent revenue, manageable security risks, efficient workflows, and an informed and agile employee base. Having visibility over the operating systems of network devices can reduce network downtime and open doors to further efficiencies. If a business is resilient, it can maintain stable network operations, drive down IT costs and deliver a more robust service. Overall, when businesses can dramatically lower IT expenses and gain better visibility, they can expend resources on other projects that improve the quality of service—a win for all. From a regulatory perspective, regulators now want to see everything documented. Take mobile banking, for example; regulators want to know everything, including what code is being used on which servers as well as which people and processes have access to which services. Intelligently automated network operations leave enterprises better equipped to answer the questions regulators ask, such as how they're validating and how often they're doing a failover.



Quote for the day:

"A good general not only sees the way to victory; he also knows when victory is impossible." -- Polybius

Daily Tech Digest - November 26, 2022

How automation can solve persistent cybersecurity problems

Think about the normal day for a security analyst. If we’re expecting them to handle alerts, events that have come up, and new attacks that are happening right now—that’s a lot of new information to look at and assess. How much time do they have to read dozens of RSS feeds, research blogs, industry and government reports, security vendor reports, news websites, and GitHub repositories? Collecting and making sense of all that data becomes crucial, but there’s no way individuals can do this on their own quickly enough. Being able to automate that process so you can get to the information that you are going to use now or later and filter out the noise is essential. Obviously, automation is a fundamental capability to reduce the burden of manual review and prioritization of alerts. But while a recent report on cybersecurity automation adoption finds that confidence in automation is rising, only 18% of respondents are applying automation to alert triage. Automation can also help mitigate risk from vulnerabilities in legacy systems. 
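
As a rough illustration of the noise filtering described above, here is a minimal sketch that scores collected items against watch terms and surfaces only the relevant ones. The items, terms, and weights are hypothetical; a real pipeline would ingest RSS feeds, vendor reports, and alert queues.

```python
# Score collected items (feed entries, vendor reports, alerts) against terms
# that matter to your environment, then surface only what clears a threshold.
WATCH_TERMS = {"ransomware": 3, "cve-2022": 2, "boa": 2, "phishing": 1}

def relevance(item: dict) -> int:
    """Sum the weights of every watch term found in the item's text."""
    text = (item["title"] + " " + item["summary"]).lower()
    return sum(weight for term, weight in WATCH_TERMS.items() if term in text)

items = [
    {"title": "New ransomware strain targets VPNs", "summary": "initial access via phishing"},
    {"title": "Conference roundup", "summary": "keynote highlights"},
    {"title": "CVE-2022-XXXX exploited in the wild", "summary": "patch now"},
]

# Keep only items above the noise floor, highest priority first.
triaged = sorted((i for i in items if relevance(i) > 0),
                 key=relevance, reverse=True)
for item in triaged:
    print(relevance(item), item["title"])
```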


The Board’s Role in Advancing Digital Trust

Boards have reported that the use of FAIR provides an organized means through which to identify the value of assets, design probable loss scenarios, and allocate capital to get the most bang for the buck. Some boards have reported FAIR has been useful in demonstrating to objective third parties, like regulators, that they have been prudent in managing digital risk. Reputational loss is widely considered the largest impact from a cyber incident. Unfortunately, reliable statistics are not yet available, but anecdotal evidence supports the popular belief. ... The ability of the board to appropriately ensure the company has the proper level of cyber resilience requires an understanding not only of an adverse event but also of the total cost of controls, ranging from the mundane to worst-case scenarios. Scenario-based exercises combined with CRQ techniques, like FAIR, provide an objective means to assess materiality and the most appropriate capital allocation. Allocating too little capital leaves you exposed, while too much wastes capital that could be better applied elsewhere. Determining the cost of a control is almost always the easy part.
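
FAIR decomposes risk into loss event frequency and loss magnitude, so capital-allocation questions are often explored with Monte Carlo simulation. Below is a minimal sketch of that idea; the distributions and parameters are invented for illustration and are not calibrated estimates from the article.

```python
import random

random.seed(7)

def simulate_annual_loss(trials: int = 50_000) -> list[float]:
    """FAIR-style simulation: annual loss = sum of per-event losses.
    Frequency and magnitude distributions below are illustrative assumptions."""
    losses = []
    for _ in range(trials):
        # Loss event frequency: ~0.8 qualifying incidents per year on average.
        events = sum(1 for _ in range(10) if random.random() < 0.08)
        # Loss magnitude per event: lognormal with a median around $250k.
        total = float(sum(random.lognormvariate(12.4, 1.0) for _ in range(events)))
        losses.append(total)
    return losses

losses = sorted(simulate_annual_loss())
mean_ale = sum(losses) / len(losses)
p95 = losses[int(0.95 * len(losses))]
print(f"Mean annualized loss exposure: ${mean_ale:,.0f}")
print(f"95th percentile (tail) loss:   ${p95:,.0f}")
```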


Top employee cybersecurity tips for remote work and travel

Trip or no trip, lock your SIM card. SIM-jacking (or SIM-swapping, unauthorized port-out or “slamming”) is a real and underreported crime where threat actors pretend to be you, contact your wireless provider and “port over” your SIM card to your (their) “new phone.” Imagine someone stealing your entire online life, including your social media accounts. In other words, your phone number is now theirs. All your password resets now run through the threat actor. Considering how many work credentials, social media accounts and apps run through your phone number, the nightmare of this crime quickly becomes evident. If you haven’t already done so, lock down your SIM card with your wireless provider. ... Use two-factor authentication (2FA) everywhere and with everything. When choosing how to receive the authentication code, always opt for token over text as it’s much more secure. At Black Hat 2022, a Swedish research team demonstrated exactly how insecure text authentications are. If a hacker has your login credentials and phone number, text-based authentication simply won’t protect you.
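
The reason token beats text: an authenticator token is computed locally from a shared secret and the device clock, so there is nothing for a SIM-swapper to intercept on the carrier network. Here is a stdlib-only sketch of the RFC 6238 TOTP algorithm that authenticator apps implement; the base32 secret is a well-known documentation example, not a real credential.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password (the algorithm behind
    authenticator apps). Unlike SMS codes, nothing travels over the
    carrier network, so SIM-swapping does not expose it."""
    key = base64.b32decode(secret_b32)
    counter = int(time.time()) // period          # 30-second time step
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # RFC 4226 dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Example secret (base32) -- in practice provisioned via the QR code
# an authenticator app scans during 2FA enrollment.
print(totp("JBSWY3DPEHPK3PXP"))
```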


AI accountability held back by ‘audit-washing’ practices

Published under the GMF think-tank’s Digital Innovation and Democracy Initiative, the report said that while algorithmic audits can help correct for the opacity of AI systems, poorly designed or executed audits are at best meaningless, and at worst can deflect attention from, or even excuse, the harms they are supposed to mitigate. This is otherwise known as “audit-washing”, and the report said many of the tech industry’s current auditing practices provide false assurance because companies are either conducting their own self-assessments or, when there are outside checks, are still assessed according to their own goals rather than conformity to third-party standards. “If well-designed and implemented, audits can abet transparency and explainability,” said the report. “They can make visible aspects of system construction and operation that would otherwise be hidden. Audits can also substitute for transparency and explainability. Instead of relying on those who develop and deploy algorithmic systems to explain or disclose, auditors investigate the systems themselves.


7 dos and don’ts for working with offshore agile teams

Many companies create business continuity plans to manage a crisis around key business operations. But these plans may overlook specifics for small offshore development teams or not account for intermittent disruptions to internet, power, or other resources that impact an offshore team’s safety, health, or productivity. “If you’re working with a global, distributed team, you need to accept the responsibilities that come with supporting your workforce—whether they are across the world or seated two desks away,” says Andrew Amann, CEO of NineTwoThree Venture Studio. “This means having a plan in place for when a global crisis limits your team members’ ability to work.” Amann offers several recommendations for developing a practical plan. “Cross-train employees, build relationships with development agencies, plan for difficulties with offshore payments, and make sure you stand behind your distributed teams when they need help,” he says.


Almost half of customers have left a vendor due to poor digital trust: Report

The road to digital trust is not always smooth sailing. The number one IT challenge cited was managing digital certificates, rated as important by 100% of enterprises, while regulatory compliance and handling the massive scope of what they are protecting was deemed important by 99% of respondents. Other challenges cited in the research included the difficulty of securing a complex, dynamic, multivendor network, and a lack of staff expertise. The report also points out that many common security practices have yet to be implemented. ... For companies still looking for ways to improve digital trust, DigiCert recommends making it a strategic imperative and recognizing the impact it has on business outcomes such as customer loyalty and revenue. DigiCert said it’s also important to remember that digital trust awareness is rising among users and customers, meaning that business success and reputation are directly tied to an organization’s ability to ensure digital trust at a high level.
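
Since certificate management tops the list of challenges, here is a minimal sketch of one piece that is easy to automate: checking how many days remain on a host's TLS certificate. The host is an example; a real job would iterate over an inventory of endpoints and alert when the margin drops below a threshold.

```python
import socket
import ssl
import time

def days_until_expiry(host: str, port: int = 443) -> float:
    """Fetch the host's TLS certificate and report days until it expires.
    One small, automatable slice of certificate lifecycle management."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    not_after = ssl.cert_time_to_seconds(cert["notAfter"])
    return (not_after - time.time()) / 86400

# Example host; a real inventory job would loop over every endpoint
# the organization owns.
print(f"example.com cert expires in {days_until_expiry('example.com'):.0f} days")
```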


A far-sighted approach to machine learning

The researchers focused on a problem known as multiagent reinforcement learning. Reinforcement learning is a form of machine learning in which an AI agent learns by trial and error. Researchers give the agent a reward for “good” behaviors that help it achieve a goal. The agent adapts its behavior to maximize that reward until it eventually becomes an expert at a task. But when many cooperative or competing agents are simultaneously learning, things become increasingly complex. As agents consider more future steps of their fellow agents, and how their own behavior influences others, the problem soon requires far too much computational power to solve efficiently. This is why other approaches only focus on the short term. “The AIs really want to think about the end of the game, but they don’t know when the game will end. They need to think about how to keep adapting their behavior into infinity so they can win at some far time in the future. Our paper essentially proposes a new objective that enables an AI to think about infinity,” says Kim.
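
For context, the standard way reinforcement learning values the future is a discounted return, where a discount factor close to 1 makes the agent far-sighted. This small sketch illustrates that general objective, not the paper's specific new formulation:

```python
def discounted_return(rewards, gamma: float) -> float:
    """Sum of gamma^t * r_t -- the standard infinite-horizon objective.
    gamma near 0 = myopic agent; gamma near 1 = far-sighted agent."""
    return sum((gamma ** t) * r for t, r in enumerate(rewards))

# A stream with small immediate rewards and a large payoff far in the future.
rewards = [1] * 50 + [100]

print(discounted_return(rewards, gamma=0.5))   # ~2.0: the distant payoff is invisible
print(discounted_return(rewards, gamma=0.99))  # ~100: the distant payoff dominates
```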


Demand for IT pros remains high even as layoffs continue

Even as layoffs continue, unemployment in the tech sector has remained at near-historic lows, hovering around 2.2%. That compares with the overall US unemployment rate of 3.7% as of October. So far this year, tech industry employment has increased by 193,900 jobs, 28% higher than the same period in 2021, according to a jobs report from CompTIA, a nonprofit association for the IT industry and workforce. “Tech hiring activity remains steady, but there are undoubtedly concerns of a slowing economy,” CompTIA CEO Tim Herbert said in a statement. While November’s job data is not expected to be as robust as the same period a year earlier (when 73,600 jobs were added), the overall projection is that it will remain at a status quo level, with hiring continuing at the same rate as in the last two quarters. “All-in-all, experienced IT Professionals will be in high demand,” Janco said. “Especially those who exhibit a strong work ethic and are results-oriented. Positions that will be in low demand will be administrative and non-line supervisors and managers.”


What I Learned in My First 6 Months as a Director of Data Science

The FAANG companies (Facebook-Apple-Amazon-Netflix-Google) can afford to pay amazing salaries. But most companies hiring data scientists are not like that. Don’t get me wrong! Data scientists still can make a very decent living! But in the world of actual practicing, non-tech data scientists, things are much more realistic. Unfortunately though, it means I am competing for talent against the FAANG companies. As such I have had to get very creative in where I advertise my postings and do my recruiting. Data scientists will always look for jobs at the FAANG companies, but they don’t always think about non-tech companies as employing data scientists. So this means I have learned that I have to be much more proactive in marketing my open roles. LinkedIn is great and recruiters can be helpful. However, I have also found great success in recruiting in unusual online forums — places like Discord, Slack, and Twitter. But make no mistake: recruiting data scientists is a full-contact sport! It is messy. You have to move quickly.


Five Key Components of an Application Security Program

Once an application architecture and design are defined, security risk assessments should be performed that identify and categorize the inherent security risk of the planned application architecture and the application’s expected functional capabilities. These assessments should cover the types of data, business processes, third-party systems and platforms, and information infrastructure with which the application will interact and through which it will store, process, and transmit data. By gaining insight into inherent security risk, appropriate security control objectives and associated security controls can be defined to manage risk appropriately within the applications. Controls can include, but are not limited to, the use of web application firewalls (WAFs) and application programming interface (API) security gateways, encryption capabilities, authentication and secrets management, logging requirements, and other security controls. The identification of security instrumentation requirements should also be included in the architecture and design stage of application development.



Quote for the day:

"Problem-solving leaders have one thing in common: a faith that there's always a better way." -- Gerald M. Weinberg

Daily Tech Digest - November 25, 2022

Ripe For Disruption: Artificial Intelligence Advances Deeper Into Healthcare

The challenges and changes needed to advance AI go well beyond technology considerations. “With data and AI entering healthcare, we are dealing with an in-depth cultural change that will not happen overnight,” according to Pierron-Perlès and her co-authors. “Many organizations are developing their own acculturation initiatives to develop the data and AI literacy of their resources in formats that are appealing. AI goes far beyond technical considerations.” There has been great concern about AI de-humanizing healthcare, but, once carefully considered and planned, it may prove to augment human care. “People, including providers, imagine AI will be cold and calculating without consideration for patients,” says Garg. “Actually, AI-powered automation for healthcare operations frees clinicians and others from the menial, manual tasks that prevent them from focusing all their attention on patient care. While other AI-based products can predict events, the most impactful are incorporated into workflows in order to resolve issues and drive action by frontline users.”


Extinguishing IT Team Burnout Through Mindfulness and Unstructured Time

Mindfulness is fundamentally about awareness. For it to grow, begin by observing your mental state, especially when you find yourself in a stressful situation. Instead of fighting emotions, observe your mental state as negative ones arise. Think about how you’d conduct a deep root cause analysis on an incident and apply that same rigor to yourself. The key to mindfulness is paying attention to your reaction to events without judgment. This can unlock a new way of thinking because it accepts your reaction, while still enabling you to do what is required for the job. This contrasts with being stuck behind frustration or avoiding new work as it rolls in. ... Mindfulness is an individual pursuit, while creativity is an enterprise pursuit, and providing space for employees to be creative is another key to preventing burnout. But there are other benefits as well. There is a direct correlation between creativity and productivity. Teams that spend all their time working on specific processes and problems struggle to develop creative solutions that could move a company forward.


Overcoming the Four Biggest Barriers to Machine Learning Adoption

The first hurdles with adopting AI and ML are experienced by certain businesses even before they begin. Machine learning is a vast field that pervades most aspects of AI. It paves the way for a wide range of potential applications, from advanced data analytics and computer vision to Natural Language Processing (NLP) and Intelligent Process Automation (IPA). A general rule of thumb for selecting a suitable ML use case is to “follow the money” in addition to the usual recommendations on framing the business goals – what companies expect Machine Learning to do for their business, like improving products or services, improving operational efficiency, and mitigating risk. ... The biggest obstacle to deploying AI-related technologies is corporate culture. Top management is often reluctant to take investment risks, and employees worry about losing their jobs. Businesses must start with small-scale ML use cases that demand realistic investments to achieve quick wins and persuade executives in order to assure stakeholder and employee buy-in. By providing workshops, corporate training, and other incentives, they can promote innovation and digital literacy.


Fixing Metadata’s Bad Definition

A bad definition has practical implications. It makes misunderstandings much more likely, which can infect important processes such as data governance and data modeling. Thinking about this became an annoying itch that I couldn’t scratch. What follows is my thought process working toward a better understanding of metadata and its role in today’s data landscape. The problem starts with language. Our lexicon hasn’t kept up with modern data’s complexity and nuance. There are three main issues with our current discourse about metadata:

- Vague language - We talk about data in terms of “data” or “metadata”. But one category encompasses the other, which makes it very difficult to differentiate between them. These broad, self-referencing terms leave the door open to being interpreted differently by different people.
- A gap in data taxonomy - We don’t have a name for the category of data that metadata describes, which creates a gap at the top of our data taxonomy. We need to fill it with a name for the data that metadata refers to.
- Metadata is contextual - The same data set can be both metadata and not metadata depending on the context. So we need to treat metadata as a role that data can play rather than a fixed category.


Addressing Privacy Challenges in Retail Media Networks

The top reason that consumers cite for mistrusting how companies handle their data is a lack of transparency. Customers know at this point that companies are collecting their data. And many of these customers won’t mind that you’re doing it, as long as you’re upfront about your intentions and give them a clear choice about whether they consent to have their data collected and shared. What’s more, recent privacy laws have increased the need for companies to shore up data security or face the consequences. In the European Union, there’s the General Data Protection Regulation (GDPR). In the U.S., laws vary by state, but California currently has the most restrictive policies thanks to the California Consumer Privacy Act (CCPA). Companies that have run afoul of these laws have incurred fines as big as $800 million. Clearly, online retailers that already have — or are considering implementing — a retail media network should take notice and reduce their reliance on third-party data sources that may cause trouble from a compliance standpoint.


For Gaming Companies, Cybersecurity Has Become a Major Value Proposition

Like any other vertical industry, games companies are tasked with protecting their organizations from all manner of cybersecurity threats to their business. Many of them are large enterprises with the same concerns for the protection of internal systems, financial platforms, and employee endpoints as any other firm. "Gaming companies have the same responsibility as any other organization to protect customer privacy and preserve shareholder value. While not specifically regulated like hospitals or critical infrastructure, they must comply with laws like GDPR and CCPA," explains Craig Burland, CISO for Inversion6, a managed security service provider and fractional CISO firm. "Threats to gaming companies also follow similar trends seen in other segments of the economy — intellectual property (IP) theft, credential theft, and ransomware." IP issues are heightened for these firms, like many in the broader entertainment category, as content leaks for highly anticipated new games or updates can give a brand a black eye at best and, at worst, hit them more directly in the financials.


Driving value from data lake and warehouse modernisation

To achieve this, Data Lakes and Data Warehouses need to grow alongside the business requirements in order to be kept efficient and up to date. Go Reply is a leading Google Cloud Platform service integrator (SI) that is helping companies spanning multiple sectors along this vital journey. Part of the Reply Group, Go Reply is a Google Cloud Premier Partner focussing on areas including Cloud Strategy and Migration; Big Data; Machine Learning; and Compliance. With Data Modernisation capabilities in the GCP environment constantly evolving, businesses can become overwhelmed and unsure of not only the next steps, but more importantly the next steps for them, particularly if they don’t have in-house Google expertise. Companies often need to utilise both Data Lakes and Data Warehouses simultaneously, so guidance on how to do this, as well as how to drive value from both kinds of storage, is vital. The Go Reply leadership team advises that Google Cloud Platform, the hyperscale cloud of choice for these workloads, brings technology for Data Lake and Data Warehouse efficiency, along with security superior to other market offerings.


Three tech trends on the verge of a breakthrough in 2023

The second big trend is around virtual reality, augmented reality and the metaverse. Big tech has been spending big here, and there are some suggestions that the basic technology is reaching a tipping point, even if the broader metaverse business models are, at best, still in flux. Headset technologies are starting to coalesce and the software is getting easier to use. But the biggest issue is that consumer interest and trust is still low, if only because the science fiction writers got there long ago with their dystopian view of a headset future. Building that consumer trust and explaining why people might want to engage is just as a high a priority as the technology itself. One technology trend that's perhaps closer, even though we can't see it, is ambient computing. The concept has been around for decades: the idea is that we don't need to carry tech with us because the intelligence is built into the world around us, from smart speakers to smart homes. Ambient computing is designed to vanish into the environment around us – which is perhaps why it's a trend that has remained invisible to many, at least until now.


CIOs beware: IT teams are changing

The role of IT is shifting to be more strategy-oriented, innovative, and proactive. No longer can days be spent responding to issues – instead, issues must be addressed before they impact employees, and solutions should be developed to ensure they don’t return. What does this look like? Rather than waiting for an employee to flag an issue within their system – such as recurring issues with connectivity, slow computer start time, etc. – IT can identify potential threats to workflows before they happen. They plug the holes, then they establish a strategy and framework to avoid the problem entirely in the future. In short, IT plays a critical role in successful workplace flow in both a proactive and reactive way. For those looking to start a career in IT, the onus falls on them to make suggestions and changes that look holistically at the organization and how employees interact within it. IT teams are making themselves strategic assets by thinking through how to make things more efficient and cost-effective in the long term.


A Comprehensive List of Agile Methodologies and How They Work

Extreme Programming (or XP) offers some of the best buffers against unexpected changes or late-stage customer demands. Feedback gathering takes place within sprints and from the start of business process development, and it’s this feedback that informs everything. This means the entire team becomes accustomed to a culture of pivoting on real-world client demands and outcomes that would otherwise threaten to derail a project and seriously warp production lead times. Any organization with a client-based focus will understand the tightrope that can exist between external demands and internal resources. Continuously orienting those resources based on external demands as they appear is the single most efficient way to achieve harmony. This is something that XP does organically once integrated into your development culture. ... Trimming the fat from the development process is what this method is all about. If something doesn’t add immediate value, or tasks within tasks seem to be piling up, the laser focus of Lean Development steps in.



Quote for the day:

"Confident and courageous leaders have no problems pointing out their own weaknesses and ignorance. " -- Thom S. Rainer

Daily Tech Digest - November 24, 2022

The Future Of The Metaverse Is MultiChain - But Its Users Must Be Unchained

It’s an extremely unrealistic scenario that will probably never happen. Every time a shiny new blockchain platform for the metaverse arrives, loaded with promise, it leads to the arrival of new flaws and problems that must be overcome. New chains are made to solve these new issues, and yet more problems arise. And so the cycle of innovation goes on and on. These new blockchains do offer some interesting solutions and benefits, hence they become fertile breeding grounds for numerous developers to experiment with new metaverse projects and decentralized applications. That fuels greater interest in the metaverse, prodding users to further explore the world of Web3 and the multitude of other blockchains that support it. Different chains have different strengths and play host to different metaverse worlds, and users soon understand the need to be able to move across these networks. Mobility across chains is a must, as it is the only thing that enables true metaverse freedom. At present, so-called blockchain bridges are the go-to mechanism for moving digital assets across chains. 


3 Reasons the Cloud is Critical for Ensuring Patient-Centered Care

One of the keys to delivering patient-centered care is gaining a full understanding of the individual and actively engaging them in their outcomes. But for this to become a reality, patients and their providers need access to medical records and open lines of communication matched to patient preferences. The cloud enables centralized access to patient records, giving healthcare providers a full view of the tests, diagnoses, treatments and other patient information. The addition of artificial intelligence (AI) in cloud-based solutions is enhancing care by analyzing vast amounts of patient data and offering insights that aid in clinical decision-making, which can result in faster time-to-treatment and better outcomes. ... In the past, connecting with physicians outside of appointments often turned into a game of phone tag and resulted in multiple voicemails. But with the cloud supporting patient portals, video chat and text messaging, patients and their providers have more ways to communicate. They can quickly exchange information or ask and answer questions electronically at their convenience, and refer back to those messages should they forget something. 


Nearly Half Of CIOs Say Digital Transformations Are Incomplete

In the fully digital era, when employees have the flexibility to work from anywhere, eliminating data silos will be key to improving organization-wide collaboration. Part of the challenge is completely digitizing an organization’s documents and workflows. The other piece is creating a central repository for digital data – one that’s universally accessible to and shared by multiple teams within an organization. According to a recent study by S&P Global Market Intelligence, only 17% of those surveyed said that their organizations had a single source of truth where everyone could access the knowledge they created. If businesses want to realize the benefits that come with fully digital processes, there’s still more work to do. Investing in full digitization, including a unified document system, will make the most of an organization’s data while significantly reducing costs and enhancing collaboration, improving interactions between team members wherever they may be located.


Coding is Dead, Long Live Programming!

No-code platforms usually target industry-specific functions, with their primary audience being non-technical users who wish to optimise their business operations. Low-code platforms, on the other hand, typically target developers who are operating under a time constraint. They are used to deliver applications quickly and conserve resources which can be better utilised to create something that has more impact on the organisation’s bottom line. Since no-code platforms usually target line-of-business users who look to create applications that can expedite their business operations, they provide a host of benefits. Not only do these platforms open up the creation of code-based solutions to non-coders, the feature-rich nature of modern no-code environments allows developers to solve problems unique to their line of work. In addition to this, no-code platforms can also be used to automate workflows, thereby saving more resources. As with any emerging market, there are challenges associated with no-code and low-code platforms.


How can business prepare for changes in data legislation?

The role of the Data Protection Officer (DPO) is also likely to change, as rules for smaller organisations in particular are loosened while the government tries to create some Brexit advantages for smaller businesses, advantages which economic data suggests are somewhat thin on the ground right now. The panel all expressed concerns about how DPOs will potentially be replaced with Senior Responsible Individuals (SRIs), who will have the seniority but not necessarily the in-depth knowledge necessary for the role. Patrick Burgess, Co-Founder & Technical Director of MSP Nutborne Ltd, commented: "Already in the non-enterprise world you often find people are nominated as DPO and they aren't necessarily trained. That person needs to be supported at the highest level or it really is just a box-ticking exercise. You have to give people the right powers, responsibilities and training and not get cross when they tell you what you don't want to hear." None of these issues are necessarily going to be resolved by swapping out DPOs for SRIs, although SRIs are, as Kon pointed out, theoretically harder to fire if they sit at board level.


DeveloperWeek Enterprise: Sorting Out Big Data to Empower AI

“In many cases, big data is a big data swamp,” he said in his presentation, “The Big Data Delusion - How to Identify the Right Data to Power AI Systems.” The problem, he said, comes from traditional analytical systems and approaches being applied to outsized amounts of data. For example, an unnamed fintech company that was a customer of EastBanc had huge datasets of its customer data, transactional data, and behavioral data that was cleaned by one team then transferred to another team that enhanced the data. While such an approach may be sufficient, Shilo said it can also slow things down. The fintech company, he said, wanted a way to use its data to predict which of its customers would be receptive to contact. The trouble was it seemed to be a herculean task under traditional processes. “Their current team looked at the task and estimated the effort would take four, five months to complete,” Shilo said. “That’s a lot of time.”


Trends in data centre sustainability

Finch is perplexed by the sudden attention paid to data centre sustainability, given that it’s always been embedded in the DNA of Kao Data, which operates three sites in outer London. Says Finch: “We have always tried to do things in a sustainable manner. It’s as if everybody’s just woken up and smelled the coffee. Sustainability has always been in the foundations of Kao Data at its very core.” Sustainability goes hand in hand with reliability, he says: you are not such a hostage to yo-yoing energy prices if you use renewables. Also, sustainability reduces both short-term capital expenditure and long-term operational expenditure. Finch says: “Really you end up ticking all the boxes, and it makes complete economic sense to do things in a sustainable manner.” Data centres need to have backup diesel generators in the event of a power network outage. Many data centres handle critical Government infrastructure, such as the NHS. One data centre handles Government communications with the nuclear submarine fleet – and infrastructure doesn’t come more critical than that.


Microsoft: Popular IoT SDKs Leave Critical Infrastructure Wide Open to Cyberattack

It took some digging to identify that the Boa servers were the ultimate culprit in the Indian energy-sector attacks, the researchers said. First they noticed that the servers were running on the IP addresses on the list of indicators of compromise (IoCs) published by Recorded Future at the time of the release of the initial report last April, and also that the electrical grid attack targeted exposed IoT devices running Boa, they said. Moreover, half of the IP addresses returned suspicious HTTP response headers, which might be associated with the active deployment of the malicious tool that Recorded Future identified was used in the attack, the researchers noted. Further investigation of the headers indicated that more than 10% of all active IP addresses returning the headers were related to critical industries — including the petroleum industry and associated fleet services — with many of the IP addresses assigned to IoT devices with unpatched critical vulnerabilities. This highlighted "an accessible attack vector for malware operators," according to Microsoft.


India drafts new privacy bill for transfer of personal data internationally

"Cross-border interactions are a defining characteristic of today’s interconnected world," according to an explanatory note from the government accompanying the bill. "Recognising this, it has been provided in the bill that personal data may be transferred to certain notified countries and territories." ... “The Central Government may, after an assessment of such factors as it may consider necessary, notify such countries or territories outside India to which a Data Fiduciary may transfer personal data, in accordance with such terms and conditions as may be specified,” according to the draft. A data fiduciary, according to the draft, could be any person or a group of persons who determines the purpose and means of processing personal data. The draft Digital Personal Data Protection Bill, for which the ministry of electronics and information technology has invited feedback from the public via a portal till December 17, also lays out the exemptions and conditions that must be considered when considering the transfer of personal data to other nations.


Examining low-code/no-code popularity across Africa and its range of disruption for CIOs

Dagadu appreciates that he can use these tools without a developer, even though he initially employed one, which incurred a lot of cost. “At first we had to employ a web developer who did a lot of work, but it looks horrible using technology like Square Space,” he says. “In Africa, development costs are very high and can only be tackled by companies with a lot of funding. But with the growth of low-code/no-code, more people with bright ideas can bring them to life without the need for expensive developers.” He noted that because these tools are not yet widely known in Africa, people believe that every time they have an idea for an application or technology, they have to resort to an application developer. But by coding less or not at all, there’s an easier entry point than hard coding, according to WenakLabs’ Assani. “It’s a way to be visible quickly, to offer your services to the world without resorting to the skills of a developer. Above all, you learn through experimentation.”



Quote for the day:

"And how does one lead? We lead by doing; we lead by being." -- Bryant McGill

Daily Tech Digest - November 23, 2022

What's coming for cloud computing in 2023

Enterprises often move to multicloud on purpose, but way more often multicloud just happens as enterprises strive to find and leverage best-of-breed cloud services with no plan for what to do with those services after deployment. This leads to too much cost and not enough return of value to the business. Old story. This cloud complexity problem can be solved through the strategic use of technology and better approaches to manage the complexity. Most important is reducing redundancy by using a common layer of technology above the public cloud providers as well as above any legacy or edge-based systems. This layer includes common services, such as a single security system, a single data management system, finops, a single cloud operations system, etc. We’re not attempting to solve every problem within the “walled garden” of each public cloud provider; this technology should exist within a common layer, aka supercloud or metacloud. This strategic cloud trend not only solves the complexity problems by leveraging common services and a common control plane, it also helps get cloud costs under control through a common finops layer that handles cost monitoring, cost governance, and cloud cost optimization.
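
A minimal sketch of what such a common layer can look like in code: one interface, thin per-provider adapters, and a finops function that works across all of them. The class and method names are invented for illustration, not any real supercloud product.

```python
from abc import ABC, abstractmethod

class CloudCostAdapter(ABC):
    """One interface in the common (supercloud/metacloud) layer; each
    provider gets an adapter instead of its own bespoke tooling."""

    @abstractmethod
    def monthly_spend(self) -> dict[str, float]:
        """Return spend per service, normalized to common service names."""

class AwsAdapter(CloudCostAdapter):
    def monthly_spend(self) -> dict[str, float]:
        return {"compute": 42_000.0, "storage": 9_500.0}   # stub data

class AzureAdapter(CloudCostAdapter):
    def monthly_spend(self) -> dict[str, float]:
        return {"compute": 31_000.0, "storage": 4_200.0}   # stub data

def finops_report(adapters: list[CloudCostAdapter]) -> dict[str, float]:
    """The common finops layer: one cost view across every provider."""
    totals: dict[str, float] = {}
    for adapter in adapters:
        for service, cost in adapter.monthly_spend().items():
            totals[service] = totals.get(service, 0.0) + cost
    return totals

print(finops_report([AwsAdapter(), AzureAdapter()]))
# {'compute': 73000.0, 'storage': 13700.0}
```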


Best practices for implementing a company-wide risk analysis program

The first step is determining what is critical to protect. Unlike accounting assets (e.g., servers, laptops, etc.), in cybersecurity terms this would include things that are typically of broader business value. Often the quickest path is to talk with the leads for different departments. You need to understand what data is critical to the functioning of each group, what information they hold that would be valuable to competitors and what information disclosures would hurt customer relationships. Also assess whether each department handles trade secrets, or holds patents, trademarks, and copyrights. Finally, assess who handles personally identifiable information (PII) and whether the group and its data are subject to regulatory requirements such as GDPR, PCI DSS, CCPA, Sarbanes Oxley, etc. When making these assessments, keep three factors in mind: what needs to be safe and can’t be stolen, what must remain accessible for continued function of a given department or the organization, and what data/information must be reliable (i.e., that which can’t be altered without your knowledge) for people to do their jobs.
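
These three factors map loosely onto the classic confidentiality/availability/integrity triad. Below is a small sketch of turning the departmental interviews into a ranked asset register; the assets, 1-5 scores, and weighting are illustrative assumptions.

```python
# Rank assets by the three factors named above, with extra weight for
# regulatory exposure (GDPR, PCI DSS, CCPA, ...). All values are examples.
assets = [
    # name,               confidentiality, availability, integrity, regulated
    ("Customer PII store", 5,              3,            4,         True),
    ("Build servers",      2,              4,            5,         False),
    ("Marketing site",     1,              3,            2,         False),
]

def criticality(conf: int, avail: int, integ: int, regulated: bool) -> int:
    """Simple additive score; regulatory exposure adds a fixed bump."""
    score = conf + avail + integ
    return score + 3 if regulated else score

ranked = sorted(assets, key=lambda a: criticality(*a[1:]), reverse=True)
for name, *factors in ranked:
    print(f"{criticality(*factors):>2}  {name}")
```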


What Is Data Virtualization?

The process of data virtualization is quite simple. Data is accessed in its original form and source. Unlike typical “extract, transform, and load” (ETL) processes, virtualization doesn’t require data to be moved to a data warehouse or data lake first. Data is aggregated in a single location, known as a virtual data layer. Using this layer, enterprises can develop simple, holistic, and customizable views (also known as dashboards) for accessing and making sense of data. Using these tools, users can also pull real-time reports, manipulate data, and perform advanced data processes such as predictive maintenance. Data is easily accessible via dashboards from anywhere. ... While data is critical to the decision-making process, not just any data will do. The data used must be accurate, up-to-date, and logical. It must also be displayed in a way that all stakeholders can understand, whether a user is a data scientist or a C-level executive. Data virtualization enables stakeholders to access the specific data they need when they need it. Because the data isn’t a replica from some earlier point in time, it is accurate to the minute.
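
A toy sketch of the principle, assuming two in-memory stand-ins for real source systems: the virtual layer composes a record at read time instead of copying data via ETL, so reads always reflect the sources' current state.

```python
# Both "sources" are in-memory stand-ins for real systems (CRM, billing, ...).
crm = {"1001": {"name": "Acme Corp", "region": "EMEA"}}
billing = {"1001": {"balance": 1250.75}}

class VirtualCustomerView:
    """Aggregates fields from several sources on access -- no ETL copy,
    so every read reflects the sources' current state."""

    def get(self, customer_id: str) -> dict:
        record = {}
        record.update(crm.get(customer_id, {}))
        record.update(billing.get(customer_id, {}))
        return record

view = VirtualCustomerView()
print(view.get("1001"))  # {'name': 'Acme Corp', 'region': 'EMEA', 'balance': 1250.75}

billing["1001"]["balance"] = 0.0       # a source changes...
print(view.get("1001")["balance"])     # ...and the view reflects it immediately: 0.0
```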


LockBit 3.0 Says It's Holding a Canadian City for Ransom

LockBit operators posted screenshots showing files of different departments and other data as a proof for their claim, but Information Security Media Group was unable to immediately contact the municipality and confirm the authenticity of the documents. The attack comes on the heels of a new National Cyber Threat Assessment 2023-2024 by the Canadian Center for Cyber Security. The report, which says ransomware is "the most disruptive form of cybercrime facing Canadians," adds that ransomware benefits significantly from the specialized cybercrime economy and the growing availability of stolen information. "So long as ransomware remains profitable, we will almost certainly continue to see cybercriminals deploying it," the report says. The city of Westmount's official website was not affected by the attack, and the municipality says any updates on the recovery will be communicated on the site. The mayor assured residents that data security is its "top priority" and so "is the protection of our residents' and employees' information."


A brief history of industrial IoT

Most early networking technologies were wired: Connection required cables that physically linked your device to the network. Network bandwidth — the amount of data that can be conveyed in a period of time — for 10BASE-T Ethernet connections, one of the most widely used standards established in the late 1980s and early 1990s, allowed for as much as 10 Megabits of data per second. In contemporary times, wired networks support connections of 1,000 Megabits of data per second (1000BASE-T or 1 Gigabit) or even 10 Gigabits of data per second (10GBASE-T) for modern Ethernet connections. Wireless and cellular networking, which eliminated the need for a cable to each device, was a significant shift for IIoT. Standardized in 1999, 802.11b was one of the first standards supported in products from many manufacturers and was a predecessor to the Wi-Fi 6E standard established in 2020. Modern Wi-Fi devices not only offer speeds anywhere from 50 to 800 times as fast as earlier equipment, but the devices may also perform reliably in much more dense radio environments than their predecessors.


How to Avoid Risks Before Implementing Industrial IoT Solutions

Industrial IoT solutions are often implemented at Enterprises with a high proportion of machine manufacturing. For a well-funded company, it is often easier to implement the IoT ecosystem using modern equipment. But for some, it would be too expensive to replace legacy manufacturing systems. Therefore, companies often choose to adapt existing equipment and enhance it with sensors, smart devices, and gateways. However, when choosing to implement IoT technology in an enterprise equipped with old machines, the company has to ensure protocols are understandable for all the devices to connect disparate data stores, and solve all the compatibility issues. According to McKinsey, a company moving to EIoT has to solve compatibility issues for about 50% of all devices. If compatibility issues are not solved appropriately, the solution may not function as intended, or even at all. The wrong algorithm or incorrect integration can lead to hardware malfunctions and equipment damages, overheating, explosion, or system failure. 


How remote working impacts security incident reporting

The risks of an impeded reporting process due to remote working are significant. When incidents go unreported, reports are delayed/miscommunicated or follow-up actions/responses are hindered, it can leave vulnerabilities exposed and/or buy attackers time in the system to infiltrate more of the network before the security team can detect and contain threats and malicious activity, Chavoya warns. This can not only exacerbate the severity of incidents and attacks but can also damage both the reputation of a business and its ability to meet certain data protection regulations which stipulate strict rules surrounding disclosure. These could lead to loss of customer confidence and large monetary penalties. It is therefore paramount for security teams to update their reporting policies and processes to account for the security implications of remote working. “The home and hybrid working trend is here to stay, so it is incredibly dangerous for security teams to rely on policies and processes designed for a bygone era when most, if not all, employees were based in a controlled office environment,” says Holyome. 


IT leadership: 5 ways to create a culture of gratitude

Expressing gratitude is an integral part of a healthy culture. I think it starts with a leader maintaining healthy personal humility and respect and empathy for their staff, so that gratitude is coming from a genuine place. Thank-yous should be prompt, specific, and connect the accomplishment to its impact on our mission of educating students. Thanking a team for finishing a project, as in: “Your team successfully implemented this project, which I really appreciate” is more powerful when it adds, “The new UI will help our students better determine what classes they still need to take in order to graduate.” It’s helpful to give customer feedback as well, such as “I talked with an adviser who says this will really help her more accurately advise students.” IT teams always see a steady stream of problem tickets, so hearing how their work is impacting students and faculty, and/or hearing verbatim feedback from delighted users, can be very encouraging. In addition to thanking employees individually, department emails and all-staff meetings and parties should all include recognition and gratitude for recent accomplishments, and a little free food and swag never hurts, either.


5 pitfalls to avoid when partnering with startups

For Bedi, it came as a rude shock when he found out a startup he was working with on a project didn’t have an internal development team and instead relied on a third party for its deliverables. “We had partnered with a startup on a customer onboarding project. A delay of 15 to 20 days is acceptable but alarm bells ring when there is a significant overrun of timelines. In our case, there was a delay of more than two months,” says Bedi. “Not only is there a lack of bandwidth, but the brief that the startup receives from the enterprise and passes to the third party gets lost in translation. It doesn’t help that the startup didn’t read the detailed business requirements document.” Unfortunately, it’s tough to cut this risk altogether, Bedi says. “There are few IT leaders who verify the credentials of a startup to the extent of asking for the CVs of their team members. Even if some do so, some startups resort to ‘body shopping,’” he says, referring to the practice of recruiting workers to contract their services out on a tactical short- to mid-term basis. So, what’s the way out? The best approach is to open a clear line of communication with the startup and ensure transparency.


Implications of Emerging Technology on Cyber-Security

Proper understanding of the new technologies is very important; this includes risk assessment and evaluation of the new technology, followed by proper planning for implementation and risk mitigation. Risks are changing much faster than organisations can mitigate them. Unfortunately, there is no silver bullet for cyber-security, but there are three areas that must be carefully planned:

- Organizations must ensure they understand the risks of any new technology they install, as this will be key to properly securing it.
- Training and education on the new technology is a cornerstone to build on, and not just for technology people but for everyone who works with critical data and new technologies.
- Although ultimate accountability will still rest with the organization’s senior management, the information security team has the responsibility to study the new technology well and evaluate the associated risks.

The primary goal is to foster an organisational culture that encourages both risk-based decision making and innovation and new technology adoption.



Quote for the day:

"Leadership does not always wear the harness of compromise." -- Woodrow Wilson