Daily Tech Digest - February 13, 2023

Mergers and Acquisitions in Healthcare: The Security Risks

Incidents such as the CommonSpirit ransomware attack highlight how critical it is for entities to carefully assess and address IT security risks in any potential merger or acquisition, experts say. "We are seeing that well-established health systems or entities that have very mature cybersecurity programs take on an entity which is less secure," says John Riggi, national adviser for cybersecurity and risk at the American Hospital Association. The association advises hospitals to treat cyber risk with the same priority as financial analysis in any merger. But identifying the array of systems and myriad devices used by a healthcare entity that's being acquired is not easy. "When you buy an organization, you typically don't know everything you're buying," says Kathy Hughes, CISO of New York-based Northwell Health, which has 21 hospitals and over 550 outpatient facilities, many of them acquired. Northwell itself is the result of a 1997 merger between North Shore Health System and Long Island Jewish Medical Center.


Forget ChatGPT vs Bard, The Real Battle is GPUs vs TPUs

Solving for efficient matrix multiplication can cut down on the amount of compute required for training and inference tasks. While other methods like quantisation and model shrinking have also proven to cut down on compute, they sacrifice accuracy. A tech giant creating a state-of-the-art model would rather spend the $5 million if there’s no way to cut costs.  ... NVIDIA’s GPUs were well-suited to matrix multiplication tasks due to their hardware architecture, which can effectively parallelise the work across many CUDA cores. Training models on GPUs became the status quo for deep learning in 2012, and the industry has never looked back. Building on this, Google launched the first version of its tensor processing unit (TPU) in 2016, a custom ASIC (application-specific integrated circuit) optimised for tensor calculations. In addition to this optimisation, TPUs also work extremely well with Google’s TensorFlow framework, the tool of choice for machine learning engineers at the company.
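The parallelism both chips exploit is visible even in a toy implementation. The sketch below (plain Python, illustrative only; it is not how CUDA kernels or a TPU's systolic array are actually programmed) shows that every output cell of a matrix product is an independent dot product, which is exactly the work that thousands of cores can compute at once:

```python
def matmul(a, b):
    """Naive triple-loop matrix multiply over lists of lists.

    Every output cell is an independent dot product, so all n*m cells
    can in principle be computed in parallel: GPUs spread them across
    thousands of CUDA cores, and TPUs feed them through a systolic
    array of multiply-accumulate units.
    """
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]

# Sanity checks: identity leaves a matrix unchanged; 1x2 @ 2x1 is a dot product.
identity = [[1, 0], [0, 1]]
assert matmul(identity, [[3, 4], [5, 6]]) == [[3, 4], [5, 6]]
assert matmul([[1, 2]], [[3], [4]]) == [[11]]
```

The triple loop is O(n³) multiply-adds with no data dependency between output cells, which is why hardware that multiplies many numbers simultaneously pays off so directly here.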


As Digital Trade Expands, Data Governance Fragments

The upshot is that we are still far from any broader global effort. Even preliminary convergence between the United States and the European Union on national laws about data protection and privacy is difficult to achieve. Instead, Aaronson advocated for the establishment of a new international organization that could provide proper incentives to, and pay, global firms to share data. Overall, the panellists urged that technical discussions of data flows, data governance and rules for digital trade be contextualized within fundamental concerns about the nature of data and the role of human rights. These concerns equally require attention and governance. The discussion on effective digital governance requires a fundamental rethink of the nature of data. As emphasized by panellist Kyung Sin Park, data embeds fundamental human freedoms and human information. It is closely linked to human rights. Data is much more than an economic asset used in training artificial intelligence (AI) algorithms.


Fall in Love with the Problem, Not the Solution: A Handbook for Entrepreneurs

Think of a problem—a big problem, something worth solving, something that would make the world a better place. Ask yourself, who has this problem? If you happen to be the only person on the planet with this problem, then go to a shrink. It’s much cheaper and easier than building a startup. But if a lot of people have this problem, go and speak with those people to understand their perception of the problem. Know the reality, and only then start building the solution. If you follow this path and your solution works, it’s guaranteed to create value. But there is a more important part to this. Imagine speaking with people and their feedback is, yeah, go ahead and solve that for me—this is a big problem. All of a sudden you feel committed to this journey. You essentially fall in love with the problem. Falling in love with the problem dramatically increases your likelihood of being successful because the problem becomes the north star of your journey, keeping you focused.


Data Mobility Framework: Expert Offers Four Keys to Know

It’s common for hybrid work teams to schedule when employees will be in the office and when they’ll work remotely. But while remote workers don’t always work from the same home office, they do expect similar access to business data and applications regardless of the network or device they’re using—and all of this remote connectivity has a material impact on data storage demands. Organizations try to balance data storage initiatives to address this without causing downtime to mission-critical applications and data. The faster organizations can add new storage or move data non-disruptively to another location, the better services they can deliver to end-users. Thankfully, the right data migration partner can perform these critical services non-disruptively in a matter of hours. This enables the organization and its partners to access a range of capabilities to minimize data migration efforts, including being able to migrate “hot data” to a new, more powerful array without downtime. Hot data is any data that is in constant demand, such as a database or application that’s essential for your business to operate.


Stop Suffocating Success! 7 Ways Established Businesses Can Start Thinking Like a Startup.

Startups aren't trapped by old rules—they're in the process of inventing themselves. Obviously, established companies can't just completely throw out the rulebook. But remember, rules should exist to help, not just because they've always been there. Otherwise, people wind up blindly following often-annoying processes without thinking about the end goal. For example, if multiple clients ask for a product feature that hasn't been included, but there isn't a feature review meeting until next quarter, does it make sense to follow the rules and wait? Or should staff be empowered to add the feature (or, at least, fast-track a product review)? Beware of any policy that exists because "We've-always-done-things-this-way." ... Incompetent workers can take a terrible toll. To start, everything's harder when the people around you don't carry their weight. It's also demoralizing—you're working so hard and hitting all your goals, while the person next to you fails spectacularly and apparently isn't penalized for it. Over time, you're likely to grow bitter or just stop trying so hard since results clearly don't matter.


The Stubborn Immaturity of Edge Computing

Of course, they don’t even think of it as “the edge”. To them, it’s where real work takes place. So when IT vendors and cloud providers and carriers talk about the “far edge” (where real customers and real factories and real work takes place), that makes no sense to people outside of IT vendors’ data-center-centric bubble. The real world doesn’t revolve around the data center, or the cloud. What’s really far in the real world? The cloud. The data center. Edge computing is a technology style that’s part of a digital transformation trend. Digital transformation has been on a march for decades, well before we called it that. It’s accelerated because of cloud computing, and global connectivity. A lot of the technology transformation has been taking place at the back-end. In data centers, in business models. And there’s a lot left to be done. But the true green field in digital transformation is where people and things and factories actually exist. (OK, we’ll call that the “edge”, but that’s such an old IT-centric way of talking!)


How the Future of Work Will Be Shaped by No Code AI

No-code, like other breakthroughs, is a thrilling disruption and improvement in the software development process, particularly for small firms. Among its various applications, no-code has enabled users with little technical experience to create applications using pre-built frameworks and templates, which will undoubtedly lead to further invention, design and development in the digital town square. It also cuts down on software development time, allowing for faster implementation of business solutions. Aside from the time saved, no-code can free up computing and human resources by transferring these duties to software suppliers. ... No-code is also a game changer for many AI technology developers and non-technical people since it focuses on something we never imagined possible in the difficult field of artificial intelligence: simplicity. Anyone will be able to swiftly build AI apps using no-code development platforms, which provide a visual, code-free, and easy-to-use interface for deploying AI and machine learning models.


Code Readability vs Performance: Here is The Verdict

Code performance is critical, especially on projects that require high-speed computation and real-time processing, where poor performance results in slow, sluggish user experiences. But focusing on the performance of code that is not readable is pointless; such code is also more prone to bugs and errors. Performance is a quirky thing. Writing code with performance as the first priority is not a path any developer would take, or even recommend. In a Reddit thread, a developer gives an example of one piece of code that compiles in 1 millisecond and another that compiles in 0.1 milliseconds. No one can really notice the difference between the two as long as the code is “fast enough”. So chasing performance while sacrificing the readability of the code can be counterproductive. Moreover, in the same Reddit thread, another developer pointed out that writing faster algorithms often requires writing harder code, which again sacrifices readability.
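The trade-off is easy to demonstrate with a hypothetical example (not the one from the Reddit thread): two functions that compute the same result, one written for clarity and one hand-tuned. For typical inputs the difference is imperceptible, which is the thread's "fast enough" point:

```python
def sum_of_squares_readable(values):
    """States the intent directly: square each value and add them up."""
    return sum(v * v for v in values)

def sum_of_squares_tuned(values):
    """A hand-'optimised' variant with a manual loop and accumulator.

    Slightly harder to follow, easier to get subtly wrong when modified,
    and for typical inputs any speed difference is far below anything a
    user could notice.
    """
    total = 0
    for v in values:
        total += v * v
    return total

# Both implementations agree; only the readability differs.
data = list(range(1_000))
assert sum_of_squares_readable(data) == sum_of_squares_tuned(data)
```

When a hot path genuinely needs tuning, keeping the readable version alongside as a reference (and as a test oracle, as above) preserves both properties.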


LockBit Group Goes From Denial to Bargaining Over Royal Mail

LockBit's about-face - "it wasn't us" to "it was us" - is a reminder that ransomware groups will continue to lie, cheat and steal, so long as they can profit at a victim's expense. Isn't hitting a piece of Britain's critical national infrastructure - as in, the national postal service - risky? After DarkSide hit Colonial Pipeline in the United States in May 2021, for example, the group first blamed an affiliate before shutting down its operations and later rebooting under a different name. While hitting CNI might seem like playing with fire, the consensus among many security experts is that ransomware groups' target selection remains opportunistic. Both operators and any affiliates who use their malware, as well as the initial access brokers from whom they often buy ready-made access to victims' networks, seem to snare whoever they can catch and then perhaps prioritize victims based on size and industry. What's notable isn't necessarily that LockBit - or one of its affiliates - hit Royal Mail, but that it decided to press the attack.



Quote for the day:


“None of us can afford to play small anymore. The time to step up and lead is now.” -- Claudio Toyama

Daily Tech Digest - February 11, 2023

How Modern Enterprise Architecture Drives Enterprise Success

The need for cross-functional, distributed technology ownership is the perfect ecosystem for the enterprise architect to shine. Traditionally, EA has governed and exerted control over technology. However, as technology ownership becomes decentralized across the business and, thus, influenced by EAs, innovations become collaborative rather than authoritarian. Leaders can leverage the enterprise architect in their change initiatives. By pooling data across departments, executives can support their assertions on which areas of the business would benefit most from a change in design. Modeling projects side by side means they can be compared simultaneously, helping to uncover which strategy will yield the greatest returns. EA-driven roadmaps can help leaders consider KPIs a year or ten years into the future of the business, made more solid by knowledge distilled directly from those closest to the respective tools and processes they use. Technology acquisition and skill dependencies cross product boundaries, regulatory compliance processes intersect many different processes, enterprise-wide cost objectives span multiple silos, and cybersecurity threats could rear up anywhere.


What Is Augmented Data Management?

Augmented master data management applies ML and AI techniques to the management of master data. This enables companies to refine master data to achieve two key objectives: to optimize their business operations to run more efficiently and transform their businesses to drive growth. In terms of business optimization, this can be achieved in several ways using augmented master data management. First, there’s enhanced efficiency. With augmented master data management, companies can streamline business processes, which reduces the time needed to work on activities, thereby increasing efficiency and potentially leading to cost savings. Augmented master data management can also improve the ability to comply with regulations. The number of regulations and demands on companies has been increasing; for example, we’ve seen this lately with ESG reporting and privacy laws. From a data perspective, it can be tedious and complex for companies to comply with these regulations.
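As a rough, hypothetical sketch of the kind of record matching that augmented master data management automates, the snippet below uses Python's standard-library difflib to flag near-duplicate customer records. Real products use trained ML models rather than a simple string-similarity ratio, and the 0.7 threshold here is an arbitrary choice for illustration:

```python
from difflib import SequenceMatcher

def find_duplicate_candidates(records, threshold=0.7):
    """Flag pairs of record names that likely refer to the same entity.

    A stand-in for ML-driven record matching: pairs scoring above the
    similarity threshold would be queued for a data steward (or a
    model) to confirm and merge into one master record.
    """
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            score = SequenceMatcher(None, records[i].lower(),
                                    records[j].lower()).ratio()
            if score >= threshold:
                pairs.append((records[i], records[j]))
    return pairs

names = ["Acme Corp", "ACME Corporation", "Globex Ltd"]
assert find_duplicate_candidates(names) == [("Acme Corp", "ACME Corporation")]
```

Collapsing such duplicates is one concrete way the "enhanced efficiency" objective shows up in practice: downstream processes run against one clean record instead of several conflicting ones.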


Data is a stumbling block for most multicloud deployments

Migrating data from one cloud service to another can be challenging. It is important to have a solid data portability strategy in place that considers data format, size, and dependencies. Most of those moving to multicloud can’t answer this question: “What would it take to migrate this data set from here to there?” This needs to be in your back pocket, as we’re seeing some data sets move from single and multicloud deployments back to on premises. You must give yourselves options. ... Managing data across multiple cloud services can be a resource-intensive task if you attempt to do everything manually. It is essential to have a centralized data management system in place that can handle diverse data sources and ensure data consistency. Again, this needs to be centralized, abstracted above the public cloud providers and native data management implementations. You need to deal with data complexity on your terms, not the terms of the data complexity itself. Most are opting for the latter, which is a huge mistake.


Understanding the Role of CIOs in Test Data Management

A zero-trust framework is a cybersecurity approach wherein no user or system is trusted by default; all must authenticate before being granted access. Only users verified by the protocol get access to systems. It is a great leap over traditional cybersecurity models that primarily operate on assumed trust. To fully achieve a zero-trust framework, test data is automated through a DevOps platform. Given the responsibility of implementing cybersecurity throughout the enterprise, a CIO is influential here. The leader has to roll out a DevSecOps approach that integrates cybersecurity from the beginning. The real challenge is building a culture that does not treat security as an afterthought. It should be part of the SDLC, and a CIO should educate all stakeholders about this. With DevSecOps, they work towards pipelining the DevOps pieces with security protocols. A CIO has to upgrade an enterprise’s approach towards risk and security in test data and perfect the delivery pipeline of qualitative data sets for different environments.


A step-by-step guide to setting up a data governance program

Data governance is a crucial aspect of managing an organization’s data assets. The primary goal of any data governance program is to deliver against prioritized business objectives and unlock the value of your data across your organization. Realize that a data governance program cannot exist on its own – it must solve business problems and deliver outcomes. Start by identifying business objectives, desired outcomes, key stakeholders, and the data needed to deliver these objectives. Technology and data architecture play a crucial role in enabling data governance and achieving these objectives. Don’t try to do everything at once! Focus and prioritize what you’re delivering to the business, determine what you need, deliver and measure results, refine, expand, and deliver against the next priority objectives. A well-executed data governance program ensures that data is accurate, complete, consistent, and accessible to those who need it, while protecting data from unauthorized access or misuse.


Why Businesses Need To Think Bigger When It Comes To Automation

Far too often, automation projects are approached as siloed, one-off opportunities to fine-tune a specific business process or function. Maybe it's the introduction of a conversational AI tool to improve front-line customer support functions or the development of a new payment processing or credit decisioning solution to build out a digital payments infrastructure. Whatever the specific use case, automation projects that pursue a singular vision to make one part of the business faster or more efficient regularly fail because they simply aren't big enough. To really extract value from automation, these projects need to start with an enterprise-wide vision and break down the walls between data, analytics, digital and operational teams to redefine business processes across multiple functions. Put simply, businesses need to stop thinking about automation and start focusing on hyperautomation. Businesses that understand this distinction and embrace automation not as a focused cost-cutting project but as an opportunity to transform legacy business and IT processes into a fully synchronized, smart workflow should be well positioned to confront the challenges of the current marketplace.


Surge of swatting attacks targets corporate executives and board members

The way it works in this new corporate swatting surge is that the malicious actors go to the websites of corporations, identify the top executives and board members, and with lists in hand, visit the websites of data brokers such as 411.com, Spokeo, and others. While there, the swatters grab whatever they can – names, addresses, phone numbers, email addresses, whatever is available. It is a "one-stop shop for finding the locations of executives and corporate officers," says Pierson. Alternatively, the threat actors plumb the archives of content aggregated from thousands of data breaches over the years. The swatter can easily find out that an executive "ordered new jogging shorts or whatever" and where those shorts were shipped, he says. Once the cybercriminals have that information, they do one of two things: use synthesized voice devices or make robotic recordings and call the police. The messages generally focus on a hostage or murder situation. 


Governance, Processes and Planning: Three Significant Countermeasures to Being Hacked

Together, governance, processes and planning help organizations effectively manage and protect their digital assets by providing a clear framework for decision-making, establishing clear procedures for incident response and risk management, and developing a comprehensive security strategy that aligns with the organization’s overall goals and priorities. Through governance, processes and planning, your organization can start to fix the people vulnerability. So how can your organization develop and implement these countermeasures? ISACA’s CMMI Cybermaturity Platform is a great place to start. It will help your organization identify what it does well and where your weaknesses are, and it shows where your gaps lie in governance, processes and planning, three often-overlooked critical countermeasures to hacking. The platform is an easy-to-use architecture model that simplifies identifying gaps in new or existing cybersecurity programs.


What is predictive analytics? Transforming data into future insights

Predictive analytics makes looking into the future more accurate and reliable than previous tools did. As such, it can help adopters find ways to save and earn money. Retailers often use predictive models to forecast inventory requirements, manage shipping schedules, and configure store layouts to maximize sales. Airlines frequently use predictive analytics to set ticket prices reflecting past travel trends. Hotels, restaurants, and other hospitality industry players can use the technology to forecast the number of guests on any given night in order to maximize occupancy and revenue. By optimizing marketing campaigns with predictive analytics, organizations can also generate new customer responses or purchases, as well as promote cross-sell opportunities. Predictive models can help businesses attract, retain, and nurture their most valued customers. Predictive analytics can also be used to detect and halt various types of criminal behavior before any serious damage is inflicted.
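At its simplest, a predictive model is a trend fitted to historical data and extrapolated forward. The sketch below fits an ordinary least-squares line to hypothetical monthly ticket sales (illustrative numbers, not from the article) and forecasts the next month; production systems use far richer models, but the fit-then-extrapolate shape is the same:

```python
def fit_trend(xs, ys):
    """Ordinary least-squares fit of y = a + b*x: the simplest
    predictive model, learning a trend from history to extrapolate."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical monthly ticket sales for months 1-5; predict month 6.
months = [1, 2, 3, 4, 5]
sales = [100, 112, 119, 131, 140]
a, b = fit_trend(months, sales)
forecast = a + b * 6
assert round(forecast) == 150
```

The same pattern (historical demand in, expected demand out) underlies the inventory, pricing, and occupancy forecasts described above.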


5 top workforce concerns for CIOs

Mental health has become a critical concern in the workplace, particularly as the global workforce resets following the COVID-19 pandemic. The prolonged stress and uncertainty of the past several years, coupled with a shaky economy, has taken a toll on many employees, and they are looking for support and resources to help them cope. A key part of supporting employee mental health is eliminating the stigma around taking time to recharge mentally. ... High employee engagement is an ongoing objective for organizations, as engaged employees are more likely to be productive, motivated, and committed to their work. However, employee engagement has steadily declined throughout the Great Resignation, “quiet quitting,” and recent tech layoffs. To boost engagement, focus on creating a positive and supportive work environment, fostering open communication, and providing opportunities for employee growth and development. Also, seek feedback from employees to understand their needs and concerns and work to address them.



Quote for the day:

"Great leaders go forward without stopping, remain firm without tiring and remain enthusiastic while growing" -- Reed Markham

Daily Tech Digest - February 10, 2023

Getting Started with Design Thinking for Developers and Managers

More visual, intuitive types of designers do still sometimes struggle working with developers; they expect developers to just accept their intuitive conclusions. But developers in general don't go for touchy-feely intuitive design. They want more logical reasons for design choices. Besides, true design thinking goes beyond intuition, leveraging measurement and analysis too. It's therefore generally a bad idea for a more intuitive, visually oriented designer to lead a design team containing developers. While those designers are valuable members of a design thinking team, their disconnect with developers and their bias towards intuition over analysis means they should not be running the show. ... Visual designers have a vested interest in fostering that impression. Some developers are happy to go along with it because it gives them an excuse to delegate any responsibilities concerning design. It's also a simple and tangible concept for managers to get their heads around. The visual nature is something they can see immediately. More sophisticated forms of design take more effort to understand.


How Cybercriminals Are Operationalizing Money Laundering and What to Do About It

Financial institutions, cryptocurrency companies, and other organizations face increasing fines — sometimes ranging in the millions and billions of dollars — for failure to root out money laundering as government agencies and regulators worldwide seek to crack down on this scourge. ... A preferred tactic by cybercriminal organizations looking to grow their ranks is to use what are known as money mules. These are individuals who are brought in to help launder money — sometimes, unknowingly. They're often lured in under false pretenses and promises of legitimate jobs, only to discover that "job" is to help launder the profits from cybercrime. Back in the day, this money shuffling was typically done through anonymous wire transfer services. While they often got away with it, such transfers are far easier for law enforcement and regulators to track. These days, most criminals have moved to using cryptocurrency. Its relative lack of regulatory oversight, coupled with often-anonymous transactions, make it almost the ideal vehicle for money laundering. 


Solving Problems With The IoT

Despite security concerns, the IoT is so useful that it continues to grow by leaps and bounds — so much so that when ChatGPT, a new AI chatbot, was asked to list the top 100 applications for the IoT, it simply added the word “smart” in front of many common places and items. For example, it responded with “smart aquariums, smart theme parks, smart libraries,” etc. Put simply, the IoT is everywhere. What makes it so popular is its ability to solve problems. For instance, safety is critical in manufacturing, industrial, chemical processing, mining, and many other applications. IoT sensors can be used to monitor environments for the presence of hazardous chemicals. If there is a gas leak, a real-time alert can be sent to the control centers to prevent potential accidents from occurring. In addition, aging infrastructure such as bridges, buildings, highways, and power grids pose risks. To help mitigate these risks, sensors in an IoT network can track cement movement and the changing size of cement cracks. IoT monitoring of the moisture in some building structures can provide advance warning of potential disasters such as collapsing buildings and bridges.
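The gas-leak scenario boils down to a threshold check over streaming sensor readings. A minimal sketch, with an assumed 50 ppm safety limit and made-up sensor IDs (real deployments would use calibrated, gas-specific limits and a proper alerting channel):

```python
GAS_ALERT_PPM = 50  # assumed safety threshold for this sketch only

def check_readings(readings, threshold=GAS_ALERT_PPM):
    """Return one alert message per reading that exceeds the threshold,
    mimicking the real-time notification sent to a control center."""
    return [f"ALERT: sensor {sensor_id} reads {ppm} ppm (limit {threshold})"
            for sensor_id, ppm in readings if ppm > threshold]

# Made-up sensor IDs and concentrations for one polling cycle.
readings = [("plant-a/line-1", 12), ("plant-a/line-2", 67), ("plant-b/line-1", 49)]
assert check_readings(readings) == [
    "ALERT: sensor plant-a/line-2 reads 67 ppm (limit 50)"
]
```

The structural-monitoring examples (cement cracks, moisture) follow the same pattern: periodic readings compared against limits, with alerts raised before a failure occurs.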


MoD issues revised cloud strategy as it prepares to move top-secret data off-premise by 2025

The Department will be pursuing a multi-cloud approach to sourcing these off-premise capabilities, because no one supplier will be able to address the “complexity of Defence’s requirements” nor its “evolving ambition” or scalability demands, according to the document. “By 2025, the services required by game-changing military capabilities will be available across Defence, accelerating our level of cloud consumption,” the document continued. “We will take advantage of evergreen solutions to prevent future obsolescence, and to ensure immediate access to the latest technologies, driving the pace of modernisation. “By 2025, we will use cloud platforms as the foundation on which to build capabilities in big data, advanced analytics, automation and synthetics. We will spend the majority of our compute expenditure investing in strategic modern platforms, rather than maintaining obsolete legacy platforms.” Elsewhere in the document, the organisation said its aim is to be “cloud-native” as much as possible, with members of the Defence community encouraged to take an MODCloud-first approach to procuring services.


Google’s AI chatbot is out to rival ChatGPT

While Bard is still in its early stages of development, Google is confident that the system will be able to compete with ChatGPT and other AI systems in the market. Apart from assisting with search engine capabilities, Bard will bring other features that will assist developers in developing their applications using Google’s language model. “Beyond our own products, we think it’s important to make it easy, safe and scalable for others to benefit from these advances by building on top of our best models,” Pichai wrote. “Next month, we’ll start onboarding individual developers, creators and enterprises so they can try our Generative Language API, initially powered by LaMDA with a range of models to follow. Over time, we intend to create a suite of tools and APIs that will make it easy for others to build more innovative applications with AI.” For other end users, there has been a mixed reaction regarding how AI chatbots will affect the order of things. While some people argue that the advent of these chatbots and their potential integration into search engines will aid the creative and marketing industries, others think otherwise.


Yes, CISOs should be concerned about the types of data spy balloons can intercept

Nation-states will collect intelligence to further their knowledge of rivals and a large part of that intelligence will come from private corporations. The fact that China chose this particular time to do so is indicative of its desire to place the United States in a weakened position ahead of a planned visit to China by US Secretary of State Antony Blinken, if it could. The United States didn’t take the bait and postponed the visit indefinitely and sent a demarche to the government of China. The “sources tell us” snippets from the mainstream media note that the United States purposefully allowed the balloon and its collection platform to continue its mission and to receive navigational commands but jammed the transmission of non-navigational signals. Thus, it is probable that the Chinese tried to issue a destruct command (not unlike those any CISO can do for a lost iPhone) but were unable to do so due to US countermeasures. Regardless of the outcome of that technological duel in the sky, the containers will provide valuable intelligence. 


10 Tips for Developing a Data Governance Strategy

There is no “one” right data governance leader. In some companies, the data governance leader is the chief data officer. In others, it may be the CFO, chief risk officer, or CIO. Historically, the role has resided within the realm of IT. Today, that’s changing. A Forrester study found that 45% of companies make data governance mostly business-focused, while 53% are IT-focused. Forrester advises that data governance is more a business problem and should be anchored in a business context. No matter which office heads up the data governance strategy, the team should be spread throughout the company, incorporating subject-matter or line-of-business experts, data analysts, data scientists, the IT department, and legal counsel. “What we’ve done wrong in the past is taken a role and turned it into a position, versus thinking about how we use data, build insights, and make decisions from our data,” Goetz said. “If you can see how you operate as a culture, you can figure out who should own it in the company.”


Cyber Insurance Costs Lead to Scrutiny of Business Partners

“Many suppliers to large companies often are small businesses that lag behind in their deployment of cybersecurity controls. They can be an easy path for cyber criminals to launch attacks on larger organizations,” she says. “This additional risk needs to be considered when pricing cyber coverage and has an impact on cyber insurance premiums.” She explains that having adequate cybersecurity deployed when interacting with third-party vendors drastically improves the risk profile of any organization. “It also makes it more insurable for cyber, which in return lowers premiums or opens more coverage options,” Dumont adds. This approach by larger businesses ranges, for example, from compliance to security best practices when deploying cloud providers and requiring multi-factor authentication (MFA) for maintenance services when they access the company’s connected equipment. From her perspective, third-party scrutiny on cybersecurity yields positive outcomes for all, starting with the most important benefit, which is to lower the likelihood of facing a cyber incident.


Seven deadly sins of devising a cloud strategy

Relying on a single vendor to implement a cloud strategy is an inflexible approach that leaves enterprises isolated when it comes to maintaining control over the performance of their digital platform. It can mean having little or no say in which services and providers can be adopted while being locked in to lengthy service agreements, even when prices rise or when service levels fall off. This lock-in is particularly costly given the dramatic reduction in the cost of cloud services in recent years, savings that a single-vendor agreement can prevent an enterprise from capturing. ... Losing track of costs is easily done when implementing a cloud strategy, especially when the scale of the transformation is significant. It’s imperative to identify areas where resources are being mismanaged and then eliminate waste. For example, in a sector such as financial services, which has traditionally been slow to adopt cloud computing, taking a “rightsizing” approach will help identify areas that have not been provisioned correctly. They can then be reconfigured to optimal levels. In practice, this means only purchasing cloud services that a business actually needs and will actually use.
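A rightsizing pass can be sketched as a simple comparison of provisioned capacity against observed utilization. The function below is a minimal illustration, not any cloud provider's tooling; the field names, threshold, and halving heuristic are all assumptions made for the example.

```python
# Hypothetical rightsizing check: flag resources whose average
# utilization falls below a threshold so they can be downsized.

def rightsize_candidates(resources, cpu_threshold=0.4):
    """Return resources whose average CPU utilization is below threshold.

    `resources` is a list of dicts with 'name', 'vcpus', and
    'avg_cpu_utilization' (0.0-1.0) keys -- illustrative fields only.
    """
    candidates = []
    for r in resources:
        if r["avg_cpu_utilization"] < cpu_threshold:
            # Suggest halving vCPUs, never going below one.
            suggested = max(1, r["vcpus"] // 2)
            candidates.append({"name": r["name"],
                               "current_vcpus": r["vcpus"],
                               "suggested_vcpus": suggested})
    return candidates

fleet = [
    {"name": "batch-worker", "vcpus": 8, "avg_cpu_utilization": 0.15},
    {"name": "api-server", "vcpus": 4, "avg_cpu_utilization": 0.72},
]
print(rightsize_candidates(fleet))
```

Real rightsizing tools look at memory, storage, and network as well as CPU, and over longer observation windows, but the decision shape is the same.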


Secure Delivery: Better Workflows for Secure Systems and Pain-Free Delivery

When reviewing architecture at a high level, any security concerns are usually big-ticket items that require considerable effort to retrofit, sometimes even the redesign of a critical feature of a system such as authentication. Lower-level threats and vulnerabilities are often found by sourcing deeper technical security expertise from an external penetration testing company, which is engaged to attack the system and highlight any serious issues. After these activities are complete, we usually see a fractious negotiation around risk and resources, with the engineering team pushing back on making expensive, time-consuming changes to their system architecture and operational processes just before their release deadlines, and the system owner pushing for risk acceptance for all but the most serious risks. Overall, security can be seen as something that’s owned by the security team, rather than an attribute of a system’s quality that’s owned by engineers, like performance or reliability.



Quote for the day:

"Leadership - leadership is about taking responsibility, not making excuses." -- Mitt Romney

Daily Tech Digest - February 09, 2023

The role of the database in edge computing

In a distributed architecture, data storage and processing can occur in multiple tiers: at the central cloud data centers, at cloud-edge locations, and at the client/device tier. In the latter case, the device could be a mobile phone, a desktop system, or custom-embedded hardware. From cloud to client, each tier provides higher guarantees of service availability and responsiveness than the previous tier. Co-locating the database with the application on the device would guarantee the highest level of availability and responsiveness, with no reliance on network connectivity. A key aspect of distributed databases is the ability to keep the data consistent and in sync across these various tiers, subject to network availability. Data sync is not about bulk transfer or duplication of data across these distributed islands. It is the ability to transfer only the relevant subset of data at scale, in a manner that is resilient to network disruptions. For example, in retail, only store-specific data may need to be transferred downstream to store locations.
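The retail example above can be sketched as a filter over a change feed: only records tagged for a given store travel downstream to that location. This is an illustrative sketch, not any specific sync product's API; the record fields and store identifiers are invented for the example.

```python
# Illustrative edge-sync filter: from a stream of changed records,
# select only the subset relevant to one edge location (one store).

def records_for_store(changes, store_id):
    """Filter a change feed down to records tagged for one store."""
    return [c for c in changes if c.get("store_id") == store_id]

change_feed = [
    {"doc_id": "price-001", "store_id": "store-42", "price": 9.99},
    {"doc_id": "price-002", "store_id": "store-7", "price": 4.50},
    {"doc_id": "promo-001", "store_id": "store-42", "discount": 0.1},
]
# Only store-42's records travel downstream to that location.
print(records_for_store(change_feed, "store-42"))
```

Production sync layers add resumable checkpoints and conflict resolution on top of this filtering so that a network disruption only delays, rather than corrupts, the transfer.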


Data management in the digital age

To ensure effective data management, organisations can adopt various strategies and tactics that have proven their worth in modern organisations. The first of these is a comprehensive risk assessment. Performing risk assessments regularly will ensure that you can identify and prioritise vulnerabilities before they become gaping security holes that can be exploited. Ongoing risk assessments should be bolstered by robust and current security and data management policies that reflect the threat landscape. “You also need to implement employee training and communication because humans are often the weakest link in even the most advanced security system,” says Grimes. “You must ensure that security is understandable and accessible and that the lessons are driven home through constant reminders and training programmes. All it takes is one click to bring down the most sophisticated security system on the planet.” It’s also important to collaborate with vendors and partners that understand the security landscape and have the tools and expertise required to support the organisation’s security posture. 


Coaching IT pros for leadership roles

You can teach someone to code, manage money, and complete the tasks of being a manager. But teaching is limited. To develop a leader, you have to coach them to become someone who can make decisions on their own, communicate well, and plan strategically. But the transition from teacher to coach can be challenging. ... Then practice what Davis calls the “ask first, tell second” method of coaching. “Ask them what’s exciting about this. Then ask what’s scary?” And, since the core skill of coaching is listening, “give them the time and space to answer and listen to what they say,” she says. They might not want to give up the thing they are good at to learn something hard. They might feel jealous of team members who get to keep their hands on the technology. They might fear that others aren’t good enough to do the work they’ve been doing. And they might not yet see the benefits of a leadership role. In the “tell” portion, point out the influence they will have on larger issues in the company, the essential role of managers on the team, the pleasure of helping people grow into larger careers, and how this will give them a seat at the table.


4 characteristics of enterprise application platforms that support digital transformation

The need to deploy applications across various cloud infrastructures—public cloud, private cloud, physical, virtual, and edge—based on business needs is a key requirement for most established enterprises. As more and more business value is created with the Internet of Things (IoT), edge computing, and artificial intelligence and machine learning (AI/ML), the need to deploy applications across these cloud providers from devices, edge datacenters, on-prem, and colocation providers to the public cloud ecosystem is growing exponentially. For an enterprise, a baseline application platform that can be deployed on all these cloud provider types is essential, if not vital, to support current and future business needs. Another aspect to consider is the growth and distribution of enterprise data. As the famous saying goes, "data is the new oil," and the amount and pace of enterprise data growth are unprecedented. Enterprises are looking at options to leverage this data to create meaningful business insights. 


How to Combine RPA and BPM the Smart Way

Seamless digital integration is more than just cobbling together the best digital solutions on the market. How these advanced technologies interact makes a huge difference. Technologies designed to work together are crucial to achieving the productivity gains promised by digital transformation. With a comprehensive platform, organizations don’t need to worry about building integrations because the platform already includes them. Moreover, a single platform is easier to buy and manage because it comes from the same licensor rather than going through the procurement process with multiple suppliers. Companies need to take care when determining which IA platform to adopt. The benefits of a comprehensive platform are increasingly recognized by vendors and their customers, pushing suppliers to put together multifeatured automation platforms. If companies choose a platform insufficient for their needs, they face reworking costs down the road. Nevertheless, suppose organizations have already taken on technical debt and are looking to rework their digital transformation journey.


5 Technologies Powering Cloud Optimization

Cloud cost management is a critical component of optimization that helps organizations to monitor and manage their cloud spend. The goal is to ensure that organizations are only paying for the cloud resources they actually need and that they are using those resources efficiently. ... Autoscaling is a technology that enables organizations to automatically scale their cloud resources up or down as needed to meet changing demands. The goal of autoscaling is to ensure that organizations always have the right amount of resources to support their workloads while minimizing costs and ensuring that their systems are always available when they are needed. Autoscaling works by monitoring the performance and usage of cloud resources, such as compute instances, storage and network traffic, and automatically adjusting the size of those resources to meet changing demand. ... An API gateway is a server that acts as an intermediary between an application and one or more microservices. The API gateway is responsible for request routing, composition and protocol translation, which enables microservices to communicate with each other securely and efficiently.
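The monitor-and-adjust loop behind autoscaling can be reduced to a small decision function. The sketch below is a minimal threshold-based policy; the thresholds, replica bounds, and CPU-only signal are simplifying assumptions, not any provider's actual autoscaler.

```python
# Minimal threshold-based autoscaling decision: scale out when average
# CPU is high, scale in when it is low, otherwise hold steady.

def desired_replicas(current, avg_cpu, scale_up_at=0.8, scale_down_at=0.3,
                     min_replicas=1, max_replicas=10):
    """Return the replica count the autoscaler should move toward."""
    if avg_cpu > scale_up_at:
        return min(max_replicas, current + 1)
    if avg_cpu < scale_down_at and current > min_replicas:
        return current - 1
    return current

print(desired_replicas(3, 0.9))   # heavy load: add a replica
print(desired_replicas(3, 0.1))   # idle: remove a replica
print(desired_replicas(3, 0.5))   # steady state: no change
```

Real autoscalers dampen this loop with cooldown periods and multi-metric signals (memory, queue depth, request latency) to avoid oscillating between scale-up and scale-down.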


Streaming Data Management for the Edge

Managing data at the edge is actually quite easy. What’s hard is how you monetize it. How do you get value from it? How do you take the data that’s streaming into the organization and analyze it, inference on it, and act on it as it’s coming in? How do you use this data to help your customer or stakeholder? Think about a retailer who’s trying to do in-store queue management, trying to identify situations where customers are abandoning their carts because the lines are too long, or where you’re trying to watch for theft, for shrinkage. It isn’t the management of the data that’s as big a challenge. It is the ability to take that data and make better operational decisions at the point of customer interaction or operational execution. That’s the challenge. And so, we need a different mental frame as well as a different data and analytics architecture that is conducive to the fact that this data that’s coming in, in real time, has value as it’s coming in. Historically, in batch worlds, we didn’t care about real-time data. The data came in. 


DevOps isn’t dead: How platform engineering enables DevOps at scale

Platform engineers could automate almost all this work by building it into an IDP. For example, instead of manually setting up Git repositories, developers can request a repository from the IDP, which would then create it. The IDP would then assign the right user group and automatically integrate the correct CI/CD template. The same pattern applies to creating development environments and deploying core infrastructure. The IDP acts as a self-service platform for developers to request services and apply configurations, knowing security best practices and monitoring are built in by default. IDPs can also automatically set up projects in project tracking software and documentation templates. As you can see, platform engineers don’t replace DevOps processes. They enhance them by building a set of standardized patterns into a self-service internal development platform. This removes the burden of project initialization so teams can start providing business value immediately, rather than spending the first few weeks of a project setting up and working through teething issues.
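The self-service pattern described above can be sketched as a function that expands a developer's request into an ordered provisioning plan. All names here (request fields, template identifiers) are invented for illustration; a real IDP would execute each step against Git hosting, identity, and CI/CD APIs.

```python
# Hypothetical sketch of an IDP expanding a developer's self-service
# repository request into concrete, ordered provisioning steps.

def plan_provisioning(request):
    """Turn a repository request into an ordered list of actions."""
    steps = [
        f"create git repository '{request['repo_name']}'",
        f"grant access to user group '{request['team']}'",
        f"attach CI/CD template '{request['ci_template']}'",
    ]
    if request.get("needs_environment"):
        steps.append(f"provision dev environment for '{request['repo_name']}'")
    return steps

req = {"repo_name": "payments-service", "team": "payments-devs",
       "ci_template": "java-service", "needs_environment": True}
for step in plan_provisioning(req):
    print(step)
```

The point of the pattern is that security defaults and monitoring are baked into the templates the plan references, so every team gets them without having to remember them.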


The Dos and Don‘ts of API Monetization

Before diving into best practices and antipatterns, let’s go over the core technical requirements for enabling API monetization:
Advanced metering: Because different customers may have different levels of access to APIs under varying pricing plans, it’s critical to be able to manage access to API requests in a highly granular way, based on factors like total allowed requests per minute, the time of day at which requests can be made and the geographic location where requests originate.
Usage tracking: Developers must ensure that API requests can be measured on a customer-by-customer basis. In addition to basic metrics like total numbers of requests, more complex metrics, like request response time, might also be necessary for enforcing payment terms.
Invoicing: Ideally, invoicing systems will be tightly integrated with APIs so that customers can be billed automatically. The alternative is to prepare invoices manually based on API usage or request logs, which is not a scalable or efficient approach.
Financial analytics: The ability to track and assess the revenue generated by APIs in real time is essential to many businesses that sell APIs. 
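The metering and usage-tracking requirements can be sketched together as a per-customer, per-minute quota counter. This is a simplified in-memory illustration; a production gateway would back the counters with a shared store and would also record latency and other billing metrics.

```python
# Sketch of per-customer request metering with a fixed per-minute
# quota, combining the "advanced metering" and "usage tracking"
# requirements above in their simplest form.
import time
from collections import defaultdict

class RequestMeter:
    def __init__(self, requests_per_minute):
        self.limit = requests_per_minute
        self.windows = defaultdict(int)   # (customer, minute) -> count

    def allow(self, customer_id, now=None):
        """Record a request; return False once the quota is exhausted."""
        minute = int((now if now is not None else time.time()) // 60)
        key = (customer_id, minute)
        if self.windows[key] >= self.limit:
            return False
        self.windows[key] += 1
        return True

meter = RequestMeter(requests_per_minute=2)
print(meter.allow("acme", now=0))    # True
print(meter.allow("acme", now=1))    # True
print(meter.allow("acme", now=2))    # False: quota used this minute
print(meter.allow("acme", now=60))   # True: new minute window
```

The same counters double as the raw usage data that invoicing and financial analytics consume downstream.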


How to unleash the power of an effective security engineering team

Security engineering teams should be able to build and operate the services they produce. You build it. You run it. This level of ownership within a group is vital from a technical competence standpoint and culturally, setting the tone around accountability. Technically speaking, a team that can own its services will proficiently manage infrastructure, CI/CD tooling, security tooling, application code, deployments, and the operational telemetry emitted from a service. In addition, the skills backing all that support as a team are likely to be highly transferable in support of other groups across the organization. A team that understands, embraces, and optimizes for DevX is likely to be favored. Beyond that, such a team will have a particular focus on eliminating friction. Friction makes things take longer and cost more, creates longer learning cycles, and can lead to frustration. Less friction will lead to things generally running much smoother. Sometimes friction is necessary and should be intentional. An example is a forced code review on critical code before it's merged. 



Quote for the day:

"Leadership is liberating people to do what is required of them in the most effective and humane way possible." -- Max DePree

Daily Tech Digest - February 08, 2023

The Value of Human Involvement Over Automation

Proponents of this type of automation will say that it saves time in the long run, providing answers and solutions quickly. Automated customer support also allows people to receive help 24/7, eliminating the issue of having to wait until standard business hours to get an issue resolved. Automatic billing, chatbots, and AI-assisted customer service have increasingly become the standard, particularly for e-commerce vendors. When a business owner has an issue — especially if it involves money, fees, or overcharges — dealing with an endless series of automated responses can get frustrating. If those automated responses do not lead them to the help they ultimately need, that frustration can (and likely will) compound further. Even the most intuitive automated solutions can’t replace a human’s personality, emotional response, and instinctual capabilities to adapt to any situation. ... Consumers and entrepreneurs alike are craving human connection in today’s increasingly automated world, so much so that many brands are making the personal relationships they foster with customers an angle of their sales strategy.


CISO Role Undergoes Evolution as Jobs Grow More Complex

“CISOs should also work closely with legal and compliance teams to ensure that the organization is meeting its regulatory requirements,” he says. “Additionally, a CISO should develop strong relationships with other security leaders in the organization, as well as with external partners such as vendors and industry groups.” He points out that other security leadership roles are also changing, including the chief security officer and the chief privacy officer, with new roles emerging such as the chief data officer and the chief digital officer. MacLeod says as the threat landscape evolves in 2023, the complexity and sophistication of threats will require CISOs to be more proactive in identifying and mitigating risks. ... Eisenberg adds that increased scrutiny of organizations’ spending for essential services will require the CISO to rationalize strategic investments in cybersecurity to the business, to counter the ever-increasing complexity and velocity of cyber threats to business resilience.


OpenAI makes a ChatGPT-like tool called Codex that can write software.

If an AI service can now write code for developers, it raises the question of how it will impact students studying computer science, and if highly paid software-engineering jobs will vanish. Still, software-engineering skills will continue to be in high demand, and AI services like Codex are just a natural step as programming becomes progressively easier over generations, according to academics and experts in computer-science education. Codex has been available to developers since 2021 in the form of GitHub Copilot. And OpenAI, which has raised more than $10 billion from Microsoft and other backers, has been investing more in Codex of late, hiring more than 1,000 contractors to write code and associated descriptions that will help Codex learn to become a better programmer, Semafor reported. Codex and ChatGPT are a "huge productivity enhancer," and many programmers are already using Codex in their day-to-day workflows, Christopher Manning, a professor of computer science and linguistics at Stanford University, said. 


Centralizing governance: An increasingly pertinent consideration

Historically it was common for large multinationals to have lawyers around the world. Different jurisdictions have different laws, and it is beneficial to have local specialists that are close to the business. Things are changing, however. There are more opportunities to outsource specific legal support to alternative legal service providers, or to create an internal shared services operation through the use of innovative legal technology, such as centralized contract and knowledge management and cloud-based workflow tools. This makes the centralization of certain legal activities, such as entity management, more feasible and more efficient. The downside of a decentralized legal structure in today’s world is becoming increasingly clear. From a management perspective, it is not always easy to align business operations in a decentralized management structure. Decentralized structures have their own authority and bring with them tension between a local and a global perspective.


8 steps to turning around a toxic IT culture

An IT leader can’t simply create a positive culture with the wave of a hand. Culture is a function of leadership execution. “Leading by example is the way to start a turnaround,” says Fredrik Hagstroem, CTO of Emergn, a digital business services firm. A clear vision guides team direction. It’s like a compass that reliably points everyone in the correct direction. Even when things are complex or changing, having a goal helps everyone become aligned, Hagstroem says. “Good leadership that drives collaboration and trust will be evident in frequent use of collective and inclusive pronouns, such as ‘we,’ ‘us,’ and ‘ours’ — meaning everyone in the company.” Strong leadership ensures that vision, strategy, and goals aren’t just clearly understood, but attractive and motivating. “Leaders must demonstrate that collaboration and contributions to common goals are more important than individual performance,” Hagstroem says. He advises shifting the focus of professional relationships from urging team members to meet personal objectives and responsibilities to achieving team goals. 


Implementing an Emerging Risk Analysis Process

While looking for emerging risks, we search for information about our targeted subjects and sweep the events that are happening. The reliability of the information collected has strategic importance in this study. This activity is generally called horizon sensing or horizon scanning. The effort is about looking into the future as widely as possible while trying to see the relevant indicators from the news, trends, cultural tendencies, political events, and natural events. One can easily get lost in this phase or drown in a vast amount of information from the search, since the source of information is not filtered or tested. Unless the information is collected from an official, reliable and reputable source, we need to take it with some healthy skepticism. Otherwise, our results may misinform the strategy level and lead to more significant failures. The information collection has two parts: one is to see what is happening in the world relevant to the business, and the second is the internal view. We look inside the organization and discuss the identified events or changes that may contribute to define emerging risks with relevant stakeholders. 


How data governance and security is transforming shop floor operations in manufacturing

Manufacturers often incur substantial operational costs and invest in in-house software. Also, there are expensive maintenance and other infrastructure costs that manufacturers must keep in mind. As a result, cloud solutions seem better options than traditional on-premises systems. The cloud computing solution can be easily customised as per manufacturer needs with monthly or annual subscription models, thus making a move cost-effective and time-efficient for the manufacturers.
Flexibility – With cloud technology, manufacturers can now embrace Agile manufacturing methods that leverage flexibility, rapid iteration, innovation, and augmentation to adapt to technological changes. With the growth of data availability and expanded analytics capabilities, manufacturers can act on real-time data and identify and solve problems promptly to keep up with market demands and maintain a competitive edge.
Accelerated Deployment – Shop floor transformations are often associated with expensive and complex legacy MES solutions. 


How to Find the Right Data Governance Model and Derive Principles from Corporate Governance

It is critical to distinguish the term “governance” from “management” in the context of data governance. It should be noted that the principal difference is that governance refers to the decisions that must be made and who must make them. This is to ensure effective resource allocation and management of data operations. On the other hand, data management involves implementing the decisions that arise from assessing and monitoring either existing controls or the environment, including advancements in technology and the market. The activities required for data governance can be distinguished from those needed for data management because management is influenced by governance. Data governance is oversight of data management activities to ensure that policy and ownership of data are enforced in the organization. The emphasis is on formalizing the data management function along with the associated data ownership roles and responsibilities. 


NIST Selects ‘Lightweight Cryptography’ Algorithms to Protect Small Devices

“The world is moving toward using small devices for lots of tasks ranging from sensing to identification to machine control, and because these small devices have limited resources, they need security that has a compact implementation,” she said. “These algorithms should cover most devices that have these sorts of resource constraints.” To determine the strongest and most efficient lightweight algorithms, NIST held a development program that took several years, first communicating with industry and other organizations to understand their needs and then requesting potential solutions from the world’s cryptography community in 2018. After receiving 57 submissions, McKay and mathematician Meltem Sönmez Turan managed a multi-round public review process in which cryptographers examined and attempted to find weaknesses in the candidates, eventually whittling them down to 10 finalists before selecting the winner. “We considered a number of criteria to be important,” McKay said. 


Doing More With Less: How to Survive an IT Budget Cut

Each leader addresses budget cuts in their own way, since there are many drivers, objectives, and desired outcomes that power the need to reduce spending, says Scott Schlesinger, US data and analytics lead at management consulting firm PA Consulting. “Regardless of the reason for the cuts, how the organization executes the strategy, and how effectively the cuts are done, can ultimately be a major factor in the very survival of some organizations.”  Budget cuts can also open the opportunity to reevaluate existing projects. “Are in-flight projects delivering the intended business and technology outcomes?” asks Tim Potter, a principal at business advisory firm Deloitte Consulting. “A thorough assessment of all IT projects is important to validate the achievable ROI and relative importance of each program.” He notes that a budget reevaluation can also help IT leaders to avoid the sunk-cost fallacy -- the tendency to continue investing in projects that aren't yielding the desired benefits simply because a significant amount has already been invested in the project.



Quote for the day:

"No matter what accomplishments you make, somebody helps you." -- Althea Gibson

Daily Tech Digest - February 07, 2023

The On-Premises Empire Strikes Back At AWS

Anybody who can read a financial report knows they are paying too much for compute, storage, networking, and software at Amazon Web Services. It is as obvious as the sun at noon. And it is also obvious, and increasingly so, that the retailing and media businesses at Amazon are as addicted to the AWS profits and the free compute capacity it gives to the corporate Amazon parent as IBM has ever been addicted to the high price of its vaunted mainframe platforms. And so Amazon, and therefore AWS, is stuck between a rock (increasing competition from Microsoft Azure and Google Cloud, and to a lesser extent Alibaba, Tencent, IBM Cloud, and a bunch of niche cloud builders) and a hard place (customers' desire to move back to on-premises IT operations, usually in a co-location facility, to try to save money compared to buying cloud capacity and software). ... The bad news for Amazon is that this pinch is coming as the parent company is spending vast sums building its media empire and further automating and expanding its electronic retailing and grocery operations. 


Deduplication and Data Stewardship Process in MDM

Match merge is a process that can help identify duplicate data in master data. It takes data from different systems and looks for duplicates or exact matches (and merges them if necessary) to make a "golden copy" of the record. ... The matching process includes match columns and match rules that help recognize similar records in the database, determine customer records for automatic consolidation, and determine documents that a data steward should review before consolidation. The matching process consists of two basic techniques (fuzzy match and exact match) that help identify duplicates. Fuzzy matching is the slower method: records are matched based on misspellings, transpositions, word combinations, splits, omissions, and phonetic variances. Exact matching is quicker, comparing records whose match columns are identical. Consolidation is the next step after the matching phase: the queue of matched records is sent to the merging process. The merged data, once approved, is known as the "golden record."
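The two techniques can be sketched side by side: exact match compares normalized match columns for identity, while fuzzy match tolerates misspellings via a similarity score. The sketch uses Python's standard `difflib`; MDM tools apply far richer phonetic and edit-distance rules, and the threshold here is an assumption for the example.

```python
# Sketch of the two basic matching techniques used in match merge:
# exact match on normalized columns, and fuzzy match via similarity.
from difflib import SequenceMatcher

def exact_match(a, b, columns):
    """Fast path: records match if all match columns are identical."""
    return all(a[c].strip().lower() == b[c].strip().lower() for c in columns)

def fuzzy_match(a, b, column, threshold=0.85):
    """Slower path: tolerate misspellings and small variances."""
    ratio = SequenceMatcher(None, a[column].lower(), b[column].lower()).ratio()
    return ratio >= threshold

r1 = {"name": "Jonathan Smith", "email": "jsmith@example.com"}
r2 = {"name": "Jonathon Smith", "email": "jsmith@example.com"}
print(exact_match(r1, r2, ["email"]))   # True: identical column
print(exact_match(r1, r2, ["name"]))    # False: one letter differs
print(fuzzy_match(r1, r2, "name"))      # True: similarity above threshold
```

A typical pipeline runs the cheap exact pass first and reserves fuzzy matching, plus data-steward review, for the records that survive it.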


Machine Learning Is as Easy as an API, Says Hugging Face

AI right now seems like the domain of elite experts, but startup Hugging Face plans to “democratize good machine learning” by making it as easy as deploying a REST API. This isn’t theoretical — it’s possible now, with use cases in frontend and web applications, explained Jeff Boudier, head of product and growth at the startup. Hugging Face offers open source machine learning (ML) models for free on its community site, while charging a fee for infrastructure and service support. ... Boudier estimated customers have deployed about 20,000 projects using AI technologies on their platform. “Frontend, backend, mobile — it all works because we abstract away all of the machine learning and the infrastructure around the machine learning, so at the end of the day, it’s a REST API that you can send requests to — whether it’s from your frontend JavaScript or it’s your backend, or from a mobile client,” Boudier said. “We have a ton of developers and AI startups that are using our models and the inference endpoints … to power user-facing experiences.”
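The "it's just a REST API" claim can be illustrated by the shape of such a request. The sketch below assembles the pieces a client would POST; the URL is a placeholder and the bearer-token-plus-JSON-inputs shape follows the general pattern of hosted inference APIs rather than an exact Hugging Face contract.

```python
# Illustrative shape of a request to a hosted model inference
# endpoint: a URL, a bearer token, and a JSON body with the input.
import json

def build_inference_request(endpoint_url, api_token, text):
    """Assemble the HTTP pieces a client would POST to the endpoint."""
    return {
        "url": endpoint_url,
        "headers": {"Authorization": f"Bearer {api_token}",
                    "Content-Type": "application/json"},
        "body": json.dumps({"inputs": text}),
    }

req = build_inference_request(
    "https://example.com/models/sentiment",  # placeholder endpoint
    "hf_xxx",                                # placeholder token
    "I love this!")
print(req["body"])
```

Any HTTP client, from browser JavaScript to a mobile app, can send this request, which is what lets the ML and its infrastructure stay abstracted away from the caller.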


How Quantum Computing Could Affect the Automotive Landscape

Although QC affects all modern digital systems, vehicles are more susceptible to the dangers of QC for several reasons:
Vehicles have a relatively long life cycle. New vehicles entering the market today will stay on the road for approximately 15 years, with the current average age of a passenger car in the US being around 12 years and rising. Electric cars are expected to have even longer life spans.
ECUs are typically harder to update than personal computers and mobile devices. This is especially true for their cryptographic capabilities, which are often implemented in a dedicated hardware component, called a Hardware Security Module (HSM) or Secure Hardware Extension (SHE), making it impossible to change the cryptographic schemes without a change of hardware. Some ECUs can be updated seamlessly over the air, but many can only be updated in registered service stations using dedicated tools.
Vehicles are composed of dozens of ECUs (sometimes more than 100), with the number of ECUs constantly rising in recent years. 


Embracing The Human Element Of Security Automation

Security incident response is historically people-focused: The security team identifies risk and then addresses it. Automation is a compelling solution to the overwhelming amount of security tasks. Still, many organizations are trying to take what is currently a fully manual process and transcribe it almost one-for-one into an automated process. In many cases, this doesn’t work because it fails to account for the institutional knowledge of the security team. Hesitant business leaders should feel confident in investing in automation knowing that the human element is vital to successful implementation. Think of automation as an extension of the security team, not a replacement. Security teams are still your biggest asset, especially in crisis moments when their deep institutional knowledge can give important context and insight that may not be a part of a standard checklist. Many organizations are increasingly seeing the benefit of security automation and the fact that it allows strapped security teams to focus on higher-priority tasks by removing the mundane, repetitive tasks from their plates, rather than taking jobs away.


Best Practices for Promoting Information Security Awareness

The preferred approach to delivering education on information security concepts is not simply a switch to smaller, bite-sized courses, but training that is targeted to the role of the learner. While everyone can benefit from knowledge about the basics of information security and how to protect their online footprint, certain positions and academic disciplines require less exposure to the technical components. Rather than disseminate a broad view of topics, it is recommended that content be segmented and factor in ample time for discussion and absorption of the material. Advanced concepts like penetration testing and audits should be reserved for those in technical positions or seeking a similar academic discipline. Keeping curricula generic makes the content you deliver suitable for a broader target audience. Imposing restrictions, such as 45-minute lectures with a maximum of 20 slides, offers the best pacing and depth of material. Including real-world examples, such as a scenario in which an individual receives a phishing email and must decide how to respond, helps put the information into context for the learner.


The profound danger of conversational AI

Whatever we call them, these systems present terrifying vectors for misuse and abuse. I'm not talking about the obvious danger that unsuspecting consumers may trust the output of chatbots trained on data riddled with errors and biases. No, I'm talking about something far more nefarious: the deliberate manipulation of individuals through the targeted deployment of agenda-driven conversational AI systems that persuade users through convincing interactive dialog. Instead of firing buckshot into broad populations, these new AI methods will function more like "heat-seeking missiles," marking users as individual targets and adapting their conversational tactics in real time, adjusting to each individual personally as they work to maximize their persuasive impact. At the core of these tactics is the relatively new technology of LLMs, which can produce interactive human dialog in real time while also keeping track of the conversational flow and context.


Bolstering cybersecurity readiness in the cloud

Firstly, the introduction of Business Risk Observability comes in response to a growing trend of security and observability gradually converging, with organisations now able to leverage a business risk scoring solution via Cisco Secure Application in Cisco AppDynamics. This first-of-its-kind solution is planned to enhance Cisco's Full-Stack Observability application security solution, combining Kenna Risk Meter score distribution and Business Transactions from Cisco AppDynamics, and integrating with Panoptica for API security and Talos for threat intelligence. ... While authentication often proves effective at preventing unauthorised access, there is always a danger of user fatigue developing as a result of too many repeated requests. To help mitigate this, Cisco has launched Duo Risk-Based Authentication, which utilises remembered devices and a Wi-Fi fingerprint to recognise legitimate users based on commonly used Wi-Fi networks, while preserving privacy by not storing location data.
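The idea behind risk-based authentication can be illustrated with a small sketch: known devices and familiar (hashed) Wi-Fi networks lower a risk score, and the user is only prompted for step-up authentication when the score crosses a threshold. All names, weights, and thresholds below are hypothetical illustrations, not Cisco Duo's actual API or algorithm.

```python
# Hypothetical sketch of risk-based authentication: familiar signals
# (remembered device, known Wi-Fi network) reduce friction; unfamiliar
# signals trigger a step-up authentication prompt.
from dataclasses import dataclass, field
from hashlib import sha256


@dataclass
class UserProfile:
    remembered_devices: set = field(default_factory=set)
    # Store only salted hashes of network identifiers, never raw
    # location data, to preserve privacy.
    known_wifi_fingerprints: set = field(default_factory=set)


def wifi_fingerprint(ssid: str, salt: str = "per-user-salt") -> str:
    """Privacy-preserving fingerprint of a Wi-Fi network."""
    return sha256(f"{salt}:{ssid}".encode()).hexdigest()


def risk_score(profile: UserProfile, device_id: str, ssid: str) -> int:
    """Return 0-100; higher means more risk and more auth friction."""
    score = 50  # neutral baseline for an unrecognised context
    if device_id in profile.remembered_devices:
        score -= 30  # a remembered device strongly lowers risk
    if wifi_fingerprint(ssid) in profile.known_wifi_fingerprints:
        score -= 20  # a commonly used network lowers risk further
    return max(score, 0)


def requires_mfa_prompt(score: int, threshold: int = 20) -> bool:
    """Only step up authentication when risk exceeds the threshold."""
    return score > threshold
```

In this sketch, a login from a remembered laptop on a known home network scores 0 and proceeds silently, while an unknown device on an unfamiliar network scores 50 and triggers an extra prompt, which is how such systems reduce the user fatigue described above.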


Artificial Intelligence and ChatGPT in The Aspirational India

There is a serious threat looming over another AI, Aspirational India, as well as an opportunity. These are the final 12-14 months before the general elections in 2024, with elections in some key states along the way. We have already seen that India stands out as a bright spot in an otherwise bleak global economic outlook. Aspirational India, which has risen since the 1991 liberalisation under PV Narasimha Rao, saw growth in the mid-2000s, faced the global recession of 2009-2011, and recovered from the double whammy of Covid and the much-needed GST regime consolidation. As we begin to recover, we may see politics eclipsing the economic gains made. In the near future, we may also see security issues that call India's story into question. The neighbours have their own internal issues, but they are not afraid of conflict. We have already seen recent attacks on the incumbent party: a biased piece of journalism that challenges Indian institutions, and targeted attacks on some financial houses that again tarnish financial systems by suggesting crony capitalism. India's story may be challenged and tarnished in the international arena.


Top 6 SaaS Security Threats for 2023

SaaS-to-SaaS app integrations are designed for easy self-service installation, boosting efficiency and functionality. However, these features pose a security nightmare. The challenge is centered on the increasing volume of apps connected to the company's SaaS environment: on average, thousands of apps are connected without the approval or knowledge of the security team. Employees connect these apps, often to boost productivity, enable remote work, and better build and scale a company's work processes. However, when connecting apps to their workspaces, employees are prompted to grant the app access permissions. These permissions include the ability to read, create, update and delete corporate or personal data, to say nothing of the risk that the app itself could be malicious. By clicking "accept," employees can grant permissions that enable threat actors to gain access to valuable company data. Users are often unaware of the significance of the permissions they've granted to these third-party apps.
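One mitigation for this consent-screen blind spot is to periodically audit the OAuth-style scopes granted to third-party apps and flag the dangerous ones. The sketch below is a minimal illustration; the scope names, grant records, and `HIGH_RISK_SCOPES` list are all hypothetical, not drawn from any particular SaaS platform's API.

```python
# Hypothetical audit of SaaS-to-SaaS grants: flag third-party apps
# whose granted scopes allow writing, deleting, or sending corporate
# data, so the security team can review them.
HIGH_RISK_SCOPES = {"files.write", "files.delete", "mail.send", "admin.users"}


def audit_grants(grants: list[dict]) -> list[dict]:
    """Return the grants containing at least one high-risk scope,
    annotated with the scopes that triggered the flag."""
    flagged = []
    for grant in grants:
        risky = sorted(set(grant["scopes"]) & HIGH_RISK_SCOPES)
        if risky:
            flagged.append({**grant, "risky_scopes": risky})
    return flagged


# Example grant inventory, as a security team might export it.
grants = [
    {"app": "calendar-helper", "scopes": ["calendar.read"]},
    {"app": "bulk-mailer", "scopes": ["mail.send", "contacts.read"]},
]
```

Running `audit_grants(grants)` on the sample inventory flags only `bulk-mailer`, because its `mail.send` scope could let a compromised or malicious app exfiltrate data on the user's behalf, exactly the risk employees rarely consider when they click "accept."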



Quote for the day:

"It is, after all, the responsibility of the expert to operate the familiar and that of the leader to transcend it." -- Henry A. Kissinger