Daily Tech Digest - March 30, 2023

5 cyber threats retailers are facing — and how they’re fighting back

Retailers are vulnerable to a range of direct e-commerce cyber threats far beyond ransomware. They include hackers altering gift cards and/or the systems used to activate and manage them, swapping barcodes on products to deceive self-checkout systems, defrauding return services via online return forms to obtain refunds for ordered items, hijacking customer accounts to steal their personal information, and stealing credit card numbers through digital skimming. Bot attacks on e-commerce sites are another threat that can’t be ignored. These automated scripts can use a browser to emulate human behavior, including mouse movements and clicks, making them difficult to detect. Advanced bots can hide their real location by routing traffic through anonymous proxies, anonymization networks, or through public cloud services. Bots can facilitate account takeover, through which hackers make fraudulent purchases using data from customer accounts such as gift cards, discount vouchers, and loyalty points, and even saved credit card information.


Google ambushes on-prem PostgreSQL with AlloyDB Omni

Self-managed AlloyDB Omni provides a pathway to modernize legacy databases on-premises before moving to the cloud, analysts said. “Database migrations can be complex and costly, especially when combined with migration from on-premises infrastructure to cloud. AlloyDB Omni provides a pathway for organizations to modernize those workloads in-place by migrating to AlloyDB Omni on-premises,” said Matt Aslett, research director at Ventana Research. “This move can be seen as one step prior to a potential move to the AlloyDB managed service, or with a view to retaining the workloads in on-premises data centers or on edge infrastructure due to sovereignty or performance requirements,” he added. According to Omdia’s Chief Analyst Bradley Shimmin and dbInsight’s Principal Analyst Tony Baer, AlloyDB Omni combines the best of open-source PostgreSQL and Google Cloud’s architecture, making it more appealing than rival services such as AWS Aurora for PostgreSQL and Microsoft’s CitusDB, among others.


How to Succeed As a New Chief Information Security Officer (CISO)

As a CISO, you probably have an endless to-do list of vital chores that can keep you preoccupied. For this reason, you may be cut off from your coworkers and superiors, limiting your exposure to strategic and operational information shared through informal channels such as one-on-ones, small group brainstorming sessions, and, yes, even boring meetings. Stay in touch with your mentor(s) as you make this shift. Having a clear idea of your challenges and working with a coach can help you navigate your first 90 days as CISO and adjust more smoothly. Participate in the discussion to better understand the company’s goals, potential, and threats. Building productive relationships with employees and other divisions is crucial to your success as a chief information security officer. Coordinate early on with the major players by setting up a meeting schedule. Determine which divisions you will work with, such as legal, audit, risk, marketing, and sales. As a result, you will be better able to establish connections to facilitate the rollout of cybersecurity awareness campaigns and related policies. The CISO needs to work in tandem with other executives.


Why it is time all businesses became data-driven

Encouragingly, the future of data skills is in safe hands. Data from the British Computer Society (BCS) recently revealed that interest in computing degrees is growing faster than in other courses. According to BCS data, 92,980 18-year-olds applied to start computing degrees this year in the UK, a 9.6% rise, demonstrating the sector’s continued appeal. The Society believes that increasing interest in the degree course was likely a result of the higher profile of AI and a realisation about the career prospects for computing graduates in areas like cyber security and climate change data science. For those organisations looking to increase their data skills sooner, though, several programmes in the market can match you with a data scientist. As the now old adage goes – data is the new oil. With its value likely to keep increasing, those organisations that do not recognise it will undoubtedly be left behind, allowing competitors to overtake them. With no sign of market conditions easing, now is the time to evolve. Not doing so could be devastating.


Clouds vs Edges: Which Computing Wins the Race?

One of the key benefits of edge computing is its ability to reduce latency, or the delay between a user’s request and the response. With traditional cloud computing, data is sent to a central data center for processing, which can result in latency that is too long for certain applications. Edge computing can help address this issue by bringing computation closer to the source of the data, reducing the time it takes to process the data. Another advantage of edge computing is its ability to support real-time data processing and analysis. Edge computing allows applications to react quickly to changes in data by processing it closer to where it is being produced, which is crucial for applications that require real-time data processing, like self-driving cars and industrial automation. ... Both edge computing and cloud computing are critical components of modern computing infrastructure, and each has its advantages and disadvantages. Edge computing enables data processing and analysis to occur closer to the source of data, resulting in lower latency, improved reliability, and greater security. 


CEOs feel responsible for security, but ill at ease on stepping up

This gap in perception, according to the research, lies partly in the meaning of accountability: instead of seeing themselves as accountable – being the face of the mistake – CEOs should assume co-responsibility for cyber resilience together with their CISO. Second, CEOs should stay away from blindly trusting their technology teams. Instead, they should move to a state of informed trust about their enterprise’s cyber resilience maturity. Third, CEOs should embrace what the authors call the “preparedness paradox” — an inverse relationship between the perception of preparedness and resilience. The better-prepared CEOs think their organisation is for a serious cyberattack, the less resilient their organisation likely is, in reality. And fourth, CEOs should adapt their communication styles to regulate pressure from external stakeholders who have different and sometimes conflicting demands. Depending on the stakeholder and the situation, CEOs should either be a transmitter, filter, absorber or amplifier of pressure.


Why exams intended for humans might not be good benchmarks for LLMs like GPT-4

Exams designed for humans assume that the test-taker already possesses these preparatory skills and knowledge, and therefore do not test them thoroughly. On the other hand, language models have proven that they can shortcut their way to answers without the need to acquire prerequisite skills. “Humans are presumably solving these problems in a different, more generalizable way. Thus we can’t make the assumptions for LLMs that we make for humans when we give them tests,” Mitchell said. For instance, part of the background knowledge for zoology is that each individual is born, lives for a while and dies, and that the length of life is partly a function of species and partly a matter of the chances and vicissitudes of life, says computer scientist and New York University professor Ernest Davis. “A biology test is not going to ask that, because it can be assumed that all the students know it, and it may not ask any questions that actually require that knowledge. But you had better understand that if [you’re going to be] running a biology lab or a barnyard,” Davis told VentureBeat.


12 Places to Intervene - Rethink FinOps Using a Systems Thinking Lens

When we analyze a problem, we often look for certain points or places where we can focus our efforts to gain maximum leverage on the system and achieve our desired outcome. For example, if you need to lift a motorbike that has fallen over, you don’t pull up from every part you can hold on to. Instead, you find the best part and way to hold it (based on the bike design and your strengths) so you can pull it up with the least effort and damage. Similarly, for complex socio-technical systems, Donella Meadows, in her book Thinking in Systems, proposes 12 places where you can "intervene" to achieve maximum impact. These are known as leverage points or points of intervention. A system intervention is a deliberate effort to change or improve a system’s behavior, processes, or outcomes. It involves identifying problems and implementing changes to improve the overall functioning of the system. Donella Meadows introduces the 12 leverage points in increasing order of leverage.


Employee engagement: Why it matters

A dominant theme that emerged in Professor Heskett’s research is that organizations “lack leadership talent with the attitude, training, and willingness to devote time to the difficult task of engagement.” Too often, he points out, “an annual employee survey is taken, trends analyzed and reported back, opportunities for improvement discussed … and management returns to handling other primary responsibilities.” Dr. Heskett’s research also suggests that managers often put too much emphasis on “making the numbers” to the exclusion of other goals. On the other hand, if employee engagement is the goal, “making the numbers” would come much more easily. According to Heskett, engaged workers are more likely to remain on the job, are more productive and higher-performing, and foster higher levels of customer engagement – all of which boost profits and growth. Assessing and evaluating workplace engagement levels can be challenging, and there’s no single standard approach.


Examining key disciplines to build equity in the IT workplace

It can be difficult for women to have a sense of belonging when facing these challenges. At the senior tech leadership table, there is underrepresentation of women and even more underrepresentation of Black women. So resilience, fuelled by self-reliance and confidence, helps to navigate a career path. “Being in a minority can bring self-doubt, especially if you’re in an environment that isn’t supportive or causes doubts,” she says. “So know the value you bring to the table and the difference you’re making. Some environments are going to appreciate this more than others, but it’s important you don’t let others minimize your contributions. For example, if you work hard and lead your team to launch a tech solution that positively impacts the organization’s bottom line, that is value you can quantify. Having said that, we still have a way to go: women in tech are still being overlooked and passed over for promotions. The numbers are getting better, but we’re not there yet.”



Quote for the day:

"Teamwork is the secret that make common people achieve uncommon result." -- Ifeanyi Enoch Onuoha

Daily Tech Digest - March 29, 2023

9 Qualities of a Successful CTO

No one expects CTOs to be fortune tellers, but they do need to have a strong sense of what’s going on in the technology marketplace. A good CTO anticipates what is likely to come along in terms of new products, features, and challenges to address. You can be the best technologist and strategist in the world, but it won’t matter if you are unable to communicate those strategies in a way that speaks to your audience. “To excel as a CTO, it is essential to have a keen ability to identify technology trends ahead of the curve,” says Aron Brand, CTO of CTERA, a provider of cloud-based products. “A successful CTO is always on the lookout for the latest advancements in technology, having a deep understanding of the industry and anticipating future developments,” Brand says. “This allows them to make informed decisions about which technologies to invest in and which to avoid. They have the foresight to see the big picture and understand the long-term impact of their decisions, while also considering the immediate needs of the organization.”


ChatGPT Data Breach Confirmed as Security Firm Warns of Vulnerable Component Exploitation

According to OpenAI’s investigation, the titles of active users’ chat history and the first message of a newly created conversation were exposed in the data breach. The bug also exposed payment-related information belonging to 1.2% of ChatGPT Plus subscribers, including first and last name, email address, payment address, payment card expiration date, and the last four digits of the customer’s card number. This information may have been included in subscription confirmation emails sent on March 20 and it may have also been displayed in the subscription management page in ChatGPT accounts on the same day. OpenAI has confirmed that the information was exposed during a nine-hour window on March 20, but admitted that information may have been leaked prior to March 20 as well. “We have reached out to notify affected users that their payment information may have been exposed. We are confident that there is no ongoing risk to users’ data,” OpenAI said in a blog post.


5 ways to tell you are not CISO material

By definition, a CISO's role is to manage cyber risk. That involves assessing and managing risk across the enterprise and making choices based on those assessments. If you are not able to make risk-based decisions or have a hard time figuring out how to prioritize threats — particularly in high-pressure, high-stress situations — you probably want to steer clear of the CISO role. The same is true if you have a tendency to avoid taking responsibility for your decisions and actions. The CISO role is not for individuals who are averse to taking responsibility for an action they might advocate or implement, according to Chris Pierson, founder and CEO of BlackCloak. "If you approach things from the perspective of a CYA, adversarial, or risk avoidance mentality then you may decrease your ability to partner with others to achieve a combined mission or goal," Pierson tells CSO. "Being someone who cannot tolerate or own risk may impact your ability to operate effectively and turn off other people to partnering with you."


Being Responsible for Data

There are four possibilities listed in the article for what could happen, and I find them fascinating from a data analysis standpoint. Essentially a ruling against tech companies could shape how many of these companies process data in the future. While we might like to ensure these companies do not promote harmful content, think about this from the data analysis view. Do you want these companies moderating how they provide results? Would this mean that we need to more carefully craft our search terms? In the context of tremendous floods of information, we often depend on Google, Bing, or some search algorithm to distinguish among the various meanings of words to bring back results relevant to us. At the same time, we might wish that everyone got the same results from the same search terms. Separate from the results, what about related results or suggested items that might be related? I find the quality of these can vary for me, but often there is something "sponsored" or "I might like" that is helpful to me. Or just interesting.


How CISOs Can Reduce the Danger of Using Data Brokers

"Due to the increasing regulatory compliance framework regarding data collection notice and consent, there are data brokers that have huge subsets of their data that is not 'clean' and they cannot make reps and warranties about it to third parties that want to leverage that data," says Sean Buckley, an attorney with law firm Dykema who specializes in data privacy issues. "The risk to the data broker circles back to whether their data is 'clean' and whether they can prove it if necessary." ClearData CISO Chris Bowen argues that data tracking is critical when dealing with purchased files, but it can also prove quite difficult — even impossible — if the organization didn't tag it sufficiently from the beginning. "You need to closely track where the data lives and where it flows," Bowen says. "You need to tag the source of each field in the database. You need consistent links through petabytes of data, structured and unstructured." Most security executives are not comfortable with this approach because dataflow analysis is outside of their usual remit, he adds.


Unlocking Digital Business Transformation Success

It is a common misconception that technology is at the heart of digital transformation. Although technology could create exponential possibilities in the current digital economy, it is really the transformation part—the journey an organization takes with its ecosystem of people—that creates the solid foundation to accelerate these opportunities. Buy-in is essential for achieving long-term sustained success in the context of digital business transformation, where initiatives may be complex and significantly impact the business. There are a few inventive strategies that can successfully gain people’s buy-in and influence them to change both their attitudes and behaviors. To start, trust and empathy are the foundational components that lay the groundwork for buy-in and effective collaboration, and they are at the center of digital transformation strategies. The role of the leadership team shifts from being directive to one that promotes a safe, open and trustworthy environment. Another key element in engaging the human element for transformation is to focus on adding value.


UK DPDI bill seeks to reform GDPR - here's what you need to know

"Clarifications around legitimate interests, scientific research and automated decision making are bound to make it easier for companies to explore the potential of new technologies and AI without worrying about the risk of technical non-compliance with rules that lack clarity," he told TechRadar. From a user's perspective, the proposal is said to also be advantageous in coping with the issues of the so-called "pop-up fatigue." The term describes the act of consumers clicking away their rights of privacy in order to escape repetitive and annoying cookies. "But users will probably see little practical differences. Cookie consents will still be needed for many advertising-related cookies (and many businesses may adopt a single EU-level approach). This is at least until browser based controls are more developed," said Patrikios ... "The DPDI Bill is a power grab by the government that will undermine data rights in the UK. The bill weakens data subjects rights and corporate accountability mechanisms, politicizes the ICO, and expands the Secretary of State’s powers in numerous, undemocratic ways," said Abigail Burke


Why a College Degree is No Longer Necessary for IT Success

“Some of the most talented, brilliant technical professionals I know … who are currently leading top tech research roles and holding executive positions at prestigious organizations, do not have degrees,” she says. “Smart organizations recognized their talent; their success speaks for itself.” With many IT skills, including software development and data science, it’s important for learners to gain hands-on experiences where they're practicing and applying their skills in real-time, observes Mike Hendrickson, vice president of tech and dev products at educational technology firm Skillsoft. “Many online learning platforms provide interactive, flexible training solutions that meet people where they are, whether they're learning independently or within their organization.” Hendrickson believes that online training can be far more efficient than a four-year college program. “Another benefit is this training can be tailored to company-specific or industry-focused content and solutions, so learners can practice and apply their skills to real work environments and scenarios,” he explains.


6 ways to avoid and reduce data debt

Like technical debt, data debt is easier to identify after its creation. Data debt often requires teams to refactor or remediate the issues before building data pipeline improvements or new analytics capabilities. Implementing best practices that minimize new data debt is harder, especially when teams can’t predict all the future analytics, dashboarding, and machine learning use cases. Michel Tricot, cofounder and CEO of Airbyte, says, “Debt is not bad. However, debt needs to be repaid, which should be the focus because important decisions will be made with the data.” ... “Data observability is when you know the state and status of your data across the entire life cycle,” says Grant Fritchey, devops advocate at Redgate Software. “Build this kind of observability when you set up a dataops process to know if and where something has gone wrong and what’s needed to fix it.” Grant also says that data observability helps communicate data flows to business users and establishes an audit trail to support debugging and compliance audits.
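
A rough sketch of what such observability can look like in practice: the Python snippet below captures row counts, per-column null rates, and data freshness for one pipeline stage. The dataset, field names, and freshness threshold are hypothetical, and a real dataops setup would push these snapshots to a metrics store or audit log rather than print them.

```python
import datetime as dt
import pandas as pd

def profile_dataframe(df: pd.DataFrame, dataset_name: str) -> dict:
    """Capture a minimal observability snapshot for one pipeline stage."""
    return {
        "dataset": dataset_name,
        "captured_at": dt.datetime.utcnow().isoformat(),
        "row_count": len(df),
        "null_ratio": df.isna().mean().round(4).to_dict(),  # per-column null rate
    }

def is_fresh(last_loaded: dt.datetime, max_lag_hours: int = 24) -> bool:
    """Flag stale data so the team knows where something has gone wrong."""
    return (dt.datetime.utcnow() - last_loaded) <= dt.timedelta(hours=max_lag_hours)

# Hypothetical staging table with one missing value.
orders = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, None, 7.5]})
print(profile_dataframe(orders, "orders_staging"))
print(is_fresh(dt.datetime.utcnow() - dt.timedelta(hours=30)))  # False: stale
```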


The Role of Human Resources in Cybersecurity

Developing an effective cybersecurity awareness training program requires a balance between providing enough information to be useful and not overwhelming. Human resources’ expertise with employees through the years is an invaluable resource for creating cybersecurity training programs that are engaging and frequent (but not too frequent). The CIO, on the other hand, is an essential partner in training employees on cybersecurity. The CIO’s role is to work with the human resources department to ensure their technology needs are met and help guide them to more effective solutions. The CIO is also a partner for employee recruitment, hiring and retention, especially for IT and security professionals. The CIO can affect organizational change by partnering with human resources and IT to develop an integrated cybersecurity awareness training program for employees of all technical proficiencies. Building upon HR’s close connection with every employee, the CIO can lead the way in building a culture of cybersecurity.



Quote for the day:

"Leadership is the wise use of power. Power is the capacity to translate intention into reality and sustain it." -- Warren Bennis

Daily Tech Digest - March 28, 2023

Predictive network technology promises to find and fix problems faster.

The emerging field of neuromorphic computing, based on a chip architecture that's engineered to mimic human brain structure, promises to provide highly effective ML on edge devices. "Predictive network technology is so powerful because of its ability to intake signals and make accurate predictions about equipment failures to optimize maintenance," says Gil Dror, CTO at monitoring technology provider SmartSense. He says that neuromorphic computing will become even more powerful when it moves from predictive to prescriptive analytics, which recommends what should be done to ensure future outcomes. Neuromorphic computing's chip architecture is geared toward making intelligent decisions on edge devices themselves, Dror says. "The combination of these two technologies will make the field of predictive network technology much more powerful," he says. Organizations including IBM, Intel, and Qualcomm are developing neuromorphic computing technologies. 


What Wasm Needs to Reach the Edge

As of today, WASM is very much present in the browser. It is also rapidly being used for backend server applications. And yet, much work needs to be done as far as getting to the stage where applications can reach the edge. The developer probably does not care that much — they just want their applications to run well and securely wherever they are accessed, without wondering so much about why edge is not ready yet but when it will be. Indeed, the developer might want to design one app deployed through a WebAssembly module that will be distributed across a wide variety of edge devices. Unlike years past when designing an application for a particular device could require a significant amount of time to reinvent the wheel for each device type, one of the beautiful things about WASM — once standardization is in place — is for the developer to create a voice-transcription application that can run not only on a smartphone or PC but in a minuscule edge device that can be hidden in a secret agent’s clothing during a mission.


5 hard questions every IT leader must answer

Most of the voluminous academic literature on leadership focuses on the traits/idiosyncrasies of the individual leader and not on their relationships with key associates. As an IT leader, do you have a track record of helping or hindering colleagues in fulfilling their career objectives? Vince Kellen, a digital force of nature and CIO at University of California San Diego, borrows insights from NHL scouts. He is looking for IT “skaters” who, when they step onto the ice, make the other four teammates better hockey players. How leaders view themselves and others and how they are viewed by others is a critical causal driver of leadership success or failure. Tony Blair was able to reverse a multi-decade decline in Labour Party electoral success when he realized, “People judge us on their instincts about what they believe our instincts to be. And that man polishing his car was clear: His instincts were to get on in life, and he thought our instincts were to stop him.” 


KPIs for a Chief Information Security Officer (CISO)

Many might think the finances of a company would be the sole responsibility of the chief financial officer and their team. However, the CISO is also responsible for returns on any investments in information security. This is a crucial benchmark for a CISO. They’re responsible for the organization gaining value from new security technology investments and security policies while keeping costs down. They must also maintain a productive department — which in financial terms means valuable — and a training program worth investing in (CISO-Portal, 2021). While CISOs are responsible for security, they also must consider the financial impact on the business if a cyberattack occurs. An estimated recovery budget should be put in place to prepare for the potential financial impact of the attack. The actual cost should be equal to or less than the budgeted total and include direct costs, indirect costs, and possible fines (Castellan). One key metric CISOs can use to gauge security team effectiveness is IT security staff job satisfaction.


Microsoft Security Copilot harnesses AI to give superpowers to cybersecurity fighters

With Microsoft Security Copilot, defenders can respond to incidents within minutes, get critical step-by-step guidance through natural language-based investigations, catch what would otherwise go undetected, and get summaries of any process or event. Security professionals will be able to utilize the prompt bar to ask for summaries on vulnerabilities, incidents in the enterprise, and even more information on specific links and files. Using generative AI and both internal and external organizational information, Copilot generates a response with reference to sources. Like most AI models, it won't always perform perfectly and it can make mistakes. However, Security Copilot works in a closed-loop learning system that includes a built-in tool for users to directly provide feedback. And while at launch it will incorporate Microsoft's security products, the company claims that over time it will "expand to a growing ecosystem of third-party products" as well. 


Plugging the cybersecurity skills gap to retain security professionals

Worryingly (but entirely unsurprisingly), any organisation facing a cyber skills gap is much more susceptible to breaches. Indeed, industry body ISACA found that 69% of those organisations that have suffered a cyber-attack in the past year were somewhat or significantly understaffed. What truly compounds these concerns, however, is the potential impact that breaches can have. According to IBM’s Cost of a Data Breach Report 2022, the average total cost of a data breach is now $4.35 million. This combination of statistics is undoubtedly anxiety-inducing. However, attacks aren’t a lost cause or an inevitability that simply can’t be prevented. ... It should be noted that, at least in most cases, organisations are not doing this to eliminate the need for cybersecurity workers altogether. Artificial intelligence is nowhere near the level of sophistication required to achieve this in a security context. And really, human input is likely to always be required, at least in some capacity.


Manufacturing is the most targeted sector by cyberattacks. Here's why increased security matters

One of the manufacturing sector’s main struggles is having a fragmented approach to managing cyber-related issues. In the European Union, a new legislative proposal, the Cyber Resilience Act, is being discussed to introduce the mandatory cybersecurity requirements for hardware and software products throughout their lifecycle. Moreover, the new NIS 2 and Critical Entities Resilience (CER) directives classify certain manufacturing industries as important or “essential entities,” requiring them to manage their security risks and prevent or minimize the impact of incidents on recipients of their services. In the United States, various federal regulations have been imposed on specific sectors like water, transportation and pipelines and a national cybersecurity strategy was recently released. The International Electrotechnical Commission’s IEC 62443 is considered by many to be the primary cybersecurity standard for industrial control systems but it is complex. 


Tony McCandless – The role of generative AI in intelligent automation

Firstly, there are a lot of financial services companies that are at the forefront of customer experience, to some degree, because they’ve got particular products to sell that lend themselves well to AI. They can implement capabilities like data analytics in order to know whether a customer is likely to buy a certain product. And then, they can reach out to companies like us and utilise a choice of generative AI-powered scenarios. This looks set to continue evolving. I think as well, in areas like citizen services — certainly from a UK perspective — most councils are really cash strapped, and are having to make critical decisions about service provision to citizens. There is also a digital access gap that we have to focus on closing. While some councils are proving good at addressing this, others potentially need a bit more investment, and collaboration. We’ve got 10 grandkids, and you should see a couple of the younger ones with technology — their ability to pick up a tablet, without knowing what a keyboard is, is just mind blowing.


Q&A: Cisco CIO Fletcher Previn on the challenges of a hybrid workplace

Our policy around hybrid work is that we want the office to be a magnet and not a mandate. In all likelihood, the role of the office is for most people not going to be a place where you go eight hours a day to do work. It’s going to be a place where we occasionally gather for some purpose. And, so as a result, we’re not mandating any particular prescription for how many days people should be in the office. It’s totally based on the type of work teams do, how collaborative that work needs to be, does it really benefit from people being together, or is it really individual work. And that’s really best determined at the individual team level rather than by any sort of arbitrary formula. The value of being in the office is proportionate to the number of other people who are also in the office at the same time you’re there. So, these things tend to be more about gathering for a team meeting, a client briefing, a white boarding session and the like. When everybody was remote, it was a great equalizer because everyone was on a similar footing.


Pursuing Nontraditional IT Candidates: Methods to Expand Talent Pipelines

Felicia Lyon, principal, human capital advisory for KPMG, says developing a strategy for nontraditional hires should start with leadership setting forth a vision for talent that is inclusive and skill-based. “Execution of that strategy will require involvement from stakeholders that span the entire organization,” she explains. “Business stakeholders should work closely with HR to identify roles that will be a good fit.” She adds that while there is a tendency to start small via pilot programs, research has shown that cohort programs are more efficient. “Companies should also look to external partners like apprenticeship programs and community colleges who can help them build a capability around successfully supporting and developing non-traditional talent,” Lyon says. Watson explains Clio uses many overlapping programs to widen the net of candidates in technical roles. “Our talent acquisition team helps identify opportunities to recruit non-traditional areas of available talent,” he says. 



Quote for the day:

"If I have seen farther than others, it is because I was standing on the shoulder of giants." -- Isaac Newton

Daily Tech Digest - March 27, 2023

Primary Reasons That Deteriorate Digital Transformation Initiatives

A flawed organizational culture can readily derail transformation initiatives. Adopting better cultural changes is as essential as digital transformation itself. Hence, businesses must embrace the cultural changes that digital transformation demands. IT initiatives require changes to products and internal processes, and better engagement with customers. As per a recent report by TEKsystems, “State of Digital Transformation,” 46% of businesses believe digital transformations enhance customer experience and engagement. To achieve this, involving teams from various departments is an excellent way to work together more coherently. Not having a collaborative culture across the enterprise can be a significant reason for digital transformation failures. Establishing a change management process is recommended to bring about the needed cultural change. This process can help identify people actively resistant to change and, through adequate training and education, help them adopt the cultural change quickly.


Why data leaders struggle to produce strategic results

The top impediment? Skills and staff shortages. One in six (17%) survey respondents said talent was their biggest issue, while 39% listed it among their top three. And the tight talent pool isn’t helping, Medeiros says. “CDAOs must have a talent strategy that doesn’t count on hiring data and analytics talent ready-made.” To counter this, CDAOs need to build a robust talent management strategy that includes education, training, and coaching for data-driven culture and data literacy, Medeiros says. That strategy must apply not only to the core data and analytics team but also the broader business and technology communities in the organization. ... Strategic missteps in realizing data goals may signal an organizational issue at the C-level, with company leaders recognizing the importance of data and analytics but falling short on making the strategic changes and investments necessary for success. According to a 2022 study from Alation and Wakefield Research, 71% of data leaders said they were “less than very confident” that their company’s leadership sees a link between investing in data and analytics and staying ahead of the competition.


A Roadmap For Transitioning Into Cybersecurity

The best way to discover information is to search for specific points related to the categories above. For example, if I'm looking at SQL injection vulnerabilities, I would specifically input that into Google and try to learn as much as I can about SQL injection. I don't recommend relying on just one resource to learn everything. This is where you need to venture out on your own and do some research. I can only provide examples of what good material looks like. In the beginning, your approach will likely be more theoretical, but I firmly believe that the most effective way to learn is through practical experience. Therefore, you should aim to engage in hands-on activities as much as possible. ... Writing blog posts helps you solidify your understanding of a topic, improve your communication skills, and build an online portfolio that showcases your expertise. Opinion: Producing blog posts is an excellent way to engage with the community, share your knowledge, and give back. Plus, it helps you establish a personal brand and network with like-minded professionals.


What Are Microservices Design Patterns?

Decomposition patterns are used to break down large or small applications into smaller services. You can break down the program based on business capabilities, transactions, or sub-domains. If you want to break it down using business capabilities, you will first have to evaluate the nature of the enterprise. As an example, a tech company could have capabilities like sales and accounting, and each of these capabilities can be considered a service. Decomposing an application by business capabilities may be challenging because of God classes. To solve this problem, you can break down the app using sub-domains. ... One observability pattern to discuss is log aggregation. This pattern enables clients to use a centralized logging service to aggregate logs from every service instance. Users can also set alerts for specific texts that appear in the logs. This system is essential since requests often span several service instances. The third aspect of observability patterns is distributed tracing. This is essential since microservice architecture requests cut across different services. This makes it hard to trace end-to-end requests when finding the root causes of certain issues.
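
To make the log aggregation pattern concrete, here is a minimal sketch in Python. The "orders" and "payments" services and the in-memory aggregator are stand-ins; a real deployment would forward records to a centralized logging service, but the idea of tagging every record with a shared correlation ID so a request can be traced across instances is the same.

```python
import json
import logging
import uuid

# Stand-in for a centralized logging service.
AGGREGATED_LOGS: list[dict] = []

class AggregatingHandler(logging.Handler):
    """Forward every service's log records to one central store."""
    def emit(self, record: logging.LogRecord) -> None:
        AGGREGATED_LOGS.append({
            "service": record.name,
            "level": record.levelname,
            "message": record.getMessage(),
            "correlation_id": getattr(record, "correlation_id", None),
        })

def get_service_logger(service_name: str) -> logging.Logger:
    logger = logging.getLogger(service_name)
    logger.setLevel(logging.INFO)
    if not logger.handlers:                 # avoid adding duplicate handlers
        logger.addHandler(AggregatingHandler())
    return logger

# Two services handling the same request share a correlation ID,
# so the end-to-end flow shows up together in the aggregated view.
correlation_id = str(uuid.uuid4())
get_service_logger("orders").info("order received", extra={"correlation_id": correlation_id})
get_service_logger("payments").info("payment authorized", extra={"correlation_id": correlation_id})
print(json.dumps(AGGREGATED_LOGS, indent=2))
```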


A New Field of Computing Powered by Human Brain Cells: “Organoid Intelligence”

It might take decades before organoid intelligence can power a system as smart as a mouse, Hartung said. But by scaling up production of brain organoids and training them with artificial intelligence, he foresees a future where biocomputers support superior computing speed, processing power, data efficiency, and storage capabilities. ... Organoid intelligence could also revolutionize drug testing research for neurodevelopmental disorders and neurodegeneration, said Lena Smirnova, a Johns Hopkins assistant professor of environmental health and engineering who co-leads the investigations. “We want to compare brain organoids from typically developed donors versus brain organoids from donors with autism,” Smirnova said. “The tools we are developing towards biological computing are the same tools that will allow us to understand changes in neuronal networks specific for autism, without having to use animals or to access patients, so we can understand the underlying mechanisms of why patients have these cognition issues and impairments.”


'Critical gap': How can companies tackle the cybersecurity talent shortage?

The demand for cybersecurity is on the rise with no signs of it slowing down anytime soon. The cybersecurity talent shortage is a challenge, but that doesn't mean it has to be a problem. Companies today are taking critical steps to bridge this gap through innovative ways. It is imperative to deploy skilled data security professionals who can focus on critical thinking and innovation, allowing the automated bots to take over the tedious, repetitive tasks. With this, companies can predict and be in front of even the most sophisticated cyber-attacks without them having to hire more manpower to account for them. ... Cybersecurity is critical to the economy and across industries. This field is a good fit for professionals looking to solve complex problems and navigate the different aspects of client requirements. The first step to building a career in data security is by entering the tech workforce. Pursuing an associate degree, bachelor’s degree, or online cybersecurity degree should create a smooth gateway to the sector.


The Economics, Value and Service of Testing

Clearly writing tests is additional to getting the code correct from the start, right? If we could somehow guarantee getting the code correct from the start, you could argue we wouldn’t need the written tests. But, then again, if we somehow had that kind of guarantee, performing tests wouldn’t matter much either. The trick is that all testing is in response to imperfection. We can’t get the code “correct” the first time. And even if we could, we wouldn’t actually know we did until that code was delivered to users for feedback. And even then we have to allow for the idea that notions of quality can be not just objective, but subjective. ... When people make arguments against written tests, they are making (in part) an economical argument. But so are those who are making a case for written tests. When framed this way, people can have fruitful discussions about what is and isn’t economical but backed up by judgment. Judgment, in this context, is all about assigning value. Economists see judgment as abilities related to determining some payoff or reward or profit. Or, to use the term I used earlier, a utility.


Operation vs. innovation: 3 tips for a win-win

One of the biggest innovation inhibitors is the organizational silo. When IT and business teams don’t collaborate or communicate, IT spins its wheels on projects that aren’t truly aligned with business goals. When IT teams are simply trying to “keep the lights on,” a lack of alignment between business leaders and IT (who may not fully understand how projects support business objectives) is a recipe for failure. On the technology side, innovations like low-code and no-code applications are helping to bridge the technical and tactical gap between business and IT teams. With no-code solutions, business users can build apps and manage and change internal workflows and tasks without tapping into IT resources. IT governance and guardrails over these solutions are important, but they free up time for IT and software teams to work on higher-level innovation. ... Internal IT teams are often equipped with and experienced in maintaining operational excellence. Depending on the company, these internal teams may also excel at building new solutions and applications from the ground up.


Cloud Skills Gap a Challenge for Financial Institutions

“While the cloud seems like a very simple technology, that’s not the case,” he says. “Not knowing the cloud default configurations and countermeasures that should be taken against it might keep your application wide open.” Siksik adds that changes in the cloud also happen quite frequently and with less level of control over changes than the bank normally has. “The cloud is open to many developers and DevOps, which could push a change without the proper change process, as things are more dynamic,” he explains. “This mindset is new to banks, where normally you will have strict and long change processes.” James McQuiggan, security awareness advocate at KnowBe4, explains cloud architects design and oversee the bank's cloud infrastructure implementation and need experience in cloud computing platforms and knowledge of network architecture, security, and compliance. “The security specialists are to ensure the bank's cloud environment is secure and compliant with any applicable regulatory requirements,” he adds.


The era of passive cybersecurity awareness training is over

Justifying the need for cybersecurity investment to the executive team may be challenging for tech leaders. Compared to other business functions, the return from investing in IT security can be less apparent to executives. However, the importance of investing in a strong security posture becomes more evident when compared to the damage from data breaches and ransomware attacks. By highlighting savings in terms of improved quality of execution of cybersecurity policies and improved IT productivity through automation, it becomes easier to articulate the value of cybersecurity initiatives to the executive team. Modern social engineering attacks often use a combination of communication channels such as email, phone calls, SMS, and messengers. With the recent theft of terabytes of data, attackers are increasingly using this information to personalize their messaging and pose as trusted organizations. In this context, organizations can no longer rely on a passive approach to cybersecurity awareness training.



Quote for the day:

"Leadership is being the first egg in the omelet." -- Jarod Kintz

Daily Tech Digest - March 26, 2023

What Is Decentralized Identity?

Decentralized identities are not hosted on centralized servers by big entities such as Google or Meta Platforms (the former Facebook). Instead, they are often hosted on decentralized file-sharing platforms, such as the InterPlanetary File System (IPFS). These open-source protocols store data on decentralized networks that are difficult to shut down and give users ownership over their online data. In addition, decentralized identities only share information with other parties when and if they choose. This means that, unlike centralized identities, personal data cannot be stored or shared without the user's knowledge or consent. According to the Ethereum Foundation, decentralized identities can be used for many things, such as a universal login to reduce the need for separate usernames and passwords, as a way to bypass know-your-customer (KYC) measures and to create online communities that are free of bots and fake accounts.


Why we need to care about responsible AI in the age of the algorithm

The rapid pace of AI development does not appear to be slowing down. Breakthroughs come fast – quickly outpacing the speed of regulation. In the past year alone, we have seen a range of developments, from deep learning models that generate images from text, to large language models capable of answering any question you can think of. Although the progress is impressive, keeping pace with the potential harms of each new breakthrough can pose a relentless challenge. The trouble is that many companies cannot even see that they have a problem to begin with, according to a report released by MIT Sloan Management Review and Boston Consulting Group. ... Responsible AI is more than a check box exercise or the development of an add-on feature. Organizations will need to make substantial structural changes in anticipation of AI implementation to ensure that their automated systems operate within legal, internal and ethical boundaries.


Uncovering new opportunities with edge AI

Edge AI and edge ML present unique and complex challenges that require the careful orchestration and involvement of many stakeholders with a wide range of expertise from systems integration, design, operations and logistics to embedded, data, IT and ML engineering. Edge AI implies that algorithms must run in some kind of purpose-specific hardware ranging from gateways or on-prem servers on the high end to energy-harvesting sensors and MCUs on the low end. Ensuring the success of such products and applications requires that data and ML teams work closely with product and hardware teams to understand and consider each other’s needs, constraints and requirements. While the challenges of building a bespoke edge AI solution aren’t insurmountable, platforms for edge AI algorithm development exist that can help bridge the gap between the necessary teams, ensure higher levels of success in a shorter period of time, and validate where further investment should be made.


IT Automation vs. Orchestration: What's the Difference?

IT automation refers to the use of technology to automate tasks and processes that would otherwise be done by someone on your team. This includes everything from communication to security tasks. Today, the appeal of this automation is greater than it has ever been in the corporate world. One study shows that more than 30% of organizations have five or more departments that automate tasks. ... Orchestration is about coordinating tasks and processes into workflows. Orchestration is the process of automating and managing the end-to-end flow of IT services, from initial requests to final delivery. This can include everything from provisioning new servers to deploying applications and monitoring performance. The benefits of orchestration are similar to those of IT automation but they extend beyond simple task execution. They enable organizations to coordinate and manage complex workflows across multiple systems, tools, and teams. This improves efficiency and reduces the chance of errors on a larger scale.
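
A minimal sketch of the distinction, with hypothetical tasks: each function below automates a single job, while the orchestration function coordinates those tasks into one workflow with ordering and failure handling.

```python
# --- Automation: each task does one job without human intervention ---
def provision_server() -> str:
    return "server-42"      # placeholder for an API call that provisions a VM

def deploy_application(server: str) -> bool:
    return bool(server)     # placeholder for a deployment step

def run_health_check(server: str) -> bool:
    return True             # placeholder for a monitoring probe

# --- Orchestration: coordinate the automated tasks into a workflow ---
def orchestrate_release() -> None:
    server = provision_server()
    if not deploy_application(server):
        raise RuntimeError(f"deployment to {server} failed; stopping the workflow")
    if not run_health_check(server):
        raise RuntimeError(f"{server} failed its health check; rolling back")
    print(f"release completed on {server}")

orchestrate_release()
```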


Critical flaw in AI testing framework MLflow can lead to server and data compromise

MLflow is written in Python and is designed to automate machine-learning workflows. It has multiple components that allow users to deploy models from various ML libraries; manage their lifecycle including model versioning, stage transitions and annotations; track experiments to record and compare parameters and results; and even package ML code in a reproducible form to share with other data scientists. MLflow can be controlled through a REST API and command-line interface. All these capabilities make the framework a valuable tool for any organization experimenting with machine learning. Scans using the Shodan search engine reinforce this, showing a steady increase of publicly exposed MLflow instances over the past two years, with the current count sitting at over 800. However, it's safe to assume that many more MLflow deployments exist inside internal networks and could be reachable by attackers who gain access to those networks.
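
As a rough illustration of why exposure matters, the sketch below uses the MLflow Python client to list every experiment a tracking server will reveal; anyone who can reach the server's address can do the same. The URL is a placeholder, and mlflow.search_experiments assumes an MLflow 2.x client (older releases expose similar calls through MlflowClient). Keeping tracking servers off the public internet and behind authentication is the obvious first mitigation.

```python
import mlflow

# Placeholder address for an internal tracking server.
mlflow.set_tracking_uri("http://mlflow.internal.example:5000")

# List the experiments the server exposes (MLflow 2.x fluent API).
for experiment in mlflow.search_experiments():
    print(experiment.experiment_id, experiment.name)
```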


Can Security Keep Up With ChatGPT Evolutions?

As with most technological developments, there are two sides of the coin. ChatGPT may present businesses with a never-ending pool of opportunities, but the same resource is available to those with more malicious intent. While ChatGPT itself cannot be directly targeted by cybersecurity threats like malware, hacking or phishing, it can be exploited to help criminals infiltrate systems more effectively. The platform’s developers have taken steps to try to reduce this as much as possible, but it takes just one attacker to word their question in the right way to get the desired response. The best example here is phishing. Asking the platform to generate a phishing template directly will result in the chatbot refusing. However, if someone with malicious intent rewrote their question ever so slightly, the AI won’t detect any issue. For example, if you ask it to create a ‘gophish’ template, it will comply. The advanced capabilities of ChatGPT throws up several red flags for security teams, but it isn’t time to hit the doomsday button just yet.


Creating Strong ROI for Multi-Cloud Solutions Through Compliance & Security

When it comes to budget, storing data in the cloud eliminates the need to pay upfront for physical hardware and services. Predictable subscription fees without capital expenses mean organizations can lower their overall costs and invest the savings in other areas that drive innovation. Take, for example, a healthcare organization that moves its critical on-premises infrastructure into the cloud. In doing so, the organization immediately saves enough on its capital expense budget to add much-needed additional healthcare staff ready to serve patients. With regard to gaining intelligence, the data that can be gathered in a single or multi-cloud environment makes it infinitely easier to analyze and gain actionable insights that would otherwise be unavailable. This level of data-driven analytics and intelligence is powerful as it can be directly applied to customer service and operational performance improvements. Multi-cloud solutions also make scaling up and down to meet demand extremely simple and efficient.


6 Myths About Leadership That May Be Holding You Back

While it is true that leaders often hold positions of authority and are responsible for making important decisions, leadership is not limited to those in formal leadership positions. Leadership can be demonstrated by anyone who takes the initiative, inspires others and creates positive change, regardless of their official role or title. Some of the most influential leaders do not hold formal leadership positions but still manage to influence others and make a difference. ... True leaders often face uncertain and unpredictable situations and may only sometimes have all the answers. In these situations, it's natural for a leader to feel some degree of uncertainty or doubt. The key difference between a leader and someone who appears confident is that a leader can acknowledge their limitations and vulnerabilities while still maintaining their focus and determination. They are not afraid to ask for help or admit when they don't know something. Leaders who are open and honest about their struggles can inspire greater trust and respect from their team. 


API Gateways: The Doorway to a Microservices World

While microservices are beneficial, they also create significant new challenges. These challenges include: Increased complexity: A microservices architecture introduces additional complexity, as each service needs to communicate with other services through well-defined interfaces. This can result in increased development and management overhead, as well as challenges with testing and debugging. Distributed systems management: A microservices architecture is a distributed system, which means it can be challenging to monitor and manage individual services, especially when there are multiple instances of the same service running in different environments. Data consistency: Maintaining data consistency across multiple services can be challenging, as changes to one service can impact other services that depend on that data. This requires careful planning and management to ensure that data remains consistent and up-to-date across the system.
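
As a sketch of the gateway itself, the routing table below shows the core idea: a single entry point that forwards each request path to the microservice that owns it. The service names and upstream URLs are hypothetical, and a production gateway would also handle authentication, rate limiting, and retries.

```python
# Hypothetical routing table mapping path prefixes to downstream services.
ROUTES = {
    "/orders": "http://orders-service:8080",
    "/payments": "http://payments-service:8080",
    "/users": "http://users-service:8080",
}

def route_request(path: str) -> str:
    """Return the downstream URL for an incoming request path."""
    for prefix, upstream in ROUTES.items():
        if path.startswith(prefix):
            return upstream + path
    raise LookupError(f"no service registered for {path}")

print(route_request("/orders/1234"))  # -> http://orders-service:8080/orders/1234
```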


An open data lakehouse will maintain and grow the value of your data

So here’s how to take advantage of all the data flowing through your organization’s digital transformation pipelines and bring together open-source systems and the cloud to maximize the utility of the data. Use an open data lakehouse designed to meld the best of data warehouses with the best of data lakes. That means storage for any data type, suitable for both data analytics and ML workloads, cost-effective, fast, flexible and with a governance or management layer that provides the reliability, consistency and security needed for enterprise operations. Keeping it “open” (using open-source technologies and standards like PrestoDB, Parquet and Apache HUDI) not only saves money on license costs, but also gives your organization the reassurance that the technology that backs these critical systems is being continuously developed by companies that use it in production and at scale. And as technology advances, so will your infrastructure. Remember, you’ve already invested mightily in data transformation initiatives to remain competitively nimble and power your long-term success.



Quote for the day:

"Leadership matters more in times of uncertainty." -- Wayde Goodall

Daily Tech Digest - March 25, 2023

The Speed Layer Design Pattern for Analytics

In a modern data architecture, speed layers combine batch and real-time processing methods to handle large and fast-moving data sets. The speed layer fills the gap between traditional data warehouses or lakes and streaming tools. It is designed to handle high-velocity data streams that are generated continuously and require immediate processing within the context of integrated historical data to extract insights and drive real-time decision-making. A “speed layer” is an architectural pattern that combines real-time processing with the contextual and historical data of a data warehouse or lake. A speed layer architecture acts as a bridge between data in motion and data at rest, providing a unified view of both real-time and historical data. ... The speed layer must provide a way to query and analyze real-time data in real time, typically using new breakthroughs in query acceleration such as vectorization. In a vectorized query engine, data is stored in fixed size blocks called vectors, and query operations are performed on these vectors in parallel, rather than on individual data elements.
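
To illustrate why vectorized execution matters for a speed layer, here is a toy comparison in Python with NumPy, using a made-up "total revenue above a price threshold" query. A real engine evaluates fixed-size blocks of column values rather than NumPy arrays, but the contrast between row-at-a-time and block-at-a-time evaluation is the same.

```python
import numpy as np

# Synthetic order data for the toy query.
rng = np.random.default_rng(0)
prices = rng.uniform(1, 100, size=100_000)
quantities = rng.integers(1, 10, size=100_000)

def total_row_at_a_time(threshold: float) -> float:
    """Evaluate the query one row at a time, as a naive engine would."""
    total = 0.0
    for price, qty in zip(prices, quantities):
        if price > threshold:
            total += price * qty
    return total

def total_vectorized(threshold: float) -> float:
    """Evaluate the same query over whole blocks of values at once."""
    mask = prices > threshold
    return float(np.sum(prices[mask] * quantities[mask]))

assert np.isclose(total_row_at_a_time(50.0), total_vectorized(50.0))
print(total_vectorized(50.0))
```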


7 steps for implementing security automation in your IT architecture

Security automation is often driven by the need to align with various industry regulations, best practices, and guidelines, as well as internal company policies and procedures. Those requirements, combined with constraints on the human resources available to accomplish them, make automation in this space critical to success. ... NIST defines a vulnerability as a "weakness in an information system, system security procedures, internal controls, or implementation that could be exploited or triggered by a threat source." Vulnerability scanning is the process of leveraging automated tools to uncover potential security issues within a given system, product, application, or network. ... Compliance scanning is the process of leveraging automated tools to uncover misalignment with internal and external compliance requirements. The purpose of compliance scanning is to identify and highlight gaps between legal requirements, industry guidance, and internal policies and the actual implementation within the given entity.
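
As a simple illustration of the compliance-scanning idea, the sketch below compares a host's file permissions against a policy baseline and reports the gaps. The paths and baseline values are hypothetical; real scanners such as OpenSCAP implement far richer policy catalogs.

    # Minimal compliance-scan sketch: compare file permissions on a host against a
    # policy baseline and report the gaps. Paths and baseline modes are hypothetical.
    import os
    import stat

    BASELINE = {
        "/etc/ssh/sshd_config": 0o600,   # assumed baseline: owner read/write only
        "/etc/shadow": 0o640,
    }

    def scan(baseline):
        findings = []
        for path, allowed_mode in baseline.items():
            if not os.path.exists(path):
                findings.append((path, "file missing"))
                continue
            mode = stat.S_IMODE(os.stat(path).st_mode)
            extra_bits = mode & ~allowed_mode    # permission bits beyond the baseline
            if extra_bits:
                findings.append((path, f"mode {oct(mode)} exceeds baseline {oct(allowed_mode)}"))
        return findings

    for path, issue in scan(BASELINE):
        print(f"NON-COMPLIANT: {path}: {issue}")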


What an IT career will look like in 5 years

“We will see AI usage increase in software development and testing functions shifting the role of these employees” toward higher-level, personal-touch tasks, Huffman says. ... “An augmented workforce experience — across recruiting, productivity, learning, and more — will certainly be something to watch, as the level of trust that we will likely put in our AI colleagues may be surprising,” Bechtel says. “High confidence that AI is delivering the right analytics and insights will be paramount. To build trust, AI algorithms must be visible, auditable, and explainable, and workers must be involved in AI design and output. Organizations are realizing that competitive gains will best be achieved when there is trust in this technology.” Moreover, increased reliance on AI for IT support and development work such as entry-level coding, as well as cloud and system administration, will put pressure on IT pros to up their skills in more challenging areas, says Michael Gibbs, CEO and founder of Go Cloud Careers.


Use zero-trust data management to better protect backups

Trust nothing, verify everything. "The principle is to never assume any access request is trustworthy. Never trust, always verify," said Johnny Yu, a research manager at IDC. "Applying [that principle] to data management would mean treating every request to migrate, delete or overwrite data as untrustworthy by default. Applying zero-trust in data management means having practices or technology in place that verify these requests are genuine and authorized before carrying out the request." Data backup software can potentially be accessed by bad actors looking to delete backup data or alter data retention settings. Zero-trust practices use multifactor authentication or role-based access control to help prevent stolen admin credentials or rogue employees from exploiting data backup software. "Zero-trust strategies remove the implicit trust assumptions of castle-and-moat architectures -- meaning that anyone inside the moat is trusted," said Jack Poller, a senior analyst at Enterprise Strategy Group. 
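
A minimal sketch of what "never trust, always verify" can look like in front of a destructive backup operation follows. The role names and the MFA flag are placeholders for whatever identity provider and policy engine an organization actually uses.

    # Zero-trust sketch: a destructive backup request is denied by default and only
    # carried out after role and MFA checks. Roles and the MFA flag are placeholders
    # for a real identity provider and policy engine.
    from dataclasses import dataclass

    ROLES_ALLOWED_TO_DELETE = {"backup-admin"}   # assumed RBAC policy

    @dataclass
    class BackupRequest:
        user: str
        role: str
        action: str        # e.g. "delete_backup" or "change_retention"
        mfa_passed: bool   # outcome reported by the organization's MFA provider

    def authorize(req: BackupRequest) -> bool:
        # Default deny: no request is trusted just because it came from "inside".
        if req.action not in {"delete_backup", "change_retention"}:
            return False
        if req.role not in ROLES_ALLOWED_TO_DELETE:
            return False
        return req.mfa_passed

    print(authorize(BackupRequest("alice", "backup-admin", "delete_backup", True)))   # True
    print(authorize(BackupRequest("mallory", "helpdesk", "delete_backup", True)))     # False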


Improving CI/CD Pipelines Through Observability

Overall, observability in a CI pipeline is essential for maintaining the reliability and efficiency of the pipeline and allows developers to quickly identify and resolve any issues that may arise. It can be achieved by using a combination of monitoring, logging, and tracing tools, which can provide real-time visibility into the pipeline and assist with troubleshooting and root cause analysis. In addition to the above, you can use observability tools such as Application Performance Management (APM) solutions like New Relic or Datadog. APMs provide end-to-end visibility of the entire application and infrastructure, which in turn gives you the ability to identify bottlenecks, performance issues, and errors in the pipeline. It is important to note that observability should be integrated throughout the pipeline, from development to production, to ensure that any issues can be identified and resolved quickly and effectively.
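
As a small illustration of the logging and timing side of this, independent of any particular APM product, each pipeline step below emits a structured record that a monitoring backend could ingest. The step name and command are made up.

    # Emit a structured, machine-readable record from each CI pipeline step so a
    # monitoring or APM backend can chart durations and failures. Step name and
    # command are illustrative.
    import json
    import subprocess
    import sys
    import time

    def run_step(name, cmd):
        start = time.time()
        proc = subprocess.run(cmd, capture_output=True, text=True)
        record = {
            "step": name,
            "duration_s": round(time.time() - start, 3),
            "exit_code": proc.returncode,
            "ok": proc.returncode == 0,
        }
        print(json.dumps(record))   # in a real pipeline, ship this to the log/trace backend
        return record["ok"]

    if __name__ == "__main__":
        ok = run_step("unit-tests", ["python", "-m", "pytest", "-q"])
        sys.exit(0 if ok else 1)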


Diffusion models can be contaminated with backdoors, study finds

Chen and his co-authors found that they could easily implant a backdoor in a pre-trained diffusion model with a bit of fine-tuning. With many pre-trained diffusion models available in online ML hubs, putting BadDiffusion to work is both practical and cost-effective. “In some cases, the fine-tuning attack can be successful by training 10 epochs on downstream tasks, which can be accomplished by a single GPU,” said Chen. “The attacker only needs to access a pre-trained model (publicly released checkpoint) and does not need access to the pre-training data.” Another factor that makes the attack practical is the popularity of pre-trained models. To cut costs, many developers prefer to use pre-trained diffusion models instead of training their own from scratch. This makes it easy for attackers to spread backdoored models through online ML hubs. “If the attacker uploads this model to the public, the users won’t be able to tell if a model has backdoors or not by simply inspecting their image generation quality,” said Chen.


What is generative AI and its use cases?

Anticipating the AI endgame is an exercise with no end. Imagine a world in which generative technologies link with other nascent innovations, quantum computing, for example. The result is a platform capable of collating and presenting the best collective ideas from human history, plus input from synthetic sources with infinite IQs, in any discipline and for any purpose, in a split second. The results will be presented with recommended action points; but perhaps further down the line the technology will just take care of these while you make a cup of tea. There are several hurdles to leap before this vision becomes reality; for example, dealing with bias and the role of contested opinions, answering the question of whether we really want this, plus, of course, ensuring the safety of humankind, but why not? In the meantime, Rachel Roumeliotis, VP of data and AI at O’Reilly, predicts a host of near-term advantages for large language models (LLMs). “Right now, we are seeing advancement in LLMs outpace how we can use it, as is sometimes the case with medicine, where we find something that works but don’t necessarily know exactly why.


Iowa to Enact New Data Privacy Law: The Outlook on State and Federal Legislation

The emergence of more data privacy legislation is likely to continue. “It brings the US closer in line with trends we are seeing throughout the world as we have over 160 countries with data protection laws today,” says Dominique Shelton Leipzig, partner, cybersecurity and data privacy at global law firm Mayer Brown. These laws have notable impacts on the companies subject to them and consumers. “For companies, comprehensive privacy laws like these enshrine the existing practices of the privacy profession into law. These laws clarify that our minimum standards for privacy are not just best practices, but legally enforceable by state attorneys general,” says Zweifel-Keegan. While these laws shine a light on data privacy, many critics argue against the “patchwork” approach of state-by-state legislation. “The continuation of the current state-by-state trend means companies are increasingly complying with a complex and evolving patchwork of regulatory requirements. 


Tesla Model 3 Hacked in Less Than 2 Minutes at Pwn2Own Contest

One of the exploits involved executing what is known as a time-of-check-to-time-of-use (TOCTTOU) attack on Tesla's Gateway energy management system. They showed how they could then — among other things — open the front trunk or door of a Tesla Model 3 while the car was in motion. The less-than-two-minute attack fetched the researchers a new Tesla Model 3 and a cash reward of $100,000. The Tesla vulnerabilities were among a total of 22 zero-day vulnerabilities that researchers from 10 countries uncovered during the first two days of the three-day Pwn2Own contest this week. In the second hack, Synacktiv researchers exploited a heap overflow vulnerability and an out-of-bounds write error in a Bluetooth chipset to break into Tesla's infotainment system and, from there, gain root access to other subsystems. The exploit garnered the researchers an even bigger $250,000 bounty and Pwn2Own's first-ever Tier 2 award — a designation the contest organizer reserves for particularly impactful vulnerabilities and exploits.
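
For readers unfamiliar with the bug class behind the first exploit, a time-of-check-to-time-of-use flaw is a race between validating a resource and acting on it. The toy sketch below has nothing to do with Tesla's code; it only shows the generic pattern and the usual fix of checking and using the same opened handle.

    # Generic TOCTOU illustration, unrelated to the Tesla exploit's specifics: the
    # resource checked at "time of check" can be swapped before "time of use".
    import os

    path = "/tmp/app-config.txt"   # hypothetical path

    # Racy pattern: between os.access() and open(), an attacker could replace the
    # file (for example with a symlink to something sensitive).
    data = None
    if os.access(path, os.R_OK):
        with open(path) as f:
            data = f.read()

    # Safer pattern: open once, then use the already-opened handle, so the check
    # and the use refer to the same underlying object.
    try:
        fd = os.open(path, os.O_RDONLY | getattr(os, "O_NOFOLLOW", 0))
        with os.fdopen(fd) as f:
            data = f.read()
    except OSError:
        data = None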


Leveraging the Power of Digital Twins in Medicine and Business

The digital twins that my team and I develop are high-fidelity, patient-specific virtual models of an individual’s vasculature. This digital representation allows us to use predictive physics-based simulations to assess potential responses to different physiological states or interventions. Clearly, it’s not feasible to try out five different stents in a specific patient surgically. Using a digital twin, however, doctors can test how various interventions would influence that patient and see the outcome before they ever step into the operating room. Patient-specific digital twins allow the doctors to interact with a digital replica of that patient’s coronary anatomy and fine-tune their approach before the intervention itself. The digital twin abstraction allows doctors to assess a wider range of potential scenarios and be more informed in their surgical planning process. Confirming accuracy is a critical component. In validating these models for different use cases, observational data must be measured and used to check the model predictions. 



Quote for the day:

"You don't lead by pointing and telling people some place to go. You lead by going to that place and making a case." -- Ken Kesey