Daily Tech Digest - March 21, 2023

CFO Priorities This Year: Rethinking the Finance Function

Marko Horvat, Gartner VP of research, adds that CFOs must transition away from optimization and start thinking about transformation. “Making things faster, more accurate, and with less effort has benefits, but each round of improvement brings diminishing returns,” he says. “CFOs must start thinking about ways to transform the function to build and enhance capabilities, such as advanced data and analytics, in order to truly unlock more value from the finance function.” Sehgal says CFOs should be asking questions such as: How do we create a futuristic vision for finance? Should short-term gains override longer-term benefits? And how do we fund digital transformation under current pressures? “CFOs are focused on elevating the role of finance in the organization to be a value integrator across the enterprise, as well as enhancing value through new strategies that not only support development but that also promote innovations for capital allocation,” he explains.


Build Software Supply Chain Trust with a DevSecOps Platform

When building an application, developers, platform operators and security professionals want to monitor vulnerabilities throughout the software supply chain. The challenge comes when multiple vulnerability scanners are used at different stages in the pipeline and different teams are notified and required to take action without proper coordination. A security-focused application platform can build in scan orchestration not only to detect vulnerabilities but also to map those findings to a workload. This feature allows developers to identify issues throughout the life cycle of their applications and helps them resolve those issues, shifting responsibility left with a higher degree of automation. Moreover, the platform can build trust with security analysts by showing the performance of application developers and helping them understand the risk that teams are facing. Once a platform detects these vulnerabilities, both at build time and at runtime, it needs to help developers triage and remediate them.
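A minimal sketch of what such scan orchestration might do: normalize findings from multiple scanners into a common shape and index them by the workload they affect, deduplicating vulnerabilities reported by more than one tool. The scanner feeds, field names, and workload names here are illustrative assumptions, not any particular platform's schema.

```python
from collections import defaultdict

def normalize(scanner_name, raw_findings, extract):
    """Convert one scanner's raw output into common finding dicts."""
    return [{"scanner": scanner_name, **extract(item)} for item in raw_findings]

def group_by_workload(findings):
    """Index findings by affected workload, keeping the highest-severity
    report when several scanners flag the same CVE."""
    by_workload = defaultdict(dict)
    for f in findings:
        existing = by_workload[f["workload"]].get(f["cve"])
        if existing is None or f["severity"] > existing["severity"]:
            by_workload[f["workload"]][f["cve"]] = f
    return by_workload

# Two scanners, two report shapes, one unified per-workload view.
trivy_raw = [{"target": "payments-api", "id": "CVE-2023-0001", "sev": 7.5}]
grype_raw = [{"app": "payments-api", "vuln": "CVE-2023-0001", "score": 9.1}]

findings = normalize(
    "trivy", trivy_raw,
    lambda r: {"workload": r["target"], "cve": r["id"], "severity": r["sev"]})
findings += normalize(
    "grype", grype_raw,
    lambda r: {"workload": r["app"], "cve": r["vuln"], "severity": r["score"]})

grouped = group_by_workload(findings)
# payments-api ends up with one deduplicated CVE at the higher severity.
```

With findings keyed by workload rather than by scanner, developers can be notified about their own services instead of sifting through every tool's raw output.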


Developers, unite! Join the fight for code quality

Writing good code is a craft as much as any other, and should be regarded as such. You have every right to advocate for an environment and an operational model that respect the intricacies of what you do and the significance of the outcome. It’s important to value, and feel valued for, what you do. And not just for your own immediate happiness—it’s also a long-term investment in your career. Making things you don’t think are any good tends to wear on the psyche, which doesn’t exactly feed into a more motivated workday. In fact, a study conducted by Oxford University’s Saïd Business School found that happy workers were 13% more productive. What’s good for your craft is ultimately best for business—a conclusion both engineers and their employers can feel good about. Software plays a big role at just about every level of society—it’s how we create and process information, access goods and services, and entertain ourselves. With the advent of software-defined vehicles, it even determines how we move between physical locations.


Why data literacy matters for business success

Aligning data strategies with overall business strategy and operations is no mean feat. Chief Data Officers (CDOs) are ideally placed to marry data analytics with the wider business, given their appreciation of informed decision making and their desire to foster a data culture in which internal information is properly managed and engaged with throughout the organisation. Moreover, their understanding of the technology landscape will assist when making platform and software selections. This stands to benefit all departments, which will gain access to the tools and skills needed to work with data and derive insights. CDOs also embody the “can do” approach to professional development, believing it’s possible to train employees in data-related skills regardless of their technical proficiency. There’s a well-established correlation between hiring a CDO and business success: Forrester research suggests that, among organisations harnessing analytics to improve operations, 89% of those that appointed a CDO to oversee the process have seen a positive business impact.


What the 'new automation' means for technology careers

AI is already playing a part in handling technology tasks. A survey released by OpsRamp finds more than 60% of companies adopting AIOps, which applies AI to monitor and improve IT operations themselves. The greatest IT operations challenge for enterprises in 2023 was automating as many operations as possible, cited by 66% of respondents. The main benefits of AIOps seen so far include a reduction in open incident tickets (65%), a reduction in mean time to detect or restore (56%), and automation of tedious tasks (52%). The latest IT staffing data from Janco Associates finds that recent layoffs affected data center and operations staff, with business leaders looking to automate IT processes and reporting. The apparent trend here is that those pursuing careers in technology need to look higher up the stack -- at applications and business consulting. However, there's still a lot of work for people working with the plumbing and code. Unfortunately, getting to automation-driven abstraction -- especially if it involves AI -- requires some manual work up front.


How Cybersecurity Delays Critical Infrastructure Modernization

For critical infrastructure organizations, building a security strategy that works from both an operational technology (OT) and consumer data perspective is not as straightforward as it is in many other industries. Safely storing this data while implementing the latest technology has proved to be a significant challenge across the sector, meaning the service provided by these companies is being hampered. These concerns have prevented a range of technologies from being integrated quickly or at all. These technologies include renewable energy projects, electric vehicle technology, natural disaster contingencies and moving towards smarter grid solutions to replace aging infrastructure. Older operational technology becomes difficult to update and secure sufficiently while the use of third-party software also reduces the level of control organizations have over their data. In addition to this, a lack of automation increases the chances of human error, which could present opportunities to cybercriminals.


What Are Foundation AI Models Exactly?

ChatGPT, OpenAI’s generative AI solution, can analyze input data against 175 billion parameters and demonstrate a deep understanding of written language. The smart tool can answer questions, summarize and translate text, produce articles on a given topic, write code, and much more. All you need to do is provide ChatGPT with the right prompts. OpenAI’s groundbreaking product is just one example of the foundation models that are transforming AI application development as we know it. Instead of training multiple models for separate use cases, you can now leverage a pre-trained AI solution to enhance or fully automate tasks across multiple departments and job functions. With foundation AI models like the one behind ChatGPT, companies no longer have to train algorithms from scratch for every task they want to enhance or automate. Instead, you only need to select a foundation model that best fits your use case – and fine-tune its performance for the specific objective you’d like to achieve.


As hiring freezes and layoffs hit, tech teams struggle to do more with less

There are a number of organizational hurdles holding back employees’ learning and development, Pluralsight found. For HR and L&D directors, budget constraints and costs were identified as the biggest barriers to upskilling (30%). This was also true for technology leaders, with 15% blaming financial constraints for getting in the way of employee upskilling. For technology workers themselves, finding time to invest in their own training was identified as the main issue: 42% of workers said they were too busy to upskill, with 18% saying their manager didn’t allow any time during the week to learn new skills. As a result, 21% of tech workers feel pressured to learn outside of work hours. ... However, the report added that giving employees time to invest in their training, address skills gaps and gain valuable growth opportunities is a key factor in retention. “Upskilling during work hours will hinder short-term productivity, and managers often bear the brunt of this stress. But don’t sacrifice long-term success for short-term productivity,” the report said.


CISA kicks off ransomware vulnerability pilot to help spot ransomware-exploitable flaws

CISA says it will seek out affected systems using existing services, data sources, technologies, and authorities, including CISA's Cyber Hygiene Vulnerability Scanning. CISA initiated the RVWP by notifying 93 organizations identified as running instances of Microsoft Exchange Server with a vulnerability called "ProxyNotShell," widely exploited by ransomware actors. The agency said this round demonstrated "the effectiveness of this model in enabling timely risk reduction as we further scale the RVWP to additional vulnerabilities and organizations." Eric Goldstein, executive assistant director for cybersecurity at CISA, said, "The RVWP will allow CISA to provide timely and actionable information that will directly reduce the prevalence of damaging ransomware incidents affecting American organizations. We encourage every organization to urgently mitigate vulnerabilities identified by this program and adopt strong security measures consistent with the U.S. government's guidance on StopRansomware.gov."


A Simple Framework for Architectural Decisions

Technology Radar captures techniques, platforms, tools, languages and frameworks, and their level of adoption across an organization. However, this may not cover every need. Establishing consistent practices for things that apply across different parts of the system can be helpful. For example, you might want to ensure all logging is done in the same format and with the same information included. Or, if you’re using a REST API, you might want to establish some conventions around how it should be designed and used, like what headers to use or how to name things. Additionally, if you’re using multiple similar technologies, it can be useful to offer guidance on when to use each one. Technology Standards define the rules for selecting and using technologies within your company. They ensure consistency across the organization and reduce the risk of adopting new technology in a suboptimal way.
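The logging example above can be sketched concretely: a company-wide convention might require every service to emit the same JSON fields, with a small checker usable in CI to enforce the standard. The field names (`service`, `trace_id`, etc.) are illustrative assumptions, not a real standard.

```python
import json
import time

# Illustrative company-wide logging standard: every log line is JSON
# and must carry at least these fields, whatever library produced it.
REQUIRED_FIELDS = {"timestamp", "service", "level", "message", "trace_id"}

def make_log_entry(service, level, message, trace_id, **extra):
    """Emit one log line that conforms to the standard."""
    entry = {
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "service": service,
        "level": level,
        "message": message,
        "trace_id": trace_id,
        **extra,
    }
    return json.dumps(entry, sort_keys=True)

def is_compliant(line):
    """Check a log line against the standard (e.g. in a CI lint step)."""
    try:
        entry = json.loads(line)
    except json.JSONDecodeError:
        return False
    return REQUIRED_FIELDS.issubset(entry)

line = make_log_entry("billing", "INFO", "invoice created", "abc-123")
```

Because the standard is executable, "drive consistency" stops being a slogan: any service whose output fails `is_compliant` can be flagged automatically.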



Quote for the day:

"Leadership is not about titles, positions, or flow charts. It is about one life influencing another." -- John C. Maxwell

Daily Tech Digest - March 20, 2023

The Rise of the BISO in Contemporary Cybersecurity

In general, “A BISO is assigned to provide security leadership for one particular business unit, group, or team within the greater organization,” explains Andrew Hay, COO at Lares Consulting. “Using a BISO divides responsibility in large companies, and we often see the BISOs reporting up to the central CISO for the organization.” “A BISO is responsible for establishing or implementing security policies and strategies within a line of business,” adds Timothy Morris, chief security advisor at Tanium. “Before the BISO role became popular, other director-level roles performed similar functions in larger organizations as an information security leader.” The precise role of the BISO varies from company to company depending on the needs of that company. “In some cases, the BISO will hold a senior position reporting directly to the CISO, CTO, or CIO,” explains Kurt Manske, managing principal for strategy, privacy, and risk at Coalfire. “At this level, the BISO acts as a liaison with business unit leaders and executives to promote a strong information security posture across the organization.”


CEO directives: Top 5 initiatives for IT leaders

Cybersecurity became a bigger issue this year for Josh Hamit, senior VP and CIO at Altra Federal Credit Union, due in part to Russia’s invasion of Ukraine, which touched off warnings about possible Russia-backed hackers stepping up cyberattacks on US targets. As a result, Hamit has brought extra attention to partnering with Altra’s CISO to perfect security fundamentals, cyber hygiene and best practices, and layered defenses. More likely cyber scenarios have IT leaders increasingly concerned as well. For instance, three out of four global businesses expect an email-borne attack will have serious consequences for their organization in the coming year, according to CSO Online’s State of Email Security report. Hybrid work has led to more email (82% of companies report a higher volume of email in 2022) and that has incentivized threat actors to steal data through a proliferation of social engineering attacks, shifting their focus from targeting the enterprise network itself to capitalizing on the vulnerable behaviors of individual employees.


Breach Roundup: Med Devices, Hospitals and a Death Registry

A vulnerability the Indian government at first said did not exist has now, it says, been fixed. The Indian Ministry of Railways in December denied that the data of 30 million people allegedly on sale on the dark net came from a hacker breaching Rail Yatri, the official app of Indian Railways. On Wednesday, Minister of State for Electronics and Information Technology Rajeev Chandrasekhar said the Indian Railway Catering and Tourism Corp. fixed the issue and took the necessary precautions to prevent its recurrence. Neither Rail Yatri nor the minister disclosed the penalty paid for the incident. ... A February data breach of U.S. Marshals Service systems, in which hackers maliciously encrypted systems and exfiltrated sensitive law enforcement data, has gotten worse. A threat actor is reportedly selling 350 gigabytes of data allegedly stolen from the servers for $150,000 on a Russian-speaking hacking forum. The data on sale allegedly includes "documents from file servers and work computers from 2021 to February 2023, without flooding like exe files and libraries," reported Bleeping Computer.


BianLian ransomware group shifts focus to extortion

Researchers observed that the speed at which BianLian posts the masked details has also increased over time. If one is to accept the date of compromise listed by BianLian as accurate, the group averages just ten days from an initial compromise to ratcheting up the pressure on a victim by posting masked details. In some instances, BianLian appears to have posted masked details within 48 hours of a compromise, Redacted said in its report. “With this shift in tactics, a more reliable leak site, and an increase in the speed of leaking victim data, it appears that the previous underlying issues of BianLian’s inability to run the business side of a ransomware campaign appear to have been addressed,” Redacted said, adding that these improvements are likely the result of gaining more experience through their successful compromise of victim organizations. The BianLian group appears to bring close to 30 new command-and-control (C2) servers online each month. In the first half of March, the group has already brought 11 new C2 servers online. The average lifespan of a server is approximately two weeks, Redacted said.


CIOs Must Make Call on AI-Based App Development

Erlihson says other key stakeholders necessary for an AI-based app development strategy include the chief data officer (CDO), who can help manage and govern the organization’s data assets, ensure data quality, and make sure that data is used in compliance with regulations. The chief financial officer (CFO) can ensure that the organization’s investments in AI-based tools are aligned with the financial objectives and overall budget of the company. “It's also important to include business leaders to identify business problems that can be solved by AI, providing use cases, and setting priorities for AI-based app development based on business needs,” he says. Legal and compliance must also be involved to ensure AI-based tools are compliant with data privacy laws and regulations, security, and ethical use of AI. “Finally, operations and IT teams are needed to provide feedback on the feasibility and scalability of AI-based tool development and deployment and to assure that the necessary IT infrastructure required to support AI-based app deployment is in place,” Erlihson says.


How Design Thinking and Improved User Experiences Contribute to Customer Success

Everything is about the needs, preferences and behaviors of users and the frustrations they sometimes face, with a continuous feedback loop used for perpetual reporting. The model emphasizes the need for diverse voices, experimentation with new ways of working, rapid prototyping and iteration, as well as a commitment to constantly improving the quality of service. As an example of experimentation, Airbnb unlocked growth by using professional photography to replace the poor-quality images advertising property rentals in New York and saw an instant uptick. Done right, Design Thinking has the benefit of challenging developer assumptions and the management status quo. It helps to mitigate the narrative of ‘we’ve always done it this way’ and the temptation toward ‘bloatware’, which adds pointless features and functions. Despite the name, Design Thinking doesn’t just affect software user experience design; product managers and others are also involved, creating a holistic understanding of what is happening.


Sovereign clouds are becoming a big deal again

Although sovereign clouds aim to increase data privacy and sovereignty, there are concerns that governments could use them to collect and monitor citizens’ data, potentially violating privacy rights. Many companies prefer to use global public cloud providers if they believe that their local sovereign cloud could be compromised by the government. Keep in mind that in many cases, the local governments own the sovereign clouds. Sovereign clouds may be slower to adopt new technologies and services compared to global cloud providers, which could limit their ability to innovate and remain competitive. Consider the current artificial intelligence boom. Sovereign clouds won’t likely be able to offer the same types of services, considering that they don’t have billions to spend on R&D like the larger providers. Organizations that rely on a sovereign cloud may become overly dependent on the government or consortium operating it, limiting their flexibility and autonomy. As multicloud becomes a more popular architecture, I suspect the use of sovereign clouds will become more common. 


Why You Need a Plan for Ongoing Unstructured Data Mobility

Most organizations keep all or most of their data indefinitely, but as data ages, its value changes. Some data becomes “cold” or infrequently accessed or not needed after 30 days yet must be retained for a period of time for regulatory or compliance reasons; some data should be deleted; and some data may be required for research or analytics purposes later. ... Ensuring easy mobility for the data as it ages and understanding the best options for different data segments is paramount. Another reason why unstructured data mobility is imperative is due to growing AI and machine learning adoption. Once data is no longer in active use, it has the potential for a second or third life in big data analytics programs. You might migrate some data to a low-cost cloud tier for archival purposes but IT or other departments with the right permissions should be able to easily discover it later and move it to a cloud data lake or AI tool when needed for many different use cases.
 

Microsoft: 365 Copilot chatbot is the AI-based future of work

Microsoft CEO Satya Nadella said the new 365 Copilot chatbot will “radically transform how computers help us think, plan and act.” “Just as we can’t imagine computing today without a keyboard, mouse or multitouch, going forward we won’t be able to imagine computing without copilots and natural language prompts that intuitively help us with continuation, summarization, chain-of-thought reasoning, reviewing, modifying and acting,” he said. Copilot combines a large language model (LLM) with the 365 suite and the user data contained therein. Through the use of a chatbot interface and natural language processing, users can ask questions of Copilot and receive human-like responses, summarize online chats, and generate business products. Copilot in Word, for example, can jump-start the creative process by giving a user a first draft to edit and iterate on — saving hours in writing, sourcing, and editing time, Microsoft said in a blog post. “Sometimes Copilot will be right, other times usefully wrong — but it will always put you further ahead,” the company said.


How CISOs Can Start Talking About ChatGPT

Do we have the right oversight structures in place? The fundamental challenge with AI is governance. From the highest levels, your company needs to devise a system that manages how AI is studied, developed and used within the enterprise. For example, does the board want to embrace AI swiftly and fully, to explore new products and markets? If so, the board should designate a risk or technology committee of some kind to receive regular reports about how the company is using AI. On the other hand, if the board wants to be cautious with AI and its potential to up-end your business objectives, then perhaps it could make do with reports about AI only as necessary, while an in-house risk committee tinkers with AI’s risks and opportunities. Whatever path you choose, senior management and the board must establish some sort of governance over AI’s use and development. Otherwise employees will proceed on their own – and the risks only proliferate from there.



Quote for the day:

"Humility is a great quality of leadership which derives respect and not just fear or hatred." -- Yousef Munayyer

Daily Tech Digest - March 17, 2023

6 principles for building engaged security governance

No governance strategy can be built without knowing where the organization is currently and where it is going. Start by understanding the organization's core business practices, its product portfolio, customers, geographical footprint, and ethos and culture -- all from a security perspective. This should help answer key security-related questions, such as who does what, why they do it and for whom. Next, gain a better understanding of the organizational structure and current security standards, guidelines, regulations and frameworks. Get a better grasp of how security functions operate. Conduct a comprehensive review of the security policies in place and how effective they are. Understand the current state of security procedures, projects and activities, tests and exercises, as well as the current level of information security controls and the future roadmap. Assess the skills and capabilities of security practitioners and their responsibilities, and benchmark them against best practices in the industry to expose the gaps in existing capabilities and activities.


Cyber attribution: Vigilance or distraction?

In some situations, effective attribution can be a valuable source of intelligence for organizations that have suffered a cyber breach. Threat actors go to great lengths to cover their tracks, and any evidence and facts gathered through attribution can bring organizations closer to catching the perpetrators. Deploying a good Cyber Threat Intelligence (CTI) program helps organizations understand which current or future threats can impact their business operations. Some organizations don’t treat threat intelligence seriously because they already have their “go-to” party to blame, or they simply believe no one will attack a small organization. During the WannaCry attack, we witnessed a prime example of poor interpretation of threat intelligence, when organizations and ISPs started blocking access to a sinkhole URL discovered by security researcher Marcus Hutchins. The URL was not a malicious website; rather, devices connecting to it prevented the malware’s payload from activating, so blocking it resulted in further infections.


A Comprehensive Checklist For IoT Project Success

IoT projects often require a variety of specialized knowledge and skills, from edge/gateway device knowledge, to networking, to cloud platform knowledge, to security and real-time dashboard displays. Building a team with the appropriate expertise to manage this technology and other necessary skills helps ensure that the project is designed, developed and implemented successfully. If you do not have team members on hand with the necessary expertise, consider outsourcing some or all of the work to third-party IoT experts. ... IoT systems often need to handle vast amounts of data and potentially millions of devices at once, and planning for scalability ensures that the system’s architecture can handle the expected load (network coverage, reliability, bandwidth, latency, etc.) at the edge and gateway device and cloud platform level (e.g., end user dashboards, alerts, and more). Best practices here would include designing for modularity and flexibility, using scalable technologies, implementing caching and load balancing, and generally planning for growth.


Principles for Adopting Microservices Successfully

While microservices architecture has many benefits, it also introduces new challenges and potential pitfalls. One common pitfall to avoid is creating too many or too few services. Creating too many services can lead to unnecessary complexity, while creating too few can make it difficult to maintain and scale the application. It is important to strike a balance by breaking the application into small, independent services that are focused on specific business functions. Another common pitfall is not considering the operational overhead of microservices. ... It is important to ensure the team has the necessary skills and resources to manage a microservices architecture, including monitoring, debugging, and deploying services. Finally, it is important to avoid creating overly coupled services. Services that are tightly coupled can create dependencies and make it difficult to make changes to the application. It is important to design services with loose coupling in mind, ensuring that each service is independent and can be modified or replaced without affecting other services.
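One common way to get the loose coupling described above is to share only an event contract rather than direct service-to-service calls. The sketch below is illustrative: the class and event names (`InMemoryBus`, `OrderPlaced`) are assumptions, and the in-memory bus stands in for a real message broker.

```python
class InMemoryBus:
    """Stand-in for a real message broker (Kafka, RabbitMQ, etc.)."""
    def __init__(self):
        self._subscribers = {}

    def subscribe(self, event_type, handler):
        self._subscribers.setdefault(event_type, []).append(handler)

    def publish(self, event_type, payload):
        for handler in self._subscribers.get(event_type, []):
            handler(payload)

class OrderService:
    """Publishes events; holds no reference to any other service."""
    def __init__(self, bus):
        self._bus = bus

    def place_order(self, sku, qty):
        # Only the "OrderPlaced" event contract is shared between services.
        self._bus.publish("OrderPlaced", {"sku": sku, "qty": qty})

class InventoryService:
    """Reacts to events; can be replaced without touching OrderService."""
    def __init__(self, bus):
        self.stock = {"widget": 10}
        bus.subscribe("OrderPlaced", self.reserve)

    def reserve(self, event):
        self.stock[event["sku"]] -= event["qty"]

bus = InMemoryBus()
inventory = InventoryService(bus)
OrderService(bus).place_order("widget", 3)
```

Because neither service imports the other, either one can be modified, scaled, or swapped out as long as the event contract stays stable.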


Why Audit Logs Are Important

Audit logs have a different purpose and intended audience when compared to the system logs written by your application’s code. Whereas those logs are usually designed to help developers debug unexpected technical errors, audit logs are primarily a compliance control for monitoring your system’s operation. They’re helpful to regulatory teams, system administrators and security practitioners who need to check that correct processes are being followed. Audit logs also differ in the way they’re stored and retained. They’re usually stored for much longer periods than application logs, which are unlikely to be kept after an issue is solved. Because audit logs are a historical record, they could be retained indefinitely if you have the storage available. ... The information provided by an audit log entry will vary depending on whether the record relates to authentication or authorization. If it’s an authentication attempt, the request will have occurred in the context of a public session. Authorization logs, written by AuthZ services like Cerbos, will include the identity of the logged-in user.
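The authentication/authorization distinction above can be made concrete with two entry shapes: an authentication entry records a claimed principal and an outcome (no verified user exists yet), while an authorization entry records the verified user plus the resource, action, and decision. The field names below are illustrative assumptions; real tools such as Cerbos define their own schemas.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class AuthnAuditEntry:
    # Authentication precedes a verified identity, so we record the
    # claimed principal and where the attempt came from.
    timestamp: str
    claimed_principal: str
    source_ip: str
    outcome: str          # "success" | "failure"

@dataclass
class AuthzAuditEntry:
    # Authorization runs inside an authenticated session, so the entry
    # carries the verified user plus the resource and action checked.
    timestamp: str
    user_id: str
    resource: str
    action: str
    decision: str         # "allow" | "deny"

def now():
    return datetime.now(timezone.utc).isoformat()

authn = AuthnAuditEntry(now(), "alice@example.com", "203.0.113.7", "success")
authz = AuthzAuditEntry(now(), "alice", "invoice:42", "read", "allow")

# Both serialize to flat dicts suitable for long-term, append-only storage.
records = [asdict(authn), asdict(authz)]
```

Keeping the two shapes distinct makes later compliance queries ("show all denied accesses to invoices") straightforward, since authorization entries always carry resource and decision fields.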


Quantum Computing Is the Future, and Schools Need to Catch Up

Thankfully, things are starting to change. Universities are exposing students sooner to once-feared quantum mechanics courses. Students are also learning through less-traditional means, like YouTube channels or online courses, and seeking out open-source communities to begin their quantum journeys. And it’s about time, as demand is skyrocketing for quantum-savvy scientists, software developers and even business majors to fill a pipeline of scientific talent. We can’t keep waiting six or more years for every one of those students to receive a Ph.D., which is the norm in the field right now. Schools are finally responding to this need. Some universities are offering non-Ph.D. programs in quantum computing, for example. In recent years, Wisconsin and the University of California, Los Angeles, have welcomed inaugural classes of quantum information master’s degree students into intensive year-long programs. U.C.L.A. ended up bringing in a much larger cohort than the university anticipated, demonstrating student demand.


Forget the hybrid cloud; it’s time for the confidential cloud

“The use cases are expanding rapidly, particularly at the edge, because as people start doing AI and machine learning processing at the edge for all kinds of reasons [such as autonomous vehicles, surveillance infrastructure management], this activity has remained outside of the security perimeter of the cloud,” said Lavender. The traditional cloud security perimeter is based on the idea of encrypting data-at-rest in storage and as it transits across a network, which makes it difficult to conduct tasks like AI inferencing at the network’s edge. This is because there’s no way to prevent information from being exposed during processing. “As the data there becomes more sensitive — particularly video data, which could have PII information like your face or your driver’s [license] or your car license [plate] number — there’s a whole new level of privacy that intersects with confidential computing that needs to be maintained with these machine learning algorithms doing inferencing,” said Lavender.


DevOps and Hybrid Cloud: A Q+A With Rosalind Radcliffe

The hardest thing to change in any transformation is the culture. The same is true in the IBM CIO office. Both new and experienced developers are learning from each other in a non-penalty environment. Although we have a full hybrid cloud and systems running in lots of places, the reality is I would like to run as much as appropriate on IBM Z from a security and availability, or always-on, standpoint. I can keep things more available on the platform because of the hardware stability, in addition to all the agile capabilities that I can exploit. Meanwhile, I continually chip away at the fears and the misconceptions about development on z/OS. One way to start is by removing that fear and making the platform simpler and more accessible. Encouraging people to play with z/OS in IBM Cloud with Wazi as a Service allows them to experiment, understand and learn in a penalty-free environment. They have the freedom to know they are not breaking an existing system or impacting production.


Cyberattackers Continue Assault Against Fortinet Devices

Fortinet described the attack on its customers' devices in some detail in its advisory. The attackers had used the vulnerability to modify the device firmware and add a new firmware file. The attackers gained access to the FortiGate devices via the FortiManager software and modified the devices' start-up script to maintain persistence. The malicious firmware could have allowed for data exfiltration, the reading and writing of files, or a remote shell for the attacker, depending on the command the software received from the command-and-control (C2) server, Fortinet stated. More than a half-dozen other files were modified as well. The incident analysis, however, lacked several critical pieces of information, such as how the attackers gained privileged access to the FortiManager software and the date of the attack, among other details. When contacted, the company issued a statement in response to an interview request: "We published a PSIRT advisory (FG-IR-22-369) on March 7 that details recommended next steps regarding CVE-2022-41328," the company said.


3 ways layoffs will impact IT jobs in 2023

With no oversight, shadow IT services and tools increase risk and vulnerability to attack, or more commonly, poor security hygiene. With mounting to-do lists and more projects than ever, overworked IT teams may default to rubber-stamping access in the name of productivity. But failure to properly govern identities within the organization can lead to a chain reaction of regulatory and budgetary compliance missteps. Automation tools can help ease identity governance worries internally, but IT teams should still be cognizant of what’s being used by employees externally and the business risks those tools pose. In the case of shadow IT, too many tech tools and services can be seen as the problem. But in many ways, they can also serve as a solution. The right technology can go a long way in alleviating common IT burdens – the key is choosing solutions that work well within a company’s existing technology stack. Advocating for software that is easy for employees to use and integrate also pays off.



Quote for the day:

"All leaders ask questions, keep promises, hold themselves accountable and atone for their mistakes." -- James Kouzes and Barry Posner

Daily Tech Digest - March 15, 2023

Critical Thinking: The Overlooked IT Management Skill

Critical thinking is essential to IT leadership because leaders sit at the conjunction of four distinct worlds: data, technology, business processes, and people. “An IT leader has to think logically about how to integrate these four distinct capabilities into a reasonable response to organizational challenges,” says Michael Williams, an associate professor of information systems technology management at Pepperdine University. ... Solutions to complex IT problems are not black and white, observes Sydney Buchel, a senior consultant with cybersecurity and compliance firm BARR Advisory. “IT environments vary, based on complexity, size, the data being processed, unique risks, and integrations with other platforms,” she says. “This means that making key decisions for your IT environment cannot be impulsive -- it requires gathering, analyzing, and conceptualizing information to ensure thoughtful decision-making.” ... Critical thinking requires time and practice. “Collect all relevant information: evidence, facts, research, and perspectives from trusted colleagues, as well as your own thoughts and experiences,” Buchel advises.


Don’t do IT yourself: The trick to ensuring business alignment

There are many ways to implement an ITSC, but to start, you and your committee members must work together continuously on a strategic plan until the planning group can finalize an immediate one-year plan that is approved by all departments. Then each department must take this one-year plan and decide what resources are needed to accomplish their objectives — a process that should include conversations with IT to help determine what is needed. These requirements should then be examined by IT and senior department heads to determine initial time and cost estimates, along with expected ROI, which should be the responsibility of the requesting department. With these plans now submitted to the ITSC, the committee (which, remember, comprises all direct reports of the CEO or the COO, so that every department is included) then determines whether the systems requested do indeed represent what the company needs and at the speed they are needed. By doing so, the committee can determine whether staffing is sufficient or if additional resources are required to accomplish planned goals.


The importance of software testing for Digital Transformation

IT costs represent significant overheads and budget cuts proliferate. With smaller teams and fewer resources, teams have to do more with less when it comes to software development in order to keep pace with Digital Transformation and customer demand. Speed is a must – but if the customer experience is to be protected, then quality must also be prioritized. Often test automation is a late addition to the Digital Transformation process, but this causes many risks and challenges along the way. As enterprise organizations develop applications that rely on continuous software updates, automating testing is a crucial element to increasing release speeds and improving application quality, helping the organization run more efficiently to meet its bottom line. Test automation gives organizations the ability to monitor and assess risk in real time or even prevent issues before they occur. By adopting this real-time, pre-emptive approach, major disruptions, which can impact everything from productivity to customer experience and revenue, can be staved off.


Bias Busters: The perils of executive typecasting

A common obstacle to good decision making is executives’ adherence to role theory, a concept in sociology and psychology that suggests that most people categorize themselves and others according to socially defined roles—as a parent, a manager, or a teacher, for instance. They adopt norms associated with designated roles, behave accordingly, and, in a form of groupthink, expect others to do the same. ... Organizations must actively encourage dissent and make it safe for individuals at all levels, regardless of role, to share contrarian ideas. In this case, if the CFO could separate her idea to divest from her status in the organization, she might get a fairer shake from everyone involved. One way to do that would be to engage individuals and teams in a “what you have to believe” assessment, highlighting the discrepancies between the product line’s current performance and the resources needed to bring it back to premier status. Such an assessment could put more facts into and structure around strategy discussions.


Why Asia is moving to multi-cloud

So, what is it that attracts Asia Pacific (APAC) customers to his company’s suite of products? Orchard highlighted two reasons that he consistently hears from APAC firms that adopt his organization’s products: a broad ecosystem, and the ability to bridge the cloud skills gap. “With the breadth of technologies in use by our customers across both traditional data center vendors, public clouds, and SaaS providers, they need a vendor whose focus is on the ecosystem. And with over 2,600 providers for Terraform, we fit that bill better than any other vendor in the industry,” he told DCD. In addition, Orchard says standardization on the HashiCorp Configuration Language (HCL), used to configure its solutions as code, can help address the ongoing skills shortage in cloud professionals. Indeed, HCL as used by Terraform was lauded in GitHub’s latest State of the Octoverse report as the fastest-growing language on GitHub. Though the focus of infrastructure-as-code is on provisioning infrastructure, there are secondary benefits to organizations.


Navigating The AI Minefield: HR Grapples With Bias & Privacy Concerns

To guard against bias and potential liability, employers must provide notice and obtain consent from applicants and employees before using AI tools. This includes explaining what type of tool is being used and providing enough information for individuals to understand the criteria being evaluated. Employers must also be mindful of the way their AI tools can affect people with disabilities and be prepared to provide reasonable accommodations. By taking these steps, employers can harness the power of AI while avoiding bias and promoting diversity and inclusion in the workplace. Artificial intelligence (AI) tools, though designed to help employers make better decisions, could be causing more harm than good due to algorithmic bias. The problem arises when the AI tool is developed and monitored incorrectly, leading to bias against certain demographic groups, according to experts in the field. ... Additionally, AI tools conducting background checks can pull data from a much broader area than traditional selection tools, such as social media, and trigger data privacy concerns as well. 


DNS data shows one in 10 organizations have malware traffic on their networks

More than a quarter of that traffic went to servers belonging to initial access brokers, attackers who sell access into corporate networks to other cybercriminals, the report stated. “As we analyzed malicious DNS traffic of both enterprise and home users, we were able to spot several outbreaks and campaigns in the process, such as the spread of FluBot, an Android-based malware moving from country to country around the world, as well as the prevalence of various cybercriminal groups aimed at enterprises,” Akamai said. “Perhaps the best example is the significant presence of C2 traffic related to initial access brokers (IABs) that breach corporate networks and monetize access by peddling it to others, such as ransomware as a service (RaaS) groups.” Akamai operates a large DNS infrastructure for its global CDN and other cloud and security services and is able to observe up to seven trillion DNS requests per day. Since DNS queries attempt to resolve the IP address of a domain name, Akamai can map requests that originate from corporate networks or home users to known malicious domains.
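The underlying technique is straightforward to sketch. The following Python snippet is illustrative only (it is not Akamai's pipeline, and the domain names and log format are made up): it flags DNS queries whose queried name, or any parent domain of it, appears on a blocklist of known-malicious domains.

```python
# Illustrative blocklist check: flag DNS queries that resolve a known
# malicious domain or any subdomain of one. All names are hypothetical.

MALICIOUS_DOMAINS = {"evil-c2.example", "flubot-payload.example"}

def flag_queries(dns_log):
    """Return (source, domain) pairs whose queried domain, or a parent
    domain of it, appears on the blocklist."""
    hits = []
    for source, domain in dns_log:
        labels = domain.lower().rstrip(".").split(".")
        # check the full name and every parent domain against the blocklist
        for i in range(len(labels) - 1):
            if ".".join(labels[i:]) in MALICIOUS_DOMAINS:
                hits.append((source, domain))
                break
    return hits

log = [
    ("10.0.0.5", "cdn.example.org"),            # benign
    ("10.0.0.7", "beacon.evil-c2.example"),     # subdomain of a C2 domain
    ("10.0.0.5", "evil-c2.example."),           # exact match, trailing dot
]
flagged = flag_queries(log)
```

Checking parent domains matters because C2 beacons often use per-victim subdomains of a single registered malicious domain.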


Heart Device Maker Says Hack Affected 1 Million Patients

The incident illustrates how deeply networked connectivity has penetrated the medical device market, a development that has created new opportunities for hackers to steal personal information in an industry historically unaccustomed to fending off threat actors. Information potentially disclosed in the cybersecurity incident includes individuals' names, addresses, birthdates and Social Security numbers. "It may also be inferred that you used or were considered for use of a Zoll product," the company says in a sample breach notification letter. "More and more medical devices are becoming connected to the network and internet and, in almost all cases, the manufacturer is gaining access to device and patient information," said security researcher Jason Sinchak, who leads cybersecurity firm Level Nine's medical device product security practice. "What was previously an embedded medical device manufacturing organization becomes a software-as-a-service and managed service organization," he said.


How machine learning is changing the way businesses think about customer behavior

“It’s crucial to ensure the emotional data captured accurately reflects the inner feelings of the customers rather than just their expressed emotions through specific keywords or loud expressions. Systems based on nonrelevant data are likely to result in a waste of time and money and will probably have lower success rates compared to systems that incorporate genuine emotion detection and personality assessment,” he adds. This new frontier of so-called Emotional Intelligence-as-a-Service has the potential to significantly impact customer behavior by providing valuable insights into their preferences and motivations. By personalizing the customer experience based on this real-time emotional intelligence, organizations can improve customer satisfaction and assist customers in achieving their goals more effectively. ... “In these new virtual environments, just imagine the impact of a virtual agent in a new Web3 or metaverse world that has its own unique personality and style and can truly understand yours,” Liberman says.


The philosopher: A conversation with Grady Booch

The story of computing is the story of humanity. This is a story of ambition, invention, creativity, vision, avarice, and serendipity, all powered by a refusal to accept the limits of our bodies and our minds. As we co-evolve with computing, the best of us and the worst of us is amplified, and along the way, we are challenged as to what it means to be intelligent, to be creative, to be conscious. We are on a journey to build computers in our own image, and that means we have to not only understand the essence of who we are, but we must also consider what makes us different. ... The field of artificial intelligence has seen a number of vibrant springs and dismal winters over the years, but this time it seems different: there are a multitude of economically-interesting use cases that are fueling the field, and so in the coming years we will see these advances weave themselves into our world. Indeed, AI already has: every time we take a photograph, search for a product to buy, interact with some computerized appliance, we are likely using AI in one way or another.



Quote for the day:

"Leaders are the ones who keep faith with the past, keep step with the present, and keep the promise to posterity." -- Harold J. Seymour

Daily Tech Digest - March 14, 2023

Should There Be Enforceable Ethics Regulations on Generative AI?

Companies might claim they conduct ethical uses of AI, she says, but more could be done. For example, Rudin says companies tend to claim that putting limits on speech that contributes to human trafficking or vaccine misinformation would also eliminate content that the public would not want removed, such as critiques of hate speech or retellings of someone’s experiences confronting bias and prejudice. “Basically, what the companies are saying is that they can’t create a classifier, like they’re incapable of creating a classifier that will accurately identify misinformation,” she says. “Frankly, I don’t believe that. These companies are good enough at machine learning that they should be able to identify what substance is real and what substance is not. And if they can’t, they should put more resources behind that.” Rudin’s top concerns about AI include circulation of misinformation, ChatGPT being put to work helping terrorist groups use social media to recruit and fundraise, and facial recognition being paired with pervasive surveillance.


6 reasons why your anti-phishing strategy isn’t working

Many organizations are trying to solve the phishing problem solely with technology. These companies are buying all the latest tools to detect suspicious emails and giving employees a way to report, then block, those suspicious emails, says Eric Liebowitz, chief information security officer, Americas at Thales Group, a Paris-based company that develops devices, equipment, and technology for the aerospace, space, transportation, defense, and security industries. While doing that is great, in the end the bad actors are always going to be more sophisticated, he says. “One of the big things that I don't think enough organizations are focusing on is training their employees,” Liebowitz says. “They could have all the greatest tools in place, but if they're not training their employees, that's when the bad thing is going to happen.” While some organizations have deployed the right tools and have workflows and processes in place to combat phishing campaigns, they haven't adequately and proactively configured those tools, says Justin Haney, executive, North America security lead at Avanade.


Observability and Monitoring in the DevOps Age

The first step towards achieving success is to know what to measure and monitor. Your business technology ecosystem may comprise different modular applications with all sorts of possible dependencies. It is important to first lay out the key indicators that must be tracked if engineers are to find remedies when unusual behavior is observed. These indicators are not just internal operational metrics but also customer-facing ones like performance and speed of page loads, erroneous crashes of web interfaces, etc. The key to finding the best remedy for any unexpected defect or bug is to trace the root cause of the problem. This means developers and QA engineers must be able to navigate the exact workflow that resulted in a defective output. For this, traceability is an essential factor in every transactional workflow. It helps DevOps teams understand how data and insights are passed between different systems when a transactional request is processed.
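The traceability idea described above is usually implemented by tagging each transactional request with a unique trace ID and recording every stage it passes through. The sketch below is a minimal, vendor-agnostic illustration (the service names, events, and in-memory log are all hypothetical stand-ins for a real tracing backend):

```python
# Minimal traceability sketch: one trace ID per request, recorded at every
# stage, so a defective output can be walked back to its root cause.
import uuid

TRACE_LOG = []  # stand-in for a centralized tracing backend

def record(trace_id, system, event):
    TRACE_LOG.append({"trace_id": trace_id, "system": system, "event": event})

def handle_order(payload):
    trace_id = str(uuid.uuid4())  # unique ID for this transactional request
    record(trace_id, "api-gateway", "received")
    record(trace_id, "inventory-service", "stock checked")
    record(trace_id, "billing-service", "charge failed")  # the defect
    return trace_id

tid = handle_order({"sku": "A-100", "qty": 2})

# Reconstruct the exact workflow for this one request:
trail = [e for e in TRACE_LOG if e["trace_id"] == tid]
```

Filtering the log by trace ID recovers the precise path the request took across systems, which is what lets a DevOps team navigate from a bad output back to the stage where it went wrong.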


The importance of optimising IoT for retailers

Innovation has radically altered the retail customer experience, with e-commerce and brick-and-mortar stores redefining the way the world shops. Interactive digital terminals, virtual and augmented reality tools, and robotic sales assistants support customers and have increased both business efficiency and customer satisfaction. Furthermore, retailers have implemented IoT technologies to optimize existing processes. Radio-frequency identification (RFID) tracking is utilized to streamline warehouse inventory processes, enabling efficient asset management. Once consumers venture to stores, cameras and sensors are employed for footfall tracking purposes, and Wi-Fi connections can detect repeat customers and target them with digital advertisements beyond their trip to the store. While these modifications have improved the retail experience for customers, they may also have increased network traffic and masked visibility for IT teams, complicating operations and performance management.


3 ways data teams can avoid a tragedy of the cloud commons

Industry analysts estimate that at least 30% of cloud spend is “wasted” each year — some $17.6 billion. For modern data pipelines in the cloud, the percentage of waste is significantly higher, estimated at closer to 50%. It’s not hard to understand how we got here. Public cloud services like AWS and GCP have made it easy to spin resources up and down at will, as they’re needed. Having unfettered access to a “limitless” pool of computing resources has truly transformed how businesses create new products and services and bring them to market. For modern data teams, this “democratization of IT” facilitated by the public cloud has been a game-changer. For one thing, it’s enabled them to be far more agile as they don’t need to negotiate and justify a business case with the IT department to buy or repurpose a server in the corporate data center. And as an operational expenditure, the pay-by-the-drip model of the cloud makes budget planning seem more flexible. However, the ease with which we can spin up a cloud instance doesn’t come without a few unintentional consequences — forgotten workloads, over-provisioned or underutilized resources — with results including spiraling and unpredictable costs.


3 Reasons Women Should Reskill to Work in Cybersecurity

According to an (ISC)² study, women make up roughly a quarter of the overall cybersecurity workforce. We’ve come a long way over the last decade (women made up about 10% of cybersecurity jobs in 2013), but we know the industry needs to work toward even greater diversity. Addressing the gender gap starts with sparking interest at a young age. We can also get creative with our most passionate and loyal current employees and realize not every cybersecurity role is a ‘special snowflake.’ There are many open roles that call for in-depth skills that have been honed and developed over time. What about all the roles that don’t? Here’s the secret: Not everyone who works in cybersecurity needs to be a cybersecurity expert. At least not right away. Cybersecurity expertise can be taught or learned. So, one way to get closer to bridging the talent gap is to reskill talent from other professions.


How Aerospike Document Database supports real-time applications

A real-time document database should have an underlying data platform that provides quick ingest, efficient storage, and powerful queries while delivering fast response times. The Aerospike Document Database offers these capabilities at previously unattainable scales. JSON, a format for storing and transporting data, has passed XML to become the de facto data model for the web and is commonly used in document databases. The Aerospike Document Database lets developers ingest, store, and process JSON document data as Collection Data Types (CDTs)—flexible, schema-free containers that provide the ability to model, organize, and query a large JSON document store. The CDT API models JSON documents by facilitating list and map operations within objects. The resulting aggregate CDT structures are stored and transferred using the binary MessagePack format. This highly efficient approach reduces client-side computation and network costs and adds minimal overhead to read and write calls.
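The list and map operations described above are easy to illustrate in pure Python. The snippet below is not the actual Aerospike client API; it simply shows the kind of path-based map reads and in-place list updates that a CDT-style API performs on a stored JSON document (the document contents are made up):

```python
# Pure-Python illustration (not the Aerospike client API) of map and list
# operations on a JSON document, the model the CDT API works with.
import json

doc = json.loads("""{
  "customer": {"id": 42, "name": "Ada"},
  "orders": [{"sku": "A-1", "qty": 2}]
}""")

# map operation: read a nested value by path
name = doc["customer"]["name"]

# list operation: append an element to a nested list in place
doc["orders"].append({"sku": "B-7", "qty": 1})

order_count = len(doc["orders"])
```

In Aerospike's case, per the article, the resulting aggregate structures are stored and transferred in the binary MessagePack format rather than as JSON text, which is what keeps client-side computation and network costs low.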


Taming the Data Mess, How Not to Be Overwhelmed by the Data Landscape

One common thing that happens is that people get pressed a little bit because of marketing or because of buzz of the peers about what is the next nice thing everybody's using? Sometimes this is not appropriate for the problems you have. This is something that happened to me with a friend a long time ago, who asked me, "You see, there are these companies. Now we're hearing a lot about this thing called feature stores, and we want to do machine learning in our company. This is a little startup. Maybe I need a feature store for this." That's not the right way to approach this problem. The first question I asked was, do you already have data available and is this data clean and of good quality? Do you have processes to keep it like that? Because it's not only a one-time shot, you have to keep these things going on. More important, is that data already used in your business? Do you have reports, or things like that? These are all steps that are pretty important, even before thinking about machine learning and stuff like that.


4 data challenges for CIOs

CIOs are becoming more aware that they cannot afford to have hidden stores of data or data that exists in siloes. Siloed data may be unknown to the teams that can derive the most value from it, which undercuts its value. And if there are data stores that are completely unknown to a company, there’s no way to protect them. Question your past assumptions about how and where your organization stores and processes data. The results of an end-to-end data discovery process across the network will bring to light previously unknown security issues. To better understand the flow of data, open lines of communication throughout the company. Meet with people at all levels, including the CISO, security manager, operations manager, IT service manager, and individual IT staff, to ensure everyone’s data usage and goals align. Ask employees their perspective on what’s working, what changes must be addressed, and where data blind spots may be. Involving all departments responsible for data improves knowledge and skills and ensures a stronger data strategy overall.


With sustainability gaining importance to companies, the CFO’s value-creation mandate calls for a new way of thinking and making decisions—one that uses sustainability information to guide choices about how to deliver strong performance for investors. Working with the CEO and governing board, today’s CFOs are integral to the development of a sustainable business model, bringing financial and nonfinancial data to bear on a company’s strategic goals, its resource and capital allocation, and its measures of long- and short-term performance. ... In bringing sustainability factors into business decisions, CFOs must also pay attention to the specific priorities of each region where their organization has a presence. In Asia, for example, the United Nations’ Sustainable Development Goals have identified both greater developmental needs than in other regions and greater susceptibility to climate risk. CFOs there need to consider how to be ambitious about decarbonization in a way that is also considerate of local communities, whose interests may differ from those of communities in other regions.



Quote for the day:

"People buy into the leader before they buy into the vision." -- John C. Maxwell

Daily Tech Digest - March 13, 2023

CFO Cybersecurity Strategies: How to Protect Against the Rising Storm

Think of cybersecurity as an investment in resiliency. Taking a comprehensive approach to cybersecurity increases the odds that your organization will not only identify malicious activity and successfully deflect attackers, but also respond effectively and recover with minimal impact if a worst-case scenario unfolds. However, you need to proactively validate that your company’s approach is truly comprehensive. Historically, cybersecurity has been assumed to be the purview of IT, while the reality of cybersecurity is much more complex and pervasive. While IT can manage and solve many risks, every leader in an organization has a role to play, across governance, legal, compliance, public relations, human resources, and more. So does every third party, including your vendors, suppliers, contractors, service providers, and customers. So, it’s not only about technology, but people and processes as well. Simply put, cybersecurity is like a tree with a complex root system.


The problem with development speed

Less code, but more impact. That’s the formula for success. But it’s not what many development teams do. For too many, as Gilad details, “a product group that has two-thirds of the output really [can] create four times the impact.” The key, he stresses, is that “most of what we create is a waste, [so] chasing output is actually creating more waste, faster.” All of which sounds great, but telling developers to “do more good things and fewer bad things” is hardly actionable. The trick, Gilad outlines, is to introduce more research and testing earlier in the development process, coupled with a willingness to scrap incomplete projects that aren’t on track for success. It’s not that developers will sit around thinking about success but not shipping. Rather, “you should increase throughput, but not of launches.” Instead, focus on running more “tests and experiments.” By doing so, you’ll end up with fewer projects but ones with a higher impact. This willingness to shed bad code early can make a huge difference.


Tapping AI to Alleviate Cloud Spend

Vladimirskiy says CIOs and executives overseeing an organization’s IT strategy are responsible for evaluating and implementing effective AI-based cloud optimization solutions. Because the efficacy of an AI-based cloud optimization system is based on how well-trained the model responsible for managing the corresponding workload is, it’s not advisable for organizations to start from scratch. “Vendors who focus specifically on this type of optimization will have access to more in-depth data across multiple organizations to train these models and ultimately create successful AI cloud optimization solutions,” he says. Diaz agrees the key stakeholders when it comes to implementing AI to manage cloud spending and control costs are primarily IT management, but finance plays a key role. ... Finance is involved as the final stop when it comes to paying for cloud resources, controlling what portion of the organization’s budget goes into both the cloud resources, and the AI technology used to help manage the cloud.


Contract-Driven Development – A Real-World Adoption Journey

In our search for an alternate solution, we wanted to try contract-driven development because it seemed to satisfy our initial criteria: parallel development of provider and consumer applications; the API specification serving as the API contract (instead of two separate artifacts); an automated technique, other than code generation, to ensure providers were adhering to the contract; and more emphasis on API design, promoting collaboration among teams in the process. However, the teams were also skeptical about contract-driven development because this again involved API specifications and contract testing, both of which they had already tried and had seen a low return on investment. However, addressing these concerns was a great starting point for us to get the teams started on contract-driven development. We felt the most convincing way of achieving this would be through a real-world example in their context and taking it to production. ... To gain more buy-in, we set out to introduce contract-driven development to just a handful of teams working on a feature that cut across two or three microservices and frontend components.
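The core idea, the API specification doubling as an executable contract, can be sketched without any particular tool. The following is a deliberately minimal, tool-agnostic illustration (the contract fields and responses are hypothetical, and real contract-testing tools verify far more, such as status codes, headers, and request shapes):

```python
# Minimal contract-testing sketch: the contract declares each field and its
# type, and an automated check verifies a provider response against it.

CONTRACT = {          # derived from the API specification (hypothetical)
    "id": int,
    "name": str,
    "in_stock": bool,
}

def satisfies_contract(response: dict, contract: dict) -> bool:
    """Every contracted field must be present with the declared type."""
    return all(
        field in response and isinstance(response[field], expected)
        for field, expected in contract.items()
    )

# A conforming provider response passes; a drifted one fails.
good = satisfies_contract({"id": 1, "name": "widget", "in_stock": True}, CONTRACT)
bad = satisfies_contract({"id": "1", "name": "widget"}, CONTRACT)
```

Run against the provider in CI, a check like this catches contract drift automatically, without generating provider code from the specification, which is the distinction the teams were after.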


The importance of measurement in closing the sustainability gap

Good engineering practice, such as edge-caching, optimised data storage, reusability and code efficiency can almost always have a positive impact on sustainability. Applications that require less compute power use less electricity, which ultimately leads to a net reduction of CO2-like emissions. It is important to take these factors into account when choosing architectural options and following green engineering best practices. The gains may be small at the level of the developer but become clearly significant when scaled-up to production levels. Quantitative measurement is essential to evidence that improvement. Sustainability has also become part of DevOps vocabulary as GreenOps, focusing on improving continuous integration and delivery from the perspective of reducing emissions. A critical part of this role is adding sustainability reports to existing dashboarding approaches, giving organisations a real-time window on that closing sustainability gap. The key is managing customer and organisational objectives throughout, and treating sustainability like a transformation programme. 
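One of the practices named above, caching, is simple to demonstrate. The sketch below is illustrative (the page-rendering function and call counter are made up): by memoizing results, repeated requests skip the expensive computation entirely, and fewer CPU cycles means less electricity consumed.

```python
# Caching as a green-engineering practice: serve repeated requests from a
# cache instead of recomputing, cutting CPU work (and thus energy) used.
from functools import lru_cache

CALLS = {"render": 0}

@lru_cache(maxsize=None)
def render_page(page_id: int) -> str:
    CALLS["render"] += 1  # stands in for an expensive computation
    return f"<html>page {page_id}</html>"

# Six requests arrive, but only two distinct pages are actually rendered.
for page in [1, 2, 1, 1, 2, 1]:
    render_page(page)
```

The gain here is tiny at the level of one function, but as the article notes, the same pattern applied at production scale (edge caches, CDN layers) is where the emissions reduction becomes significant and worth measuring.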


Fighting financial fraud through fusion centers

The boundaries between cybersecurity and fraud/financial crime have been blurred in recent years. Indeed, cyberattacks on financial services are often the first stage of fraud taking place. Take common attacks like phishing or account takeovers, for example. Are these cyber-attacks, fraud, or both? And fraud isn’t always an immediate process; some fraudulent schemes go on for years. Who has responsibility for what, and when? The truth is that cyber-attacks and fraud are now too closely linked to be considered separately. But many firms still have investigative fraud teams and cybersecurity teams operating independently, along with the systems and processes that support them. As a result, these teams have different levels of access to various data repositories, and do not necessarily use the same toolsets to analyze them. That data is arriving at fluctuating speeds, in multiple formats, and in huge volumes. Some firms may have to navigate a complex legacy technology environment to access that data. In short, there is no consistent context within which a unified decision can be made.


Schneider Electric CIO talks IT staffing, sustainability, and digital transformation

“All workers, including IT workers, must have a connection to their company’s mission, and ownership over what their company’s goals and values are. At Schneider Electric, it is important that IT workers understand what we do as a business, in addition to our overarching mission of creating and offering solutions to help our customers. This attitude creates awareness, as well as dedication to their role within IT and the broader company.” “As for the specific traits of these workers, one that is learned is what we call the power couple model — a domain and digital leader — when the business leaders and technology leaders complement each other by playing different roles in solutioning. The domain, or business leaders, are responsible for the 'what' and the 'why,' while the digital leaders are responsible for the 'how' and the 'when.' They do this through leveraging new technology to offer the most efficient solutions to customers and create a beneficial partnership.”


Tech purchasing decisions – how many involved?

The key to good decision-making is having the department that best understands how the technology will be used involved, but also ensuring that the leaders who really understand technology are in the room as well. That means understanding, from the beginning, who needs to be involved in the decision and ensuring they are in the conversation. For those selling the technology, whether through brand awareness or lead generation campaigns, it means realising that targeting a single decision-maker who works in the specific function where their technology is used is the wrong strategy. Yes, those selling, for example, martech need to have the marketing function on board, but they also need buy-in from IT, procurement, sales, finance and HR. ... The business world is becoming increasingly interconnected, meaning the impacts of the decisions leaders make are far more wide-reaching. Whether companies are considering their sustainability strategy, navigating supply-chain risk or purchasing a new CRM system, leaders increasingly need to understand what is happening outside their function and how it impacts them. 


Can AI solve IT’s eternal data problem?

Most enterprises today maintain a vast expanse of data stores, each one associated with its own applications and use cases—a proliferation that cloud computing has exacerbated, as business units quickly spin up cloud applications with their own data silos. Some of those data stores may be used for transactions or other operational activities, while others (mainly data warehouses) serve those engaged in analytics or business intelligence. To further complicate matters, “every organization on the planet has more than two dozen data management tools,” says Noel Yuhanna, a VP and principal analyst at Forrester Research. “None of those tools talk to each other.” These tools handle everything from data cataloging to MDM (master data management) to data governance to data observability and more. Some vendors have infused their wares with AI/ML capabilities, while others have yet to do so. At a basic level, the primary purpose of data integration is to map the schema of various data sources so that different systems can share, sync, and/or enrich data. Enrichment, in particular, is a must-have for developing a 360-degree view of customers, for example.
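The schema-mapping idea above can be sketched in a few lines: two systems describe the same customer with different field names, and a per-source mapping normalizes both into one shared schema so records can be merged and enriched into a single customer view. All field names and the mapping structure here are hypothetical, illustrative stand-ins, not any particular vendor's integration API.

```python
# Each source's fields mapped onto a common target schema
# (hypothetical field names for illustration only).
FIELD_MAPPINGS = {
    "crm":       {"cust_id": "customer_id", "email_addr": "email", "full_name": "name"},
    "warehouse": {"id": "customer_id", "email": "email", "display_name": "name"},
}

def normalize(record, source):
    """Translate one source record's fields into the common schema."""
    mapping = FIELD_MAPPINGS[source]
    return {target: record[src] for src, target in mapping.items() if src in record}

def merge_views(*normalized_records):
    """Enrich a single view by combining fields across sources."""
    merged = {}
    for rec in normalized_records:
        for key, value in rec.items():
            merged.setdefault(key, value)  # first source to supply a field wins
    return merged

crm_rec = {"cust_id": "C-42", "email_addr": "ana@example.com"}
wh_rec = {"id": "C-42", "email": "ana@example.com", "display_name": "Ana"}

view = merge_views(normalize(crm_rec, "crm"), normalize(wh_rec, "warehouse"))
print(view)  # one unified customer record assembled from both schemas
```

In practice each silo's mapping is the hard part—discovering and maintaining those correspondences is exactly where the article suggests AI/ML-assisted tooling can help.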


Navigating Your Data Science Career

Similar to finding your passion as a data scientist, finding new opportunities to diversify your skill set and experience is extremely helpful when trying to grow your career. Many business sectors require data science. Many of my retail coworkers have gone on to have great careers in media, finance, supply chain, social platforms, banking, and many other industries. Having a diverse background can open many more opportunities in the future. Not only can a robust background be more attractive to recruiters, it can also hedge against a market downturn in any one business sector that might limit future career opportunities. Although exploring different data science fields can be beneficial, as a developing data scientist you often have many opportunities to expand your skill set within your current company. Take retail, for example: data science expertise is required in functions such as marketing, pricing, logistics, and merchandising. Being open to new positions provides the opportunity to gain new industry knowledge and become a more valuable and well-rounded employee.



Quote for the day:

"Leadership is intangible, and therefore no weapon ever designed can replace it." -- Omar N. Bradley