Daily Tech Digest - January 19, 2024

SolarWinds VP Offers 2024 Predictions on AI

Whether CIOs are already implementing AI into observability efforts or still at the very beginning stages, Stewart says data hygiene and management are going to be a key factor. “One of the key components is really understanding where you’re at on that observability journey,” he says. “There are a lot of disparate tools and different observability offerings that may be very segmented … The key is having the full data set across that stack that allows the AI technology to leverage that data, because if the engines don’t have the data across the stack, then there’s going to be parts of the puzzle that are missing, and AI is just not going to be able to accommodate.” ... “IT budgets aren’t getting bigger,” Stewart says. “And in many cases, budgets are shrinking based on concerns with the economy. Folks are looking for ways to save and some of that will certainly come through automation and efficiencies. And some of that will come through tool and vendor consolidation. The ability to leverage various AI technologies is certainly something that people are interested in to realize those efficiencies.”


Beware of hidden cloud fees

Fees can complicate cost management, especially when transferring data across different cloud platforms, which is typical for multicloud deployments. Various factors such as location, geography, and data type can also significantly impact the size of these charges. Egress charges, levied for data transferred out of a cloud service provider’s network, are now a hot button, even though they’ve been a part of cloud bills for years. High egress charges can inflate operational costs and deter organizations from transitioning between cloud providers or moving their data to more cost-effective alternatives, even back to their enterprise data centers. As one of my clients put it, they feel their data is being held for ransom. ... Of course, many are looking to the cloud providers charging these fees to fix the issues. They may not be legally obligated to remove these fees, but they are listening to cloud users and have taken steps to reduce egress fees. Many enterprises are questioning their need to be in the cloud in the first place and could move to other platforms if costs are too high. Much of the repatriation that’s occurring is purely for cost reasons. All things being equal, companies would rather stay in the cloud. If enterprises could get relief from annoying fees, this could keep some companies in place on the public cloud providers.
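To see why egress charges become a hot button at scale, here is a minimal cost sketch. The per-GB rate and free tier below are invented placeholders, not any provider's actual pricing; real bills also vary by region and destination.

```python
# Illustrative egress-cost estimate; rate and free tier are hypothetical.

def egress_cost(gb_transferred: float, rate_per_gb: float,
                free_tier_gb: float = 100.0) -> float:
    """Cost of moving data out of a provider's network, after a free tier."""
    billable = max(0.0, gb_transferred - free_tier_gb)
    return billable * rate_per_gb

# Repatriating 10 TB at a hypothetical $0.09/GB:
print(round(egress_cost(10_000, 0.09), 2))  # 891.0
```

At these made-up rates, a one-time 10 TB move already costs hundreds of dollars, which is why repeated cross-cloud transfers in a multicloud design dominate some bills.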


New study urges industry to address generational division in tech skills

As artificial intelligence becomes increasingly common across industries, experts are urging companies to address these gaps to sustain organisational capability. “Technology is transforming organisations – faster and more diverse than ever. Communication, collaboration, financial savings, productivity and security are underpinning these shifts and forming the catalyst for change,” said Greg Weiss, an HR consultant, onboarding expert, and the founder of Career365. Capterra’s study identified the three primary challenges that hinder the speed of digital transformation: the usage gap among employees, limited access to resources or training, and the constant introduction of new tools making it difficult to adapt. The research also revealed that while millennials are naturally inclined toward digital tools (87 per cent), baby boomers and Generation Z are almost equally drawn to new technology (85 per cent). “The appetite is definitely there. It’s a matter of how these employees are facilitated and bridging the digital generation gap is crucial. A cookie-cutter approach to training and support doesn’t work in a divergent workforce – as their alignment differs,” Weiss said.


The Case for (and Against) Monorepos on the Frontend

Monorepos aren’t just for enterprise applications and large companies like Google, Savkin said. As it stands now, though, polyrepos tend to be the most common approach, with each line of business or functionality having its own repo. Take, for example, a bank. Its website or app might have a credit card section and an auto loan section. But what if there needs to be a common message, function or even just a common design change across the divisions? Polyrepo makes that harder, he said. “Now I need to do a coordination thing with team A, team B,” Savkin said. “In a polyrepo case, it can take many months.” In a monorepo, it’s easy to make that one change in as little as a day, he added. It also enables sharing components and libraries across development teams. Monorepos helped Jotform, an online forms company based in San Francisco, reduce its technical debt on the frontend, according to frontend architect and engineering director Berkay Aydin. Aydin wrote last week about the company’s move to a monorepo for the frontend. “We don’t have multiple configs or build processes anymore,” Aydin wrote. “Now, we’re sure every application is using the same configurations.”


Enterprises struggle with Agile methodology, reports long-standing survey of practitioners

According to the report, Agile is most successful at small companies. “Those in small companies are more likely than those in medium and large ones to say they are satisfied [with Agile],” the report states, and “74 percent of small companies (versus 62 percent at large companies) said at least 50 percent of their applications were delivered on time and with quality”. A key problem, which will not be a surprise to developers, is that “the business side is very slow to embrace Agile. Almost half of survey takers pointed to a generalized resistance to organizational change or culture clash as the reasons why the business side is not embracing Agile, up 7 points from 2022.” ... Scrum is a specific Agile methodology and used by 63 percent of Agile teams, according to the report, which also states that Scrum has been the most popular Agile methodology since 2006 when the survey was first conducted. That said, even Scrum has many variants and the survey states that “the Agile landscape continues to be very fragmented.” 22 percent of survey respondents said that “we don’t follow a mandated framework” and 12 percent that “we created our own enterprise Agile framework.”


The OWASP AI Exchange: an open-source cybersecurity guide to AI components

In the context of AI systems, OWASP’s AI Exchange discusses development-time threats in relation to the development environment used for data and model engineering outside of the regular applications development scope. This includes activities such as collecting, storing, and preparing data and models and protecting against attacks such as data leaks, poisoning and supply chain attacks. Specific controls cited include development data protection and using methods such as encrypting data-at-rest, implementing access control to data, including least privileged access, and implementing operational controls to protect the security and integrity of stored data. Additional controls include development security for the systems involved, including the people, processes, and technologies involved. This includes implementing controls such as personnel security for developers and protecting source code and configurations of development environments, as well as their endpoints through mechanisms such as virus scanning and vulnerability management, as in traditional application security practices. Compromises of development endpoints could lead to impacts to development environments and associated training data.
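One of the controls named above, least-privileged access to training data, can be sketched in a few lines. The roles and permission names below are invented for illustration; a real deployment would back this with an identity provider and audited policy store, not an in-memory dict.

```python
# Minimal deny-by-default access-control sketch for development data.
# Role names and permissions are hypothetical examples.
ROLE_PERMISSIONS = {
    "data_engineer": {"read_training_data", "write_training_data"},
    "ml_reviewer": {"read_training_data"},
    "intern": set(),
}

def can_access(role: str, permission: str) -> bool:
    """Deny by default: unknown roles or permissions get no access."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(can_access("ml_reviewer", "read_training_data"))   # True
print(can_access("ml_reviewer", "write_training_data"))  # False
```

The deny-by-default shape matters here: a role missing from the mapping gets nothing, which is the property that limits the blast radius of a compromised development endpoint.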


NIST Offers Guidance on Measuring and Improving Your Company’s Cybersecurity Program

The publication is designed to be used together with any risk management framework, such as NIST’s Cybersecurity Framework or Risk Management Framework. It is intended to help organizations move from general statements about risk level toward a more coherent picture founded on hard data. “Everyone manages risk, but many organizations tend to use qualitative descriptions of their risk level, using ideas like stoplight colors or five-point scales,” said NIST’s Katherine Schroeder, one of the publication’s authors. “Our goal is to help people communicate with data instead of vague concepts.” Achieving this goal, according to the authors, involves moving from qualitative descriptions of risk — perhaps using broad categories such as high, medium or low risk level — to quantitative ones that carry less ambiguity and subjectivity. An example of the latter would be a statement that 98% of authorized system user accounts belong to current employees and 2% belong to former employees. The team developed the new draft guidance partly in response to public requests and feedback from a pre-draft call for comment. 
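The article's own example of a quantitative statement — the share of authorized accounts belonging to current employees — is simple enough to compute directly. The account records below are fabricated to reproduce the 98%/2% split; a real metric would be fed from an identity-management export.

```python
# Turning a qualitative risk claim ("account hygiene is good") into a
# number, using the article's example metric. Data is fabricated.

def account_hygiene_pct(accounts: list[dict]) -> float:
    """Percentage of authorized accounts held by current employees."""
    current = sum(1 for a in accounts if a["employee_status"] == "current")
    return 100.0 * current / len(accounts)

accounts = [{"employee_status": "current"}] * 49 + [{"employee_status": "former"}]
print(account_hygiene_pct(accounts))  # 98.0
```

A figure like "98.0% current, 2.0% former" carries far less ambiguity than a stoplight color, which is exactly the shift the NIST guidance is after.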


What is credential stuffing and how can I protect myself? A cybersecurity researcher explains

Credential stuffing is a type of cyber attack where hackers use stolen usernames and passwords to gain unauthorised access to other online accounts. In other words, they steal a set of login details for one site, and try it on another site to see if it works there too. This is possible because many people use the same username and password combination across multiple websites. It is common for people to use the same password for multiple accounts (even though this is very risky). Some even use the same password for all their accounts. This means if one account is compromised, hackers can potentially access many (or all) their other accounts with the same credentials. ... The best way is to never reuse passwords across multiple sites or apps. Always use a unique and strong password for each online account. Choose a password or pass phrase that is at least 12 characters long, is complex, and hard to guess. It should include a mix of uppercase and lowercase letters, numbers, and symbols. Don’t use pet names, birthdays or anything else that can be found on social media. You can use a password manager to generate unique passwords for all your accounts and store them securely. 
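The unique, 12-plus-character passwords recommended above are exactly what password managers generate. As a sketch, Python's standard-library `secrets` module (designed for security-sensitive randomness, unlike `random`) can produce one; the length and character-class policy below mirror the article's advice.

```python
# Generate a strong random password with at least one character from
# each class, per the article's recommendation.
import secrets
import string

def generate_password(length: int = 16) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    while True:
        pw = "".join(secrets.choice(alphabet) for _ in range(length))
        # Retry until the password contains all four character classes.
        if (any(c.islower() for c in pw) and any(c.isupper() for c in pw)
                and any(c.isdigit() for c in pw)
                and any(c in string.punctuation for c in pw)):
            return pw

print(len(generate_password()))  # 16
```

The key defense against credential stuffing is not the strength of any one password but its uniqueness: a generated password leaked from one site unlocks nothing else.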


54% of data fiduciaries lack experience in enforcing data protection laws

The research findings are based on the provisions of India's Digital Personal Data Protection (DPDP) Act that was enacted in August 2023. The rules for the Act are yet to be released for public consultation. The findings are part of the research carried out by the think tank Esya Centre in a report called "An Empirical Evaluation of the Implementation Challenges of the Digital Personal Data Protection Act 2023: Insights and Recommendations for the Way Forward." It has involved insights from 16 industry stakeholders, of which 13 are data fiduciaries and three are experts. "India has come a long way from the early iterations of the Data Protection Bill to the enactment of the Digital Personal Data Protection Act, 2023. The decision to eschew localization requirements and a compliance-heavy framework heralds a commitment to a progressive framework. It is now time to ensure that the prospective rules maintain the forward-thinking approach underpinning the parent Act and preserve a compliance-light data protection regime in the country," said Meghna Bal, Head of Research, Esya Centre.


Navigating The 'Fog Of A Cyberattack': Critical Lessons In Governance From The SEC Cybersecurity Rule

The short breach notification timeline attached to the SEC’s new cybersecurity disclosure rule is loud and clear: C-Suite leaders and boards have important work to do in ensuring their organizations can quickly identify, understand and publicly disclose material cybersecurity events and impacts. In this case, the expression “fog of war” is a useful analogy for understanding a critical complication. The term recognizes that many factors on which action in war is based are “wrapped in a fog of greater or lesser uncertainty.” The fog of a cyber event will similarly make the four-business-day timeline incredibly challenging. ... Instead of making battle plans mid-crisis, prepare now, establishing how incidents are identified, how reports get written and who’s responsible for determining materiality. Create rough boundaries for evaluating materiality (e.g., questions to ask, example incidents) to make decisions as clear as possible. Incomplete information is better than no information. You may not have a complete picture to share publicly, and that’s okay. But when you do your initial disclosure, establish when your next update will be shipped. 



Quote for the day:

“The more you lose yourself in something bigger than yourself, the more energy you will have.” -- Norman Vincent Peale

Daily Tech Digest - January 18, 2024

A tougher balancing act in 2024, the year of the CISO

What’s making things more difficult for CISOs? The ESG/ISSA data indicates that business aspects of running a cybersecurity program like working with the board, overseeing regulatory compliance, and managing a budget are primary contributing factors. This makes sense as the CISO role has evolved from technical overseer to business executive over the past few years. At the same time, organizations have increased their dependence on IT for automation, optimization, customer service, and digital transformation. ... Like their non-CISO colleagues, CISOs are particularly stressed by things like an overwhelming workload, working with disinterested business managers, and keeping up with the security requirements of new business initiatives. It’s worth noting that 26% of CISOs are also stressed about monitoring the security status of third parties their organization does business with as compared with 12% of non-CISOs. Third-party relationships are often associated with business processes and therefore tied closely with business units. Unfortunately, security teams probably don’t have deep visibility into the day-to-day security performance at these firms. 


How do agile and DevOps interrelate?

Agile and DevOps have much in common. In fact, DevOps grew as an offshoot (or improvement) of agile, as many industry leaders found dysfunction in IT and software development. While incremental improvements support quality products, the competing objectives of individual IT workers lowered overall performance. To remedy the problem, developers proposed the more integrated approach of DevOps. Of course, the new philosophy offered different core values, which caused a split between two opposing communities. Developers grappled with what looked like conflicting philosophies, leading to the most common misconception about agile and DevOps: that they don’t interrelate. On the surface, the pundits have much to draw on. DevOps engineers focus on software scalability, speed, and team integration. Agile focuses on the slower, iterative process of software development, with more emphasis on continuous testing. More importantly, Agile silos individuals while DevOps integrates. Without the operability of DevOps, infrastructure responsibility falls to the wayside. But without the basic building blocks of the incremental, customer-focused method of agile, DevOps has no fundamental processes on which to stand.


AI Fraud Act Could Outlaw Parodies, Political Cartoons, and More

So just how broad is this bill? For starters, it applies to the voices and depictions of all human beings "living or dead." And it defines digital depiction as any "replica, imitation, or approximation of the likeness of an individual that is created or altered in whole or part using digital technology." Likeness means any "actual or simulated image… regardless of the means of creation, that is readily identifiable as the individual." Digital voice replica is defined as any "audio rendering that is created or altered in whole or part using digital technology and is fixed in a sound recording or audiovisual work which includes replications, imitations, or approximations of an individual that the individual did not actually perform." This includes "the actual voice or a simulation of the voice of an individual, whether recorded or generated by computer, artificial intelligence, algorithm, or other digital means, technology, service, or device." These definitions go way beyond using AI to create a fraudulent ad endorsement or musical recording. They're broad enough to include reenactments in a true-crime show, a parody TikTok account, or depictions of a historical figure in a movie.


Navigating digital transformation in insurance sector: Challenges, opportunities and innovations

Notable developments are the changes regulators have introduced in cybersecurity, information security management, connectivity management with websites, vendors, and employees, and the digital transformation they have pushed us toward. Today everything is handled digitally: an employee meets a customer and the customer fills in the form digitally; there is little mechanical form-filling, although that practice still survives in many parts of the country and in many companies. That said, digital adoption has risen as a percentage. When things are handled digitally, employees are forced to think through what controls they could have. For instance, we have introduced OTPs, so at each stage the customer has to stop and respond to an OTP prompt. For a policy I bought last week, I had to complete six OTPs with that company. I wondered why so many OTPs were required, but when I looked at the way the salesperson handled the process, it was quite effective and efficient, and at the same time it is all for the safety of the customer; that thought process is given to the customer.


Productivity Paradox: Productivity in the Age of Knowledge Work

We sometimes forget that every employee within an enterprise is, at their core, also a consumer. Their personal preferences, shaped by daily interactions with personal technology, inevitably spill over into their professional lives. Consequently, the ubiquity of Macs, iPhones and iPads in the consumer market has sparked a growing demand for these devices in the workplace. This shift has only been hastened by the “Bring Your Own Device” (BYOD) movement, wherein employees sought to use their trusted personal devices for professional tasks, yearning for the familiarity and ease of use they’ve grown accustomed to. Instead of resisting this tide, more and more forward-thinking enterprises are instead leaning in. For one, IT leaders have recognized that specific hardware platforms matter less these days as they shift more of their applications to the public cloud. Reliability is another major factor, especially for remote employees who don’t have an IT helpdesk at their disposal. And when our survey respondents were asked if they agreed or disagreed with the statement “Apple takes enterprise security, compliance, and privacy concerns more seriously than other vendors,” three-quarters of them concurred.


6 hot networking and data center skills for 2024

“The current rage in AI technology is more than just a fad,” Leary says. “It is delivering real measurable benefits to IT organizations and the businesses they serve. And there is no sign of slowdown on the horizon. Driven by pressing capacities and costs, physical data center designs are changing significantly” with generative AI. Organizations need IT staffers who can help assure that generative AI is provided the data, data processing, and data exchanges needed to deliver on its promise, Leary says. ... Knowledge about cyber security products and services—as well as the threats they guard against—never go out of fashion. Organizations are facing a barrage of threats against their networks and data centers, so finding people with related skills remains a high priority. “Companies are constantly having to pay attention to their security as more and more cyber attacks happen,” Vick says. “That is something that is not slowing down, so they are having to update their firewalls and other security features.” Enterprises are building out their security teams, in some cases looking for people to update their security posture with a variety of technologies.


Navigating data management modernisation to deliver the AI-ready tech stack of the future

Forward-thinking IT leaders already see a direct correlation between modernising the data management journey across the entire tech stack and facilitating the extraction of value from data at the speed and scale needed for real-time intelligence. The same Alteryx research suggests that digital transformation relating to AI and machine learning will be the number one characteristic of the future enterprise, and tech stack priorities are already shifting to reflect this. Generative AI, quantum computing and machine learning operations (MLOps) are cited as the technologies most likely to see the largest shift in accelerated adoption in the future. While it only seems a short time since the pandemic forced many organisations to accelerate transformation at breakneck speeds, the rise of AI-related technologies will reinfuse these transformations. Why? Because AI has lowered the barrier to delivering productivity gains by delivering data-driven insights with just a sentence or a prompt. With countless data-oriented AI technologies and intelligent systems already available, the ultimate goal of this transformation is to modernise the data management journey across the entire stack.


Continuous Quality Assurance: Strategizing Automated Regression Testing for Codebase Resilience

In QA software testing, automated regression processes can autonomously identify unexpected behaviors or regressions in the software. ... End-users anticipate a consistent and dependable performance from software, recognizing that any disruptions or failures can profoundly affect their productivity and overall user experience. The implementation of regression testing proves invaluable in identifying unintended consequences, validating bug fixes, upholding consistency across versions, and securing the success of continuous deployment. Through early identification and resolution of regressions, development teams can proactively safeguard against issues reaching end-users, thereby preserving the quality and reliability of their software. ... Automated regression testing can be strategized based on the complexity of the codebase, with approaches such as retesting everything, selective re-testing, and prioritized re-testing. Tools such as Functionize, Selenium, Watir, Sahi Pro, and IBM Rational Functional Tester can be used to automate regression testing and improve efficiency.
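The "prioritized re-testing" approach mentioned above can be sketched with a simple scoring pass: run the tests most likely to catch a regression first. The weighting scheme, field names, and suite below are invented for illustration; real tools derive these signals from coverage data and test history.

```python
# Sketch of prioritized regression-test ordering. Scores blend how often
# a test has failed recently with the risk weight of the code it covers;
# the 0.6/0.4 weights are an arbitrary illustrative choice.

def prioritize(tests: list[dict]) -> list[str]:
    """Return test names, highest-risk first (inputs in [0, 1])."""
    scored = sorted(
        tests,
        key=lambda t: t["failure_rate"] * 0.6 + t["risk_weight"] * 0.4,
        reverse=True,
    )
    return [t["name"] for t in scored]

suite = [
    {"name": "test_login", "failure_rate": 0.3, "risk_weight": 0.9},
    {"name": "test_footer", "failure_rate": 0.0, "risk_weight": 0.1},
    {"name": "test_payment", "failure_rate": 0.5, "risk_weight": 1.0},
]
print(prioritize(suite))  # ['test_payment', 'test_login', 'test_footer']
```

Under a tight CI time budget, running the top of this ordering gives most of the regression-catching value of a full retest-everything pass at a fraction of the cost.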


Sustainable Partnerships Pay Dividends

The first step in establishing highly effective partnerships, von Koeller says, is to identify what an organization hopes to achieve and establish a roadmap for meeting specific goals and metrics. This requires a focus on shared values and coordination across departments and groups. “You have to understand your footprint, understand what environmental impact an action has, and how to engage suppliers to achieve alignment around your targets,” she explains. Open and honest communication among partners is vital. Too often, larger companies fail to understand what suppliers can do and what they can’t do, particularly when they’re located in faraway countries, or a supply chain has numerous layers. Partners downstream and upstream face their own set of challenges -- environmental, political, and practical -- that can make it difficult to conform to strict standards. A particularly daunting aspect -- especially for smaller firms supplying raw materials or specialized components -- is onerous data collection and reporting requirements, Linich notes. As a result, smaller companies may require funding and technical assistance from larger partners, including aid in setting up software and IT systems that support sustainability.


Get started with Anaconda Python

The Anaconda distribution is a repackaging of Python aimed at developers who use Python for data science. It provides a management GUI, a slew of scientifically oriented work environments, and tools to simplify the process of using Python for data crunching. It can also be used as a general replacement for the standard Python distribution, but only if you’re conscious of how and why it differs from the stock version of Python. ... The most noticeable thing Anaconda adds to the experience of working with Python is a GUI, the Anaconda Navigator. It is not an IDE, and it doesn’t try to be one, because most Python-aware IDEs can register and use the Anaconda Python runtime themselves. Instead, the Navigator is an organizational system for the larger pieces in Anaconda. With the Navigator, you can add and launch high-level applications like RStudio or Jupyterlab; manage virtual environments and packages; set up “projects” as a way to manage work in Anaconda; and perform various administrative functions. Although the Navigator provides the convenience of a GUI, it doesn’t replace any command-line functionality in Anaconda, or in Python generally. 
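Because Anaconda can stand in for the stock interpreter, it is easy to end up unsure which Python an IDE or script is actually using. A small stdlib check clarifies this; note that `CONDA_DEFAULT_ENV` is set by conda on activation, and its absence (shown here with a fallback string) suggests a non-conda interpreter.

```python
# Report which Python environment is active. Useful when an IDE may be
# pointing at the stock interpreter instead of an Anaconda one.
import os
import sys

def describe_environment() -> dict:
    return {
        "interpreter": sys.executable,
        "prefix": sys.prefix,
        # Conda exports this variable when an environment is activated.
        "conda_env": os.environ.get("CONDA_DEFAULT_ENV", "not in a conda env"),
    }

info = describe_environment()
print(info["conda_env"])
```

Running this inside an activated conda environment prints the environment's name; under a stock install it prints the fallback string, which is usually the first clue that a package "mysteriously missing" was installed into a different environment.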



Quote for the day:

"Leaders are more powerful role models when they learn than when they teach." -- Rosabeth Moss Kantor

Daily Tech Digest - January 17, 2024

Improving Supply Chain Security, Resiliency

Regulatory compliance plays a vital role in how cybersecurity strategies are built: Compliance mandates like GDPR and the NIST Cybersecurity Framework provide foundations for data protection, access control, and incident response. “With these baselines in place, organizations can ensure that there is a certain level of security across all supply chain partners, which reduces the overall risk landscape,” Bachwani says. “Compliance also fosters a culture of security, which drives continuous improvement.” He adds that the pressure to meet regulatory standards necessitates ongoing risk assessments, proactive risk management practices, and regular vulnerability patching, which prioritizes cybersecurity in decision-making. “Regulatory frameworks often come with heavy fines and reputational damage for those who do not comply,” Bachwani notes. “This incentivizes everyone within the supply chain to prioritize cybersecurity and invest in robust safeguards.” Christopher Warner, senior security consultant at GuidePoint Security, says regulatory frameworks often specify security controls and standards that organizations must follow.


Quantum entanglement discovery is a revolutionary step forward

This discovery opens the door to new quantum communication protocols, utilizing topology as a medium for quantum information processing. Such protocols could revolutionize how we encode and transmit information in quantum systems, especially in scenarios where traditional encoding methods fail due to minimal entanglement. In summary, the significance of this research lies in its potential for practical applications. For decades, preserving entangled states has been a major challenge. The team’s findings suggest that topology can remain intact even as entanglement decays, offering a novel encoding mechanism for quantum systems. Professor Forbes concludes with a forward-looking statement, saying, “We are now poised to define new protocols and explore the vast landscape of topological nonlocal quantum states, potentially revolutionizing how we approach quantum communication and information processing.” ... It’s a physical process where pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently of the state of the others, even when the particles are separated by a large distance.
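The non-separability described in that last sentence can be made concrete with the standard two-qubit Bell state, written here in conventional bra-ket notation as a minimal illustration (not drawn from the paper itself): the joint state cannot be factored into independent single-particle states.

```latex
\left|\Phi^{+}\right\rangle \;=\; \frac{1}{\sqrt{2}}\Bigl(\lvert 0\rangle_{A}\lvert 0\rangle_{B} \;+\; \lvert 1\rangle_{A}\lvert 1\rangle_{B}\Bigr)
\;\neq\; \lvert\psi\rangle_{A} \otimes \lvert\psi\rangle_{B}
```

Measuring particle A immediately fixes the outcome for particle B, however far apart they are, which is why no independent description of either particle exists.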


Staffing levels: are data centers at risk of unnecessary outages?

As for whether there were sufficient staff onsite during the Microsoft outage, and what should be the optimal number of staff present, John Booth, Managing Director of Carbon3IT Ltd, and Chair of the Energy Efficiency Group of the Data Centre Alliance, says it very much depends on the design and scale of the data center, as well as on the level of automation for monitoring and maintenance. Data centers are also often reliant on outsourced personnel for specific maintenance and emergency tasks and offer a 4-hour response. Beyond this, he suggests there is a need for more information to determine whether 7 staff were sufficient but admits that 3 members of staff are usually the norm for a night shift, “with perhaps more during the day depending on the rate of churn of equipment.” Davis adds that there is no reliable rule of thumb because each and every organization and site is different. However, there are generally accepted staff calculation techniques that can determine the right staffing levels for a particular data center site. As for the Microsoft incident, he’d need to formally do the calculations to decide whether 3 or 7 technicians were sufficient. It’s otherwise just a guess.


Projecting 2024 Cybertrends and C-Suite Responsibilities

Organizations must comply with various regulations and standards, such as the EU General Data Protection Regulation (GDPR), the US State of California Consumer Privacy Act (CCPA), the Payment Card Industry Data Security Standard (PCI DSS), and the US Health Insurance Portability and Accountability Act (HIPAA). Non-compliance can result in fines, legal action, or reputational damage. Compliance can be achieved if C-suite executives establish a compliance framework that requires them to assess and monitor their compliance status and implement necessary policies and procedures. They should also stay up to date on the changing regulatory and compliance landscape and engage with regulators and policymakers. ... The persistent cybersecurity skills gap is the shortage of qualified and experienced cybersecurity professionals on the job market. The cybersecurity skills gap can affect the ability of organizations to prevent, detect, and respond to cyberthreats. To help fill the skills gap, C-level executives should invest in the recruitment, retention, and development of their cybersecurity talent, and offer competitive compensation and benefits.


Here’s what you should look for in an OKR Management Tool

Communication is central to ensuring the success of any goal-setting framework. Make sure the technology you are leveraging allows the capturing of feedback, thoughts and comments on an ongoing basis. Using Keka’s OKR tool, teams can engage in meaningful discussions, share insights, and offer feedback directly on objectives and key results, fostering a culture of transparency and continuous improvement via the comments and 1 on 1 meeting feature. This functionality also empowers teams to set their own aligned goals, tailoring objectives to their unique strengths and challenges while still contributing to the larger organisational mission. ... Reminders about OKRs are highly advantageous as they keep objectives and key results at the forefront of individuals' and teams' attention, minimising the risk of goals becoming overlooked or forgotten during daily tasks. These reminders serve as nudges, encouraging consistent progress tracking, timely updates, and proactive adjustments. By maintaining goal visibility and urgency, this feature ensures that teams stay on track, deadlines are met, and alignment with broader strategic objectives remains strong, ultimately driving improved goal achievement and organisational success.


The CISO’s guide to accelerating quantum-safe readiness

With a dynamic perspective of their enterprise-wide cryptographic usage, CISOs can begin the work of cybersecurity risk assessments. This step involves working with cybersecurity and privacy managers to prioritize sensitive and critical data sets most at risk from “harvest now, decrypt later” attacks and with the highest business value and impact. To translate these insights into a quantum-safe strategy, security leaders should evaluate the business relevance in relation to the complexity of mitigation for specific assets so that they can plan their quantum-safe transition in a way that optimizes performance, compatibility and ease of integration. ... The final step in the journey to quantum-safe security is the transformation of cryptographic infrastructure to incorporate quantum-resistant cryptography. Before deploying quantum-safe solutions to their stack, security leaders should equip their teams with the tools and education to test the new cryptographic protocols and evaluate the potential impact on systems and performance. Quantum-safe solutions that can be updated without having to overhaul their cybersecurity infrastructure will help CISOs establish crypto-agility and ensure they can proactively and seamlessly address potential quantum vulnerabilities.
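The enterprise-wide cryptographic inventory that this journey starts from can be approximated, at its crudest, by scanning source for algorithm names. The sketch below is a toy text scan only; real discovery tooling also inspects binaries, certificates, key stores, and network traffic, and the algorithm list here is a small illustrative subset of quantum-vulnerable public-key schemes.

```python
# Toy cryptographic-inventory pass: flag mentions of quantum-vulnerable
# public-key algorithms in source text for migration planning.
import re

QUANTUM_VULNERABLE = re.compile(r"\b(RSA|ECDSA|ECDH|DSA|DH)\b", re.IGNORECASE)

def flag_crypto_usage(source: str) -> list[str]:
    """Return the quantum-vulnerable algorithm names found in a string."""
    return sorted({m.group(1).upper() for m in QUANTUM_VULNERABLE.finditer(source)})

code = "key = rsa.generate_private_key(...)  # TODO: migrate from ECDH"
print(flag_crypto_usage(code))  # ['ECDH', 'RSA']
```

Even this crude pass illustrates the point of the inventory step: you cannot prioritize "harvest now, decrypt later" exposure until you know where the vulnerable primitives live.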


Magic Keyboard vulnerability allows takeover of iOS, Android, Linux, and MacOS devices

“The user does not have to have a keyboard paired with their phone already. And as long as Bluetooth is enabled on the Android device, at any time the phone is on them, and Bluetooth is on, the attacker can then force pair an emulated keyboard with the Android device and inject keystrokes, including at the lock screen.” Newlin then turned to Linux. “It turns out that the Linux attack is very, very similar,” he said. “On Linux, as long as the host is discoverable and connectable over Bluetooth, the attacker can force-pair a keyboard and inject keystrokes without the user’s confirmation. And so, this is distinct from Android in that, on Linux, the device has to be not only connectable but also discoverable for the attack to work.” Linux fixed this bug in 2020 but left the fix disabled by default. ... Newlin encourages security researchers to continue probing Bluetooth flaws. “I think it’ll probably be a while [before the full extent of Bluetooth flaws is known] because it will take the community actually fleshing these out and identifying all the additional affected systems beyond what I’ve seen myself,” he said.


How Edge Analytics Can Deliver the Competitive Edge Your Business Needs

Traditional data analytics models struggle to keep up with the volume of data being generated, and they are no match for today’s data velocity. As the speed at which data is created continues to grow, there will be an even greater need for real-time processing. The interpretation and application of real-time analytics can vary based on the specific industry and its requirements. Real-time analytics is a broad concept that is adapted to suit the needs of different industries and sectors. ... Because it addresses these traditional data analytics challenges, edge analytics is becoming more prominent. It’s a natural progression -- taking data and business where they need to go now. ... Businesses can move faster with edge analytics because of its reduced latency. This is possible because edge analytics processes data closer to where it was generated, so organizations get data insights quicker. Reduced latency is particularly critical for applications that require real-time responses, such as battlefield scenarios, fraud detection, and supply chain management. Because edge analytics reduces the data load on the network, it also saves energy, reduces carbon emissions, and helps organizations meet their sustainability goals to protect the planet.


How OpenAI plans to handle genAI election fears

For its part, OpenAI said ChatGPT will redirect users to CanIVote.org for specific election-related queries. The company is also focusing on enhancing the transparency of AI-generated images using its DALL-E technology with plans to incorporate a "cr" icon on such photos, signaling they are AI-generated. The company also plans to enhance its ChatGPT platform by integrating it with real-time global news reporting, including proper attribution and links. The news initiative is an expansion of an agreement made last year with the German media conglomerate Axel Springer. Under that deal, ChatGPT users gain access to summarized versions of select global news content from Axel Springer's various media channels. ... There's no universal rule for how genAI should be used in politics. Last year, Meta declared it would prohibit political campaigns from using genAI tools in their advertising and mandate that politicians reveal any such use in their ads. Similarly, YouTube said all content creators must disclose whether their videos contain "realistic" but altered media, including those created with AI.


Storytelling for CIOs: From niche to bestseller

“For a CIO, or anyone in a senior position with responsibility for data, the best way to succeed is to make projects come to life,” says Caroline Carruthers, formerly a pioneering chief data officer at Network Rail, which manages train stations and infrastructure in the UK, and now CEO of data consultancy Carruthers and Jackson. “You can give people all the dashboards, charts and figures in the world, but it’s when you help them understand the thinking behind what you do and bring it to life that you get the buy-in you need.” Often, CIOs use stories as a form of Esperanto or a translation layer. “I always find there’s benefit in using a story to help my audience understand what can sometimes be very technical concepts that I’m trying to communicate to non-technical people,” says Adam Miller, CIO of UK insurer, Markerstudy Group. “Get the story right, then people understand the plan and you’ve a much better chance of them buying in. I also find that a good story is just as important for highlighting the impact of inaction too, which can often be the easiest option for people to take.”



Quote for the day:

"Leadership Seductions are behaviors or attitudes in which we become 'stuck'" -- Catherine Robinson-Walker

Daily Tech Digest - January 16, 2024

Why Pre-Skilling, Not Reskilling, Is The Secret To Better Employment Pipelines

In a landscape where the relevance of skills evolves, Zaslavski says that organizations should focus on selecting and advancing individuals based on their potential for learning skills like critical thinking and resiliency, instead of focusing on hard skills like coding. ... “By concentrating on these fundamental elements, as opposed to current technical proficiency or past work history, organizations position themselves with an agile and future-ready workforce. In this light, pre-skilling should be an integral part of employers’ talent strategy pre- and post-hiring, from sourcing and recruiting to career pathing and employee engagement.” ... She points to areas like understanding if a potential or existing employee has the EQ and social skills needed to perform as part of a group. Or whether they have the curiosity and analytical intelligence needed to learn new hard skills as well as the ambition and work ethic to achieve results. “When people have learning ability, drive, and people skills, they will probably develop new skills faster than others,” she says.


Agile is a concept we all continuously talk about, but what is it really?

Empiricism, teams, user stories, iterations; they are all examples of tools that we use in Agile, but they are not its purpose. Agile is about empowering people to take control of their environment and give them complete freedom to discover how to use available tools in the most effective way. And this applies to the why too. People adopt Agile to increase efficiency, transparency, velocity, predictability, quality. But again all these are a result of Agile, not its goal. It is the mindset that makes it all possible. That is why it is “Individuals and interactions over processes and tools”. To illustrate this, think about empiricism itself. Try introducing empiricism into an organisation mired in a culture of fear and control, and it doesn’t work, no matter what you do. You can’t force empiricism. People are too busy evading blame and manipulating information. Think about it, how often do people complain that the retrospective doesn’t deliver anything? Retrospectives where people just complain and nothing changes?


What Will It Take to Adopt Secure by Design Principles?

What does the future of secure by design adoption look like? CISA is continuing its work alongside industry partners. “Part of our strategy is to collect data on attacks and understand what that data is telling us about risk and impact and derive further best practices and work with companies, and really other nations, to adopt these principles,” Zabierek shares. International collaboration on secure by design is reflected not only in this CISA initiative but also the Guidelines for Secure AI System Development. CISA and the UK’s National Cyber Security Centre (NCSC) led the development of those guidelines, and 16 other countries have agreed to them. But like the Secure by Design initiative, this framework is also non-binding. A software manufacturer’s timeline for adopting secure by design principles will depend on its appetite, resources and the complexity of its products. But the more demand from government and consumers, the more likely adoption will happen. Right now, CISA has no plans to track adoption. “We're more focused on collaborating with industry so that we can understand best practices and recommend further better guidelines,” says Zabierek.


Mastering the art of motivation

Once you’ve helped employees connect their dots, the best way to further motivate them is also the cheapest and easiest, and has the fewest unintended consequences. Compliment them on a job well done, whenever they’ve done a job well enough to be worth noting. Sure, there are wrong ways to use compliments as motivators. First and foremost, the employee you’re complimenting must value your opinion. If they don’t, they’ll write off your compliment as just so much noise. Second, a compliment from you should not be an easy compliment to earn. “I really like your belt,” isn’t going to inspire someone to work inventively and late. Third, with few exceptions compliments should be public. There’s little reason for you to be embarrassed about being pleased with someone’s efforts. With one caveat: Usually you’ll have one or two in your organization who routinely perform exceptionally well, but also one or two who are plodders — good enough and steady enough to keep around; not good enough or steady enough to earn your praise. Find a way to compliment them in public anyway — perhaps because you prize their reliability and lack of temperament.


Do you need GPUs for generative AI systems?

GPUs greatly enhance performance, but they do so at a significant cost. Also, for those of you tracking carbon points, GPUs consume notable amounts of electricity and generate considerable heat. Do the performance gains justify the cost? CPUs are the most common type of processors in computers. They are everywhere, including in whatever you’re using to read this article. CPUs can perform a wide variety of tasks, and they have a smaller number of cores compared to GPUs. However, they have sophisticated control units and can execute a wide range of instructions. This versatility means they can handle AI workloads, such as use cases that need to leverage any kind of AI, including generative AI. CPUs can prototype new neural network architectures or test algorithms. They can be adequate for running smaller or less complex models. This is what many businesses are building right now (and will be for some time) and CPUs are sufficient for the use cases I’m currently hearing about. CPUs are more cost-effective in terms of initial investment and power consumption for smaller organizations or individuals who have limited resources. 


How to create an AI team and train your other workers

Building a genAI team requires a holistic approach, according to Jayaprakash Nair, head of Machine Learning, AI and Visualization at Altimetrik, a digital engineering services provider. To reduce the risk of failure, organizations should begin by setting the foundation for quality data, establish “a single source of truth strategy,” and define business objectives. Building a team that includes diverse roles such as data scientists, machine learning engineers, data engineers, domain experts, project managers, and ethicists/legal advisors is also critical, he said. “Each role will contribute unique expertise and perspectives, which is essential for effective and responsible implementation,” Nair said. "Management must work to foster collaboration among these roles, help align each function with business goals, and also incorporate ethical and legal guidance to ensure that projects adhere to industry guidelines and regulations." ... It's also important to look for people who like learning new technology, have a good business sense, and understand how the technology can benefit the company.


Data is the missing piece of the AI puzzle. Here's how to fill the gap

Companies looking to make progress in AI, says Labovich, must "strike a balance and acknowledge the significant role of unstructured data in the advancement of gen AI." Sharma agrees with these sentiments: "It is not necessarily true that organizations must use gen AI on top of structured data to solve highly complex problems. Oftentimes the simplest applications can lead to the greatest savings in terms of efficiency." The wide variety of data that AI requires can be a vexing piece of the puzzle. For example, data at the edge is becoming a major source for large language models and repositories. "There will be significant growth of data at the edge as AI continues to evolve and organizations continue to innovate around their digital transformation to grow revenue and profits," says Bruce Kornfeld, chief marketing and product officer at StorMagic. Currently, he continues, "there is too much data in too many different formats, which is causing an influx of internal strife as companies struggle to determine what is business-critical versus what can be archived or removed from their data sets."


3 ways to combat rising OAuth SaaS attacks

At their core, OAuth integrations are cloud apps that can access data on behalf of a user, with a defined permission set. When a Microsoft 365 user installs a MailMerge app to their Word, for example, they have essentially created a service principal for the app and granted it an extensive permission set with read/write access, the ability to save and delete files, as well as the ability to access multiple documents to facilitate the mail merge. The organization needs to implement an application control process for OAuth apps and determine if the application, like in the example above, is approved or not. ... Security teams should view user security through two separate lenses. The first is the way they access the applications. Apps should be configured to require multi-factor authentication (MFA) and single sign-on (SSO). ... Automated tools should scan the logs and report whenever an OAuth-integrated application is acting suspiciously. For example, applications that display unusual access patterns or geographical abnormalities should be regarded as suspicious. 
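The log-scanning rule described above amounts to comparing each app's access events against a recorded baseline. A minimal sketch, assuming a hypothetical event shape of `(app_id, country_code)` tuples parsed from audit logs:

```python
def flag_suspicious_apps(events, known_geos):
    """Flag OAuth apps whose access events fall outside their geographic baseline.

    events: iterable of (app_id, country_code) tuples from the audit log.
    known_geos: mapping of app_id -> set of country codes observed during a
                trusted baseline window. Apps with no recorded baseline are
                flagged on any access, since they bypassed the control process.
    """
    flagged = set()
    for app_id, country in events:
        if country not in known_geos.get(app_id, set()):
            flagged.add(app_id)
    return flagged
```

A real detector would also score unusual access volume and timing, but the structure is the same: maintain a per-app baseline, and alert on deviation.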


Cloud cost optimisation: Strategies for managing cloud expenses and maximising ROI

Instead of employing manual resources, streamlining cloud optimisation through automation could bring enhanced resource savings to the table. The auto-scaling program offered by Amazon Web Services (AWS) is a shining example of how firms can effectively streamline their cloud optimisation in a short time. The program also enables swift optimisation in response to the changing resource requirements of systems and servers. ... At the planning stage, firms need to justify the cloud budget and ensure that unexpected spending is reduced to the minimum. The same approach has to be followed in the building, deployment, and control phases so that any unexpected rise in budgets can be adjusted promptly without throwing financial controls into a tizzy. All these steps will help organisations develop a culture of cost-conscious cloud adoption and help them perform optimally while keeping costs in check. ... Incorporating cloud cost optimisation tools is a strategic approach for organisations to streamline expenditures and enhance ROI.


Pull Requests and Tech Debt

The biggest disadvantage of pull requests is understanding the context of the change, whether technical or business: you see what has changed without necessarily understanding why the change occurred. Almost universally, engineers review pull requests in the browser and do their best to understand what’s happening, relying on their understanding of tech stack, architecture, business domains, etc. While some have the background necessary to mentally grasp the overall impact of the change, for others, it’s guesswork, assumptions, and leaps of faith ... which only gets worse as the complexity and size of the pull request increases. [Recently a friend said he reviewed all pull requests in his IDE, greatly surprising me: first I’ve heard of such diligence. While noble, that thoroughness becomes a substantial time commitment unless that’s your primary responsibility. Only when absolutely necessary do I do this. Not sure how he pulls it off!] Other than those good samaritans, mostly what you’re doing is static code analysis: within the change in front of you, what has changed, and does it make sense? You can look for similar changes, emerging patterns that might drive refactoring, best practices, or signs that others are making similar changes.



Quote for the day:

"All leadership takes place through the communication of ideas to the minds of others." -- Charles Cooley

Daily Tech Digest - January 15, 2024

Authentication is more complicated than ever

Even if posture is improved and stronger forms of MFA are invoked at login, attackers will constantly be looking for new holes to exploit. Therefore, it's important to put in place detection logic and checks for compromise. Ideally, detections should target known attack techniques, but also leverage ML/AI algorithms to detect anomalous or novel suspicious behavior. For example, knowing historical access patterns can highlight when credentials suddenly attempt access from a new device or location. Put differently, authentication can no longer be only about authentication. The decision to validate a credential must be more than a question of the right password and MFA. It must include the context and conditions of the request, checked and confirmed by policy each time. When identity-based attacks are detected, automated responses should be invoked. This can mean stepping up authentication requirements, revoking access, quarantining an identity until the situation is resolved, or executing more complex responses.
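The historical-access-pattern check described above can be sketched as a toy detector; real systems would score many more signals than device and location, but the shape is the same (the class and field names here are illustrative):

```python
from collections import defaultdict

class LoginAnomalyDetector:
    """Flags logins from a (device, location) pair never seen for that identity."""

    def __init__(self):
        # identity -> set of (device, location) pairs from confirmed-good logins
        self._history = defaultdict(set)

    def observe(self, identity, device, location):
        """Record a confirmed-legitimate login for future comparison."""
        self._history[identity].add((device, location))

    def is_anomalous(self, identity, device, location):
        """True when the identity has history and this pair is new to it.

        Identities with no history cannot be judged yet, so they are not flagged.
        """
        seen = self._history[identity]
        return bool(seen) and (device, location) not in seen
```

In a deployed system, an anomalous verdict would trigger the automated responses the article lists: step-up authentication, access revocation, or quarantine of the identity.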


The Importance of Human-centered AI

Creating a functional and reliable AI requires a combination of domain and data science expertise with design acumen. Domain experts are particularly important when developing AI for the legal sector, as legal operations professionals, attorneys, and others bring highly valuable knowledge when training AI to deliver results for corporate legal departments (CLDs). Data scientists cleanse, analyze, and glean insights from large amounts of data. AI design strategists create systems, design prototypes, and assist in model building, all while focusing on delivering intelligence in a user-centric way. It’s impossible for an AI model to work optimally without all these individuals working together. For instance, a model built just by data scientists might technically work, but it probably won’t be focused on the user or their business needs. Meanwhile, a model created by an AI designer may not have the breadth of insights it could have if a data scientist and domain expert were also involved. It’s this diversity of human talent and perspectives that lays the initial groundwork for everything that organizations want in AI.


Green data centers: efforts to push sustainable IT developments

Modular designs reduce the need for significant infrastructure modifications by enabling the gradual development of data centre capacity. In addition to saving energy, using more energy-efficient servers, storage units, and networking hardware can provide greater scalability by lowering the requirement for extra power and cooling infrastructure. The data centre’s demand for cooling increases with its size, and new technologies are delivering better efficiency and energy savings. Along with this, scaling up without consuming more energy is possible with the use of effective cooling techniques like liquid cooling. Optimising resource utilization and maximising scalability may be achieved by putting into practice effective data centre management techniques like load balancing and resource sharing. Server virtualization maximizes efficiency internally, lowering the requirement for physical equipment and energy usage. Real-time monitoring and modification of energy use is made possible by artificial intelligence and machine learning, which makes infrastructure more adaptable and efficient.


Unravelling the Persistence of Legacy Malware: By Shailendra Shyam Sahasrabudhe

While the term “legacy” may evoke images of outdated systems and forgotten technologies, in the realm of cyber threats, it takes on a more sinister connotation. Legacy malware, often several years old, continues to haunt organizations, primarily due to the shrewd tactics employed by threat actors. Global organizations face a substantial threat due to the lax enforcement of security standards for IoT device manufacturers, exacerbated by the widespread presence of shadow IoT devices within enterprise networks. This significant risk is posed by the targeting of “unmanaged and unpatched” devices by threat actors, who often leverage these vulnerabilities to establish an initial foothold in the targeted environment. These threat actors, operating as de facto businesses, harbour a vested financial interest in extending the shelf life of their malware. This involves the recycling and repackaging of malicious code, coupled with innovative market strategies. Technical manoeuvres such as code recompilation, binary morphing, and the creation of fresh signatures to sidestep traditional antivirus defences are par for the course.


The 3 Paradoxes of Cloud Native Platform Engineering

Given the plethora of DevOps tools on the market, assembling the optimal toolchain can slow everyone down and lead to inconsistent results. The solution: ensure platform engineering teams build an IDP that includes the best set of tools for the tasks at hand. The goal of such a platform is to provide a “golden path” for developers to follow, essentially a recommended set of tools and processes for getting their work done. However, this golden path can become a straitjacket. When this golden path is overly normative, developers will move away from it to get their jobs done, defeating its purpose. As with measuring their productivity, developers want to be able to make their own choices regarding how they go about crafting software. As a result, platform engineers must be especially careful when building IDPs for cloud native development. Jumping to the conclusion that tools and practices that were suitable for other architectural approaches are also appropriate for cloud native can be a big mistake. 


Cloud Computing's Role in Transforming AML and KYC Operations

The biggest advantage is data centralization. Data is not scattered across different systems, which allows compliance investigators to get a holistic view of information about a customer in one place and thereby speed the investigation process and decision-making. Cloud platforms allow for seamless storage at very low cost and also enable organizations with a lot more querying and analytical toolsets. This further aids in the compliance investigation process as the AML investigator gets a view of all the transactions and the trend analysis much faster. AML platform providers were also coaxed to shift from typical on-premise solutions to creating cloud-based platforms which could then be mere plug-and-play SaaS solutions for the FIs. These enabled real-time monitoring of transactions, thus alerting of any suspicious activity almost immediately. Unified AML platforms on the cloud also allow collaboration across the AML process chain and the overall FI ecosystem.


15 ways to grow as an IT leader in 2024

Di Maria says having a group of trusted advisors can help CIOs — or any professional — identify and correct deficits as well as hone and build up strengths. She advises CIOs to tap several executives from outside their current organization, including those from other functional areas and industries, so that CIOs can gain from their diverse experiences and perspectives. ... Di Maria also recommends CIOs create an executive brand this year, if they haven’t done so already. “This helps you be a better leader and help you advance, because it has you focus on what you stand for,” she explains. “It helps you focus on how you show up and what you do so you’re more effective in your job. It helps you figure out what you should be doing, what your priorities are, and how what you’re doing provides value in your workplace.” ... As tech leaders, CIOs are instrumental in leading people through that change — and they must be better at it than they’ve been in the past, says Jason Pyle, president and managing director of Harvey Nash US and Canada, an IT recruitment and consultancy firm. “It will come down to navigating all the human elements,” he says.


Flipping the BEC funnel: Phishing in the age of GenAI

Unfortunately, a significant majority of organizations appear ill-prepared to counter these emerging phishing threats. Chief among the concerns facing most organizations today is the record-high cybersecurity workforce gap, with an estimated need for an additional 4 million professionals worldwide to protect digital assets, as reported by ISC2. The same report reveals that nearly half (48%) of organizations today lack the tools and talent to respond to cyber incidents effectively. Furthermore, the ISC2 study shows that today’s cybersecurity professionals are feeling less than confident about the current threat landscape. A staggering 75% of them assert that the present threat landscape is the most formidable they’ve encountered in the past five years, and 45% anticipate that artificial intelligence (AI) will pose their greatest challenge in the next two years. This outlook underscores the urgency for organizations to fortify their cybersecurity defenses and adapt to the rapidly evolving nature of cyber threats. Our analysis found over 8 million phishing attempts successfully evaded native defenses in 2022 alone.


Eye on the Event Horizon

While multifactor authentication is crucial for securing online accounts, SMS OTP is not the most secure form of MFA. Other, more secure methods are more difficult to hack or replicate, making them a safer option for high-risk transactions. Using WhatsApp OTP to address SMS OTP security issues could be a simple but effective solution, as it offers end-to-end encryption and is cheaper than SMS. Single Sign-On via Social Login is a good option for nonfinancial applications. ... It is important to choose the most secure and reliable authentication method to protect against fraud and financial losses. While hardware-based tokens are the most secure option, they can be inconvenient to carry. There are better alternatives available, such as biometric authentication, mobile authentication apps and FIDO standards. An authenticator app, a mobile application, provides an extra layer of security to your online accounts by generating time-based one-time passwords (TOTPs). These passwords are used for two-factor authentication and help protect your accounts from unauthorized access.
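The code an authenticator app generates is specified in RFC 6238 (TOTP, built on RFC 4226's HOTP) and is small enough to sketch with the standard library alone; shown below with the RFC's SHA-1 test secret:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Generate an RFC 6238 time-based one-time password (HMAC-SHA1 variant).

    secret_b32: the shared secret as base32, as presented during QR-code setup.
    at: Unix timestamp to compute the code for (defaults to now).
    """
    # Restore any base32 padding the provisioning URI may have stripped.
    key = base64.b32decode(secret_b32.upper() + "=" * (-len(secret_b32) % 8))
    counter = int((time.time() if at is None else at) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because both sides derive the code from the shared secret and the current 30-second window, the server can verify it without any message being sent, which is what makes authenticator apps resistant to the interception attacks that plague SMS OTP.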


5 ways QA will evaluate the impact of new generative AI testing tools

Several experts weighed in, and the consensus is that generative AI can augment QA best practices, but not replace them. “When it comes to QA, the art is in the precision and predictability of tests, which AI, with its varying responses to identical prompts, has yet to master,” says Alex Martins, VP of strategy at Katalon. “AI offers an alluring promise of increased testing productivity, but the reality is that testers face a trade-off between spending valuable time refining LLM outputs rather than executing tests. This dichotomy between the potential and practical use of AI tools underscores the need for a balanced approach that harnesses AI assistance without forgoing human expertise.” Copado’s Hannula adds, “Human creativity may still be better than AI figuring out what might break the system. Therefore, fully autonomous testing—although possible—may not yet be the most desired way.” Marko Anastasov, co-founder of Semaphore CI/CD, says, “While AI can boost developer productivity, it’s not a substitute for evaluating quality. Combining automation with strong testing practices gives us confidence that AI outputs high-quality, production-ready code.”



Quote for the day:

"Success does not consist in never making mistakes but in never making the same one a second time." --George Bernard Shaw