Daily Tech Digest - January 18, 2024

A tougher balancing act in 2024, the year of the CISO

What’s making things more difficult for CISOs? The ESG/ISSA data indicates that business aspects of running a cybersecurity program, such as working with the board, overseeing regulatory compliance, and managing a budget, are primary contributing factors. This makes sense, as the CISO role has evolved from technical overseer to business executive over the past few years. At the same time, organizations have increased their dependence on IT for automation, optimization, customer service, and digital transformation. ... Like their non-CISO colleagues, CISOs are particularly stressed by things like an overwhelming workload, working with disinterested business managers, and keeping up with the security requirements of new business initiatives. It’s worth noting that 26% of CISOs are also stressed about monitoring the security status of third parties their organization does business with, compared with 12% of non-CISOs. Third-party relationships are often associated with business processes and therefore tied closely to business units. Unfortunately, security teams probably don’t have deep visibility into the day-to-day security performance of these firms.


How do agile and DevOps interrelate?

Agile and DevOps have much in common. In fact, DevOps grew as an offshoot (or improvement) of agile, as many industry leaders found dysfunction in IT and software development. While incremental improvements support quality products, the competing objectives of individual IT workers lowered overall performance. To remedy the problem, developers proposed the more integrated approach of DevOps. Of course, the new philosophy offered different core values, which caused a split between two opposing communities. Developers grappled with what looked like conflicting philosophies, leading to the most common misconception about agile and DevOps: that they don’t interrelate. On the surface, the pundits have much to draw on. DevOps engineers focus on software scalability, speed, and team integration. Agile focuses on the slower, iterative process of software development, with more emphasis on continuous testing. More importantly, agile silos individuals while DevOps integrates them. Without the operability of DevOps, infrastructure responsibility falls by the wayside. But without the basic building blocks of the incremental, customer-focused method of agile, DevOps has no fundamental processes on which to stand.


AI Fraud Act Could Outlaw Parodies, Political Cartoons, and More

So just how broad is this bill? For starters, it applies to the voices and depictions of all human beings "living or dead." And it defines digital depiction as any "replica, imitation, or approximation of the likeness of an individual that is created or altered in whole or part using digital technology." Likeness means any "actual or simulated image… regardless of the means of creation, that is readily identifiable as the individual." Digital voice replica is defined as any "audio rendering that is created or altered in whole or part using digital technology and is fixed in a sound recording or audiovisual work which includes replications, imitations, or approximations of an individual that the individual did not actually perform." This includes "the actual voice or a simulation of the voice of an individual, whether recorded or generated by computer, artificial intelligence, algorithm, or other digital means, technology, service, or device." These definitions go way beyond using AI to create a fraudulent ad endorsement or musical recording. They're broad enough to include reenactments in a true-crime show, a parody TikTok account, or depictions of a historical figure in a movie.


Navigating digital transformation in insurance sector: Challenges, opportunities and innovations

Notable developments are the changes that regulators have introduced in cybersecurity, information security management, and the management of connectivity with the website, with vendors, and with employees, as well as the digital transformation they have pushed us toward. Today everything is handled digitally: an employee meets a customer and the customer fills in the form digitally; there is little mechanical form-filling, although that practice persists in many parts of the country and in many companies. That said, digital adoption has risen markedly. So when things are handled digitally, you have to think through the controls, and employees today are forced to think through what controls they could have. For example, we have introduced OTPs, so customers are prompted to confirm each step with an OTP. For a policy I bought last week, I had to enter six OTPs with that company. I wondered why so many OTPs were required, but when I looked at the way the salesperson handled the process, it was quite effective and efficient, and at the same time it is all for the safety of the customer; that thought process is communicated to the customer.


Productivity Paradox: Productivity in the Age of Knowledge Work

We sometimes forget that every employee within an enterprise is, at their core, also a consumer. Their personal preferences, shaped by daily interactions with personal technology, inevitably spill over into their professional lives. Consequently, the ubiquity of Macs, iPhones and iPads in the consumer market has sparked a growing demand for these devices in the workplace. This shift has only been hastened by the “Bring Your Own Device” (BYOD) movement, wherein employees sought to use their trusted personal devices for professional tasks, yearning for the familiarity and ease of use they’ve grown accustomed to. Instead of resisting this tide, more and more forward-thinking enterprises are instead leaning in. For one, IT leaders have recognized that specific hardware platforms matter less these days as they shift more of their applications to the public cloud. Reliability is another major factor, especially for remote employees who don’t have an IT helpdesk at their disposal. And when our survey respondents were asked if they agreed or disagreed with the statement “Apple takes enterprise security, compliance, and privacy concerns more seriously than other vendors,” three-quarters of them concurred.


6 hot networking and data center skills for 2024

“The current rage in AI technology is more than just a fad,” Leary says. “It is delivering real measurable benefits to IT organizations and the businesses they serve. And there is no sign of slowdown on the horizon. Driven by pressing capacities and costs, physical data center designs are changing significantly” with generative AI. Organizations need IT staffers who can help assure that generative AI is provided the data, data processing, and data exchanges needed to deliver on its promise, Leary says. ... Knowledge about cybersecurity products and services, as well as the threats they guard against, never goes out of fashion. Organizations are facing a barrage of threats against their networks and data centers, so finding people with related skills remains a high priority. “Companies are constantly having to pay attention to their security as more and more cyber attacks happen,” Vick says. “That is something that is not slowing down, so they are having to update their firewalls and other security features.” Enterprises are building out their security teams, in some cases looking for people to update their security posture with a variety of technologies.


Navigating data management modernisation to deliver the AI-ready tech stack of the future

Forward-thinking IT leaders already see a direct correlation between modernising the data management journey across the entire tech stack and facilitating the extraction of value from data at the speed and scale needed for real-time intelligence. The same Alteryx research suggests that digital transformation relating to AI and machine learning will be the number one characteristic of the future enterprise, and tech stack priorities are already shifting to reflect this. Generative AI, quantum computing and machine learning operations (MLOps) are cited as the technologies most likely to see the largest shift in accelerated adoption in the future. While it only seems a short time since the pandemic forced many organisations to accelerate transformation at breakneck speeds, the rise of AI-related technologies will reinvigorate these transformations. Why? Because AI has lowered the barrier to delivering productivity gains by delivering data-driven insights with just a sentence or a prompt. With countless data-oriented AI technologies and intelligent systems already available, the ultimate goal of this transformation is to modernise the data management journey across the entire stack.


Continuous Quality Assurance: Strategizing Automated Regression Testing for Codebase Resilience

In QA software testing, automated regression processes can be enabled to autonomously identify any unexpected behaviors or regressions in the software. ... End-users anticipate consistent and dependable performance from software, recognizing that any disruptions or failures can profoundly affect their productivity and overall user experience. The implementation of regression testing proves invaluable in identifying unintended consequences, validating bug fixes, upholding consistency across versions, and securing the success of continuous deployment. Through early identification and resolution of regressions, development teams can proactively safeguard against issues reaching end-users, thereby preserving the quality and reliability of their software. ... Automated regression testing can be strategized based on the complexity of the codebase, using approaches such as retesting everything, selective re-testing, and prioritized re-testing. Tools such as Functionize, Selenium, Watir, Sahi Pro, and IBM Rational Functional Tester can be used to automate regression testing and improve efficiency.
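The baseline-replay idea at the heart of automated regression testing can be sketched in a few lines of Python. The function under test and its recorded baselines below are purely illustrative assumptions, not taken from any of the tools named above.

```python
# Minimal sketch of automated regression testing: outputs recorded
# from a known-good release are replayed against the current code,
# so any behavioral drift (a "regression") is caught automatically.

def slugify(title: str) -> str:
    """Function under test: the current implementation."""
    return "-".join(title.lower().split())

# Baselines captured from a previously released, known-good version.
REGRESSION_BASELINES = [
    ("Hello World", "hello-world"),
    ("Continuous  Quality   Assurance", "continuous-quality-assurance"),
]

def run_regression_suite():
    """Return (input, expected, actual) tuples for every failing case."""
    failures = []
    for given, expected in REGRESSION_BASELINES:
        actual = slugify(given)
        if actual != expected:
            failures.append((given, expected, actual))
    return failures
```

A selective or prioritized strategy, as described above, would simply filter or reorder the baseline list before replaying it.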


Sustainable Partnerships Pay Dividends

The first step in establishing highly effective partnerships, von Koeller says, is to identify what an organization hopes to achieve and establish a roadmap for meeting specific goals and metrics. This requires a focus on shared values and coordination across departments and groups. “You have to understand your footprint, understand what environmental impact an action has, and how to engage suppliers to achieve alignment around your targets,” she explains. Open and honest communication among partners is vital. Too often, larger companies fail to understand what suppliers can do and what they can’t do, particularly when they’re located in faraway countries, or a supply chain has numerous layers. Partners downstream and upstream face their own set of challenges -- environmental, political, and practical -- that can make it difficult to conform to strict standards. A particularly daunting aspect -- especially for smaller firms supplying raw materials or specialized components -- is onerous data collection and reporting requirements, Linich notes. As a result, smaller companies may require funding and technical assistance from larger partners, including aid in setting up software and IT systems that support sustainability.


Get started with Anaconda Python

The Anaconda distribution is a repackaging of Python aimed at developers who use Python for data science. It provides a management GUI, a slew of scientifically oriented work environments, and tools to simplify the process of using Python for data crunching. It can also be used as a general replacement for the standard Python distribution, but only if you’re conscious of how and why it differs from the stock version of Python. ... The most noticeable thing Anaconda adds to the experience of working with Python is a GUI, the Anaconda Navigator. It is not an IDE, and it doesn’t try to be one, because most Python-aware IDEs can register and use the Anaconda Python runtime themselves. Instead, the Navigator is an organizational system for the larger pieces in Anaconda. With the Navigator, you can add and launch high-level applications like RStudio or JupyterLab; manage virtual environments and packages; set up “projects” as a way to manage work in Anaconda; and perform various administrative functions. Although the Navigator provides the convenience of a GUI, it doesn’t replace any command-line functionality in Anaconda, or in Python generally.



Quote for the day:

"Leaders are more powerful role models when they learn than when they teach." -- Rosabeth Moss Kanter

Daily Tech Digest - January 17, 2024

Improving Supply Chain Security, Resiliency

Regulatory compliance plays a vital role in how cybersecurity strategies are built: Compliance mandates like GDPR and the NIST Cybersecurity Framework provide foundations for data protection, access control, and incident response. “With these baselines in place, organizations can ensure that there is a certain level of security across all supply chain partners, which reduces the overall risk landscape,” Bachwani says. “Compliance also fosters a culture of security, which drives continuous improvement.” He adds that the pressure to meet regulatory standards necessitates ongoing risk assessments, proactive risk management practices, and regular vulnerability patching, which prioritizes cybersecurity in decision-making. “Regulatory frameworks often come with heavy fines and reputational damage for those who do not comply,” Bachwani notes. “This incentivizes everyone within the supply chain to prioritize cybersecurity and invest in robust safeguards.” Christopher Warner, senior security consultant at GuidePoint Security, says regulatory frameworks often specify security controls and standards that organizations must follow.


Quantum entanglement discovery is a revolutionary step forward

This discovery opens the door to new quantum communication protocols, utilizing topology as a medium for quantum information processing. Such protocols could revolutionize how we encode and transmit information in quantum systems, especially in scenarios where traditional encoding methods fail due to minimal entanglement. In summary, the significance of this research lies in its potential for practical applications. For decades, preserving entangled states has been a major challenge. The team’s findings suggest that topology can remain intact even as entanglement decays, offering a novel encoding mechanism for quantum systems. Professor Forbes concludes with a forward-looking statement, saying, “We are now poised to define new protocols and explore the vast landscape of topological nonlocal quantum states, potentially revolutionizing how we approach quantum communication and information processing.” ... It’s a physical process where pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently of the state of the others, even when the particles are separated by a large distance.


Staffing levels: are data centers at risk of unnecessary outages?

As for whether there were sufficient staff onsite during the Microsoft outage, and what should be the optimal number of staff present, John Booth, Managing Director of Carbon3IT Ltd, and Chair of the Energy Efficiency Group of the Data Centre Alliance, says it very much depends on the design and scale of the data center, as well as on the level of automation for monitoring and maintenance. Data centers are also often reliant on outsourced personnel for specific maintenance and emergency tasks, typically with a four-hour response commitment. Beyond this, he suggests there is a need for more information to determine whether 7 staff were sufficient but admits that 3 members of staff are usually the norm for a night shift, “with perhaps more during the day depending on the rate of churn of equipment.” Davis adds that there is no reliable rule of thumb because each and every organization and site is different. However, there are generally accepted staff calculation techniques that can determine the right staffing levels for a particular data center site. As for the Microsoft incident, he’d need to formally do the calculations to decide whether 3 or 7 technicians were sufficient. It’s otherwise just a guess.


Projecting 2024 Cybertrends and C-Suite Responsibilities

Organizations must comply with various regulations and standards, such as the EU General Data Protection Regulation (GDPR), the US State of California Consumer Privacy Act (CCPA), the Payment Card Industry Data Security Standard (PCI DSS), and the US Health Insurance Portability and Accountability Act (HIPAA). Non-compliance can result in fines, legal action, or reputational damage. Compliance can be achieved if C-suite executives establish a compliance framework that requires them to assess and monitor their compliance status and implement necessary policies and procedures. They should also stay up to date on the changing regulatory and compliance landscape and engage with regulators and policymakers. The persistent cybersecurity skills gap is the shortage of qualified and experienced cybersecurity professionals on the job market. The cybersecurity skills gap can affect the ability of organizations to prevent, detect, and respond to cyberthreats. To help fill the skills gap, C-level executives should invest in the recruitment, retention, and development of their cybersecurity talent, and offer competitive compensation and benefits.


Here’s what you should look for in an OKR Management Tool

Communication is central to ensuring the success of any goal-setting framework. Make sure the technology you are leveraging allows the capturing of feedback, thoughts and comments on an ongoing basis. Using Keka’s OKR tool, teams can engage in meaningful discussions, share insights, and offer feedback directly on objectives and key results, fostering a culture of transparency and continuous improvement via the comments and one-on-one meeting features. This functionality also empowers teams to set their own aligned goals, tailoring objectives to their unique strengths and challenges while still contributing to the larger organisational mission. ... Reminders about OKRs are highly advantageous as they keep objectives and key results at the forefront of individuals' and teams' attention, minimising the risk of goals becoming overlooked or forgotten during daily tasks. These reminders serve as nudges, encouraging consistent progress tracking, timely updates, and proactive adjustments. By maintaining goal visibility and urgency, this feature ensures that teams stay on track, deadlines are met, and alignment with broader strategic objectives remains strong, ultimately driving improved goal achievement and organisational success.


The CISO’s guide to accelerating quantum-safe readiness

With a dynamic perspective of their enterprise-wide cryptographic usage, CISOs can begin the work of cybersecurity risk assessments. This step involves working with cybersecurity and privacy managers to prioritize sensitive and critical data sets most at risk from “harvest now, decrypt later” attacks and with the highest business value and impact. To translate these insights into a quantum-safe strategy, security leaders should evaluate the business relevance in relation to the complexity of mitigation for specific assets so that they can plan their quantum-safe transition in a way that optimizes performance, compatibility and ease of integration. ... The final step in the journey to quantum-safe security is the transformation of cryptographic infrastructure to incorporate quantum-resistant cryptography. Before deploying quantum-safe solutions to their stack, security leaders should equip their teams with the tools and education to test the new cryptographic protocols and evaluate the potential impact on systems and performance. Quantum-safe solutions that can be updated without having to overhaul their cybersecurity infrastructure will help CISOs establish crypto-agility and ensure they can proactively and seamlessly address potential quantum vulnerabilities.
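The crypto-agility goal in the last sentence can be made concrete with a small sketch: callers obtain algorithms through a registry rather than hard-coding them, so a quantum-vulnerable primitive can later be swapped for a quantum-resistant one in a single place. The registry design and the choice of hash functions here are illustrative assumptions, not a description of any specific product (a real migration would target NIST's post-quantum schemes for key exchange and signatures).

```python
import hashlib

# Illustrative crypto-agility sketch: code asks for "default-hash"
# by name instead of hard-coding an algorithm, so the algorithm can
# be replaced centrally without touching any caller.
_REGISTRY = {}

def register(name, func):
    _REGISTRY[name] = func

def digest(name, data: bytes) -> bytes:
    return _REGISTRY[name](data)

# Today's default...
register("default-hash", lambda d: hashlib.sha256(d).digest())

# ...is later migrated in one place (SHA3-256 stands in here for a
# future quantum-safe primitive) with no changes at call sites.
register("default-hash", lambda d: hashlib.sha3_256(d).digest())
```

Call sites that only ever invoke `digest("default-hash", data)` keep working across the migration, which is the ease-of-integration property the excerpt describes.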


Magic Keyboard vulnerability allows takeover of iOS, Android, Linux, and MacOS devices

“The user does not have to have a keyboard paired with their phone already. And as long as Bluetooth is enabled on the Android device, at any time the phone is on them, and Bluetooth is on, the attacker can then force pair an emulated keyboard with the Android device and inject keystrokes, including at the lock screen.” Newlin then turned to Linux. “It turns out that the Linux attack is very, very similar,” he said. “On Linux, as long as the host is discoverable and connectable over Bluetooth, the attacker can force-pair a keyboard and inject keystrokes without the user’s confirmation. And so, this is distinct from Android in that the device has to be not only connectable but also discoverable and connectable on Linux for the attack.” Linux fixed this bug in 2020 but left the fix disabled by default. ... Newlin encourages security researchers to continue probing Bluetooth flaws. “I think it’ll probably be a while [before the full extent of Bluetooth flaws is known] because it will take the community actually fleshing these out and identifying all these additional affected systems beyond what I’ve seen myself,” he said.


How Edge Analytics Can Deliver the Competitive Edge Your Business Needs

Traditional data analytics models struggle to keep up with all the data that’s being generated. Traditional data analytics is also no match for today’s data velocity. As the speed at which data is created continues to grow, there will be an even greater need for real-time processing. The interpretation and application of real-time analytics can vary based on the specific industry and its requirements. Real-time analytics is a broad concept that is adapted to suit the needs of different industries and sectors. ...  By addressing these traditional data analytics challenges, edge analytics is becoming more prominent. It’s a natural progression -- taking data and business where they need to go now. ... Businesses can move faster with edge analytics because of its reduced latency. This is possible because edge analytics processes data closer to where it was generated, so organizations get data insights quicker. Reduced latency is particularly critical for applications that require real-time response such as battlefield scenarios, fraud detection, and supply chain management. Because edge analytics reduces the data load on the network, it also saves energy, reduces carbon emissions, and helps organizations meet their sustainability goals to protect the planet.


How OpenAI plans to handle genAI election fears

For its part, OpenAI said ChatGPT will redirect users to CanIVote.org for specific election-related queries. The company is also focusing on enhancing the transparency of AI-generated images using its DALL-E technology with plans to incorporate a "cr" icon on such photos, signaling they are AI-generated. The company also plans to enhance its ChatGPT platform by integrating it with real-time global news reporting, including proper attribution and links. The news initiative is an expansion of an agreement made last year with the German media conglomerate Axel Springer. Under that deal, ChatGPT users gain access to summarized versions of select global news content from Axel Springer's various media channels. ... There's no universal rule for how genAI should be used in politics. Last year, Meta declared it would prohibit political campaigns from using genAI tools in their advertising and mandate that politicians reveal any such use in their ads. Similarly, YouTube said all content creators must disclose whether their videos contain "realistic" but altered media, including those created with AI.


Storytelling for CIOs: From niche to bestseller

“For a CIO, or anyone in a senior position with responsibility for data, the best way to succeed is to make projects come to life,” says Caroline Carruthers, formerly a pioneering chief data officer at Network Rail, which manages train stations and infrastructure in the UK, and now CEO of data consultancy Carruthers and Jackson. “You can give people all the dashboards, charts and figures in the world, but it’s when you help them understand the thinking behind what you do and bring it to life that you get the buy-in you need.” Often, CIOs use stories as a form of Esperanto or a translation layer. “I always find there’s benefit in using a story to help my audience understand what can sometimes be very technical concepts that I’m trying to communicate to non-technical people,” says Adam Miller, CIO of UK insurer, Markerstudy Group. “Get the story right, then people understand the plan and you’ve a much better chance of them buying in. I also find that a good story is just as important for highlighting the impact of inaction too, which can often be the easiest option for people to take.”



Quote for the day:

"Leadership Seductions are behaviors or attitudes in which we become 'stuck'" -- Catherine Robinson-Walker

Daily Tech Digest - January 16, 2024

Why Pre-Skilling, Not Reskilling, Is The Secret To Better Employment Pipelines

In a landscape where the relevance of skills evolves, Zaslavski says that organizations should focus on selecting and advancing individuals based on their potential for learning skills like critical thinking and resiliency, instead of focusing on hard skills like coding. ... “By concentrating on these fundamental elements, as opposed to current technical proficiency or past work history, organizations position themselves with an agile and future-ready workforce. In this light, pre-skilling should be an integral part of employers’ talent strategy pre- and post-hiring, from sourcing and recruiting to career pathing and employee engagement.” ... She points to areas like understanding if a potential or existing employee has the EQ and social skills needed to perform as part of a group. Or whether they have the curiosity and analytical intelligence needed to learn new hard skills as well as the ambition and work ethic to achieve results. “When people have learning ability, drive, and people skills, they will probably develop new skills faster than others,” she says.


Agile is a concept we all continuously talk about, but what is it really?

Empiricism, teams, user stories, iterations: these are all examples of tools that we use in Agile, but they are not its purpose. Agile is about empowering people to take control of their environment and give them complete freedom to discover how to use available tools in the most effective way. And this applies to the why too. People adopt Agile to increase efficiency, transparency, velocity, predictability, quality. But again, all of these are a result of Agile, not its goal. It is the mindset that makes it all possible. That is why it is “People and interactions above processes and tools”. To illustrate this, think about empiricism itself. Try introducing empiricism into an organisation mired in a culture of fear and control, and it doesn’t work, no matter what you do. You can’t force empiricism. People are too busy evading blame and manipulating information. Think about it: how often do people complain that the retrospective doesn’t deliver anything? Retrospectives where people just complain and nothing changes?


What Will It Take to Adopt Secure by Design Principles?

What does the future of secure by design adoption look like? CISA is continuing its work alongside industry partners. “Part of our strategy is to collect data on attacks and understand what that data is telling us about risk and impact and derive further best practices and work with companies, and really other nations, to adopt these principles,” Zabierek shares. International collaboration on secure by design is reflected not only in this CISA initiative but also the Guidelines for Secure AI System Development. CISA and the UK’s National Cyber Security Centre (NCSC) led the development of those guidelines, and 16 other countries have agreed to them. But like the Secure by Design initiative, this framework is also non-binding. A software manufacturer’s timeline for adopting secure by design principles will depend on its appetite, resources and the complexity of its products. But the more demand from government and consumers, the more likely adoption will happen. Right now, CISA has no plans to track adoption. “We're more focused on collaborating with industry so that we can understand best practices and recommend further better guidelines,” says Zabierek.


Mastering the art of motivation

Once you’ve helped employees connect their dots, the best way to further motivate them is also the cheapest, easiest, and has the fewest unintended consequences. Compliment them on a job well done, whenever they’ve done a job well enough to be worth noting. Sure, there are wrong ways to use compliments as motivators. First and foremost the employee you’re complimenting must value your opinion. If they don’t they’ll write off your compliment as just so much noise. Second, a compliment from you should not be an easy compliment to earn. “I really like your belt,” isn’t going to inspire someone to work inventively and late. Third, with few exceptions compliments should be public. There’s little reason for you to be embarrassed about being pleased with someone’s efforts. With one caveat: Usually you’ll have one or two in your organization who routinely perform exceptionally well, but also one or two who are plodders — good enough and steady enough to keep around; not good enough or steady enough to earn your praise. Find a way to compliment them in public anyway — perhaps because you prize their reliability and lack of temperament.


Do you need GPUs for generative AI systems?

GPUs greatly enhance performance, but they do so at a significant cost. Also, for those of you tracking carbon points, GPUs consume notable amounts of electricity and generate considerable heat. Do the performance gains justify the cost? CPUs are the most common type of processors in computers. They are everywhere, including in whatever you’re using to read this article. CPUs can perform a wide variety of tasks, and they have a smaller number of cores compared to GPUs. However, they have sophisticated control units and can execute a wide range of instructions. This versatility means they can handle a broad range of AI workloads, including generative AI. CPUs can prototype new neural network architectures or test algorithms. They can be adequate for running smaller or less complex models. This is what many businesses are building right now (and will be for some time), and CPUs are sufficient for the use cases I’m currently hearing about. CPUs are more cost-effective in terms of initial investment and power consumption for smaller organizations or individuals who have limited resources.
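To make the point concrete, here is a toy, standard-library-only sketch showing that a small model's forward pass is trivial work for a CPU; the layer sizes and random weights are arbitrary assumptions, not a real workload.

```python
import math
import random

def forward(x, w1, w2):
    """Tiny two-layer perceptron: x @ w1 -> tanh -> @ w2."""
    hidden = [math.tanh(sum(xi * w1[i][j] for i, xi in enumerate(x)))
              for j in range(len(w1[0]))]
    return [sum(h * w2[i][j] for i, h in enumerate(hidden))
            for j in range(len(w2[0]))]

random.seed(0)
w1 = [[random.uniform(-1, 1) for _ in range(8)] for _ in range(4)]  # 4 -> 8
w2 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(8)]  # 8 -> 2
out = forward([0.1, 0.2, 0.3, 0.4], w1, w2)  # completes instantly on a CPU
```

Prototyping at this scale, or even a few orders of magnitude larger with a library such as NumPy, is exactly the kind of work the excerpt argues a CPU handles comfortably; the GPU question only becomes pressing at large-model training and high-throughput inference scales.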


How to create an AI team and train your other workers

Building a genAI team requires a holistic approach, according to Jayaprakash Nair, head of Machine Learning, AI and Visualization at Altimetrik, a digital engineering services provider. To reduce the risk of failure, organizations should begin by setting the foundation for quality data, establish “a single source of truth strategy,” and define business objectives. Building a team that includes diverse roles such as data scientists, machine learning engineers, data engineers, domain experts, project managers, and ethicists/legal advisors is also critical, he said. “Each role will contribute unique expertise and perspectives, which is essential for effective and responsible implementation,” Nair said. "Management must work to foster collaboration among these roles, help align each function with business goals, and also incorporate ethical and legal guidance to ensure that projects adhere to industry guidelines and regulations." ... It's also important to look for people who like learning new technology, have a good business sense, and understand how the technology can benefit the company.


Data is the missing piece of the AI puzzle. Here's how to fill the gap

Companies looking to make progress in AI, says Labovich, must "strike a balance and acknowledge the significant role of unstructured data in the advancement of gen AI." Sharma agrees with these sentiments: "It is not necessarily true that organizations must use gen AI on top of structured data to solve highly complex problems. Oftentimes the simplest applications can lead to the greatest savings in terms of efficiency." The wide variety of data that AI requires can be a vexing piece of the puzzle. For example, data at the edge is becoming a major source for large language models and repositories. "There will be significant growth of data at the edge as AI continues to evolve and organizations continue to innovate around their digital transformation to grow revenue and profits," says Bruce Kornfeld, chief marketing and product officer at StorMagic. Currently, he continues, "there is too much data in too many different formats, which is causing an influx of internal strife as companies struggle to determine what is business-critical versus what can be archived or removed from their data sets."


3 ways to combat rising OAuth SaaS attacks

At their core, OAuth integrations are cloud apps that can access data on behalf of a user, with a defined permission set. When a Microsoft 365 user installs a MailMerge app in Word, for example, they have essentially created a service principal for the app and granted it an extensive permission set with read/write access, the ability to save and delete files, as well as the ability to access multiple documents to facilitate the mail merge. The organization needs to implement an application control process for OAuth apps and determine if the application, like in the example above, is approved or not. ... Security teams should view user security through two separate lenses. The first is the way they access the applications. Apps should be configured to require multi-factor authentication (MFA) and single sign-on (SSO). ... Automated tools should scan the logs and report whenever an OAuth-integrated application is acting suspiciously. For example, applications that display unusual access patterns or geographical abnormalities should be regarded as suspicious.
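An application control process like the one described can start as a simple inventory audit. A sketch, with hypothetical app names and scope strings (real scope identifiers vary by platform; these only resemble Microsoft Graph scopes):

```python
# Hypothetical high-risk permission set for illustration.
HIGH_RISK_SCOPES = {"Files.ReadWrite.All", "Mail.Send", "Directory.ReadWrite.All"}

def flag_risky_apps(apps):
    """Return the names of apps granted at least one high-risk scope."""
    return [app["name"] for app in apps if HIGH_RISK_SCOPES & set(app["scopes"])]

inventory = [
    {"name": "MailMergeHelper", "scopes": ["Files.ReadWrite.All", "Mail.Read"]},
    {"name": "CalendarSync", "scopes": ["Calendars.Read"]},
]
print(flag_risky_apps(inventory))  # ['MailMergeHelper']
```

Apps that surface here would then go through the organization's approval process before keeping their grants.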


Cloud cost optimisation: Strategies for managing cloud expenses and maximising ROI

Instead of employing manual resources, streamlining cloud optimisation through automation could bring enhanced resource savings to the table. The auto-scaling service offered by Amazon Web Services (AWS) is a shining example of how firms can effectively streamline their cloud optimisation in a short time. The service also enables swift optimisation in response to the changing resource requirements of systems and servers. ... At the planning stage, firms need to justify the cloud budget and ensure that unexpected spending is reduced to the minimum. The same approach has to be followed in the building, deployment, and control phases so that any unexpected rise in budgets can be adjusted promptly without throwing the entire financial control into a tizzy. All these steps will help organisations develop a culture of cost-conscious cloud adoption and help them perform optimally while keeping costs in check. ... Incorporating cloud cost optimisation tools is a strategic approach for organisations to streamline expenditures and enhance ROI.
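This is not AWS's actual algorithm, but the target-tracking style of auto-scaling can be sketched in a few lines: size the fleet so that measured utilisation moves toward a target, clamped to configured bounds:

```python
import math

def desired_capacity(current, utilisation, target=0.5, min_cap=1, max_cap=10):
    """Propose a fleet size that pulls utilisation toward the target,
    clamped between min_cap and max_cap (a sketch, not AWS's code)."""
    if utilisation <= 0:
        return min_cap
    proposed = math.ceil(current * utilisation / target)
    return max(min_cap, min(max_cap, proposed))

print(desired_capacity(4, 0.9))  # 8: scale out to pull 90% utilisation toward 50%
print(desired_capacity(4, 0.2))  # 2: scale in when the fleet is mostly idle
```

Running this decision on a schedule, against real metrics, is essentially what a managed auto-scaling service automates for you.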


Pull Requests and Tech Debt

The biggest disadvantage of pull requests is understanding the context of the change, whether technical or business: you see what has changed without necessarily learning why the change occurred. Almost universally, engineers review pull requests in the browser and do their best to understand what’s happening, relying on their understanding of the tech stack, architecture, business domains, etc. While some have the background necessary to mentally grasp the overall impact of the change, for others it’s guesswork, assumptions, and leaps of faith, which only gets worse as the complexity and size of the pull request increase. [Recently a friend said he reviewed all pull requests in his IDE, greatly surprising me: it was the first I’d heard of such diligence. While noble, that thoroughness becomes a substantial time commitment unless it’s your primary responsibility. Only when absolutely necessary do I do this. Not sure how he pulls it off!] Other than those good Samaritans, mostly what you’re doing is static code analysis: within the change in front of you, what has changed, and does it make sense? You can look for similar changes, emerging patterns that might drive refactoring, best practices, or others doing similar work.



Quote for the day:

"All leadership takes place through the communication of ideas to the minds of others." -- Charles Cooley

Daily Tech Digest - January 15, 2024

Authentication is more complicated than ever

Even if posture is improved and stronger forms of MFA are invoked at login, attackers will constantly be looking for new holes to exploit. Therefore, it's important to put in place detection logic and checks for compromise. Ideally, detections should target known attack techniques, but also leverage ML/AI algorithms to detect anomalous or novel suspicious behavior. For example, knowing historical access patterns can highlight when credentials suddenly attempt access from a new device or location. Put differently, authentication can no longer be only about authentication. The decision to validate a credential must be more than a question of the right password and MFA. It must include the context and conditions of the request, checked and confirmed by policy each time. When identity-based attacks are detected, automated responses should be invoked. This can mean stepping up authentication requirements, revoking access, quarantining an identity until the situation is resolved, or executing more complex responses.
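A sketch of the kind of contextual check described above, using hypothetical device and country fields (a real system would draw on far richer signals and ML-based risk scoring):

```python
def assess_login(history, attempt):
    """Compare a login attempt with historical access patterns and
    return an action: allow, step up MFA, or quarantine the identity."""
    known_device = attempt["device"] in history["devices"]
    known_location = attempt["country"] in history["countries"]
    if known_device and known_location:
        return "allow"
    if known_device or known_location:
        return "step_up_mfa"   # stronger challenge before granting access
    return "quarantine"        # novel device AND location: hold for review

history = {"devices": {"laptop-01"}, "countries": {"US"}}
print(assess_login(history, {"device": "laptop-01", "country": "US"}))  # allow
print(assess_login(history, {"device": "phone-77", "country": "US"}))   # step_up_mfa
print(assess_login(history, {"device": "phone-77", "country": "RO"}))   # quarantine
```

The point is the shape of the decision: validation is no longer a single password-plus-MFA question but a policy evaluated against context on every request.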


The Importance of Human-centered AI

Creating a functional and reliable AI requires a combination of domain and data science expertise with design acumen. Domain experts are particularly important when developing AI for the legal sector, as legal operations professionals, attorneys, and others bring highly valuable knowledge when training AI to deliver results for corporate legal departments (CLDs). Data scientists cleanse, analyze, and glean insights from large amounts of data. AI design strategists create systems, design prototypes, and assist in model building, all while focusing on delivering intelligence in a user-centric way. It’s impossible for an AI model to work optimally without all these individuals working together. For instance, a model built just by data scientists might technically work, but it probably won’t be focused on the user or their business needs. Meanwhile, a model created by an AI designer may not have the breadth of insights it could have if a data scientist and domain expert were also involved. It’s this diversity of human talent and perspectives that lays the initial groundwork for everything that organizations want in AI.


Green data centers: efforts to push sustainable IT developments

Modular designs reduce the need for significant infrastructure modifications by enabling the gradual development of data centre capacity. In addition to saving energy, using more energy-efficient servers, storage units, and networking hardware can provide greater scalability by lowering the requirement for extra power and cooling infrastructure. A data centre’s demand for cooling increases with its size, and new technologies are delivering better efficiency and energy savings. Along with this, scaling up without consuming more energy is possible with the use of effective cooling techniques like liquid cooling. Optimising resource utilisation and maximising scalability may be achieved by putting into practice effective data centre management techniques like load balancing and resource sharing. Server virtualisation maximises efficiency internally, lowering the requirement for physical equipment and energy usage. Real-time monitoring and modification of energy use is made possible by artificial intelligence and machine learning, which makes infrastructure more adaptable and efficient.


Unravelling the Persistence of Legacy Malware: By Shailendra Shyam Sahasrabudhe

While the term “legacy” may evoke images of outdated systems and forgotten technologies, in the realm of cyber threats, it takes on a more sinister connotation. Legacy malware, often several years old, continues to haunt organizations, primarily due to the shrewd tactics employed by threat actors. Global organizations face a substantial threat due to the lax enforcement of security standards for IoT device manufacturers, exacerbated by the widespread presence of shadow IoT devices within enterprise networks. This significant risk is posed by the targeting of “unmanaged and unpatched” devices by threat actors, who often leverage these vulnerabilities to establish an initial foothold in the targeted environment. These threat actors, operating as de facto businesses, harbour a vested financial interest in extending the shelf life of their malware. This involves the recycling and repackaging of malicious code, coupled with innovative market strategies. Technical manoeuvres such as code recompilation, binary morphing, and the creation of fresh signatures to sidestep traditional antivirus defences are par for the course.


The 3 Paradoxes of Cloud Native Platform Engineering

Given the plethora of DevOps tools on the market, assembling the optimal toolchain can slow everyone down and lead to inconsistent results. The solution: ensure platform engineering teams build an IDP that includes the best set of tools for the tasks at hand. The goal of such a platform is to provide a “golden path” for developers to follow, essentially a recommended set of tools and processes for getting their work done. However, this golden path can become a straitjacket. When this golden path is overly normative, developers will move away from it to get their jobs done, defeating its purpose. As with measuring their productivity, developers want to be able to make their own choices regarding how they go about crafting software. As a result, platform engineers must be especially careful when building IDPs for cloud native development. Jumping to the conclusion that tools and practices that were suitable for other architectural approaches are also appropriate for cloud native can be a big mistake. 


Cloud Computing's Role in Transforming AML and KYC Operations

The biggest advantage is data centralization. Data is not scattered in different systems which allows compliance investigators to get a holistic view of information about a customer in one place and thereby speed the investigation process and decision-making. Cloud platforms allow for seamless storage at very low cost and also enable organizations with a lot more querying and analytical toolsets. This further aids in the compliance investigation process as the AML investigator gets a view of all the transactions and the trends analysis much faster. AML platform providers were also coaxed to shift from typical on-premise solutions to creating cloud-based platforms which could then be mere plug-and-play SaaS solutions for the FIs. These enabled real-time monitoring of transactions thus alerting of any suspicious activity almost immediately. Unified AML platforms on the cloud also allow collaboration across the AML process chain and the overall FI ecosystem. 
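The real-time monitoring the excerpt mentions is typically built from rules like the following toy sketch; the thresholds and the structuring heuristic are illustrative only, not regulatory guidance:

```python
def suspicious(txns, amount_limit=10_000, burst_window=3, burst_total=9_000):
    """Flag transactions over a hard limit, plus short bursts of smaller
    transfers that together exceed a structuring threshold."""
    alerts = [(t["id"], "over_limit") for t in txns if t["amount"] >= amount_limit]
    window_sums = [
        sum(t["amount"] for t in txns[i:i + burst_window])
        for i in range(len(txns) - burst_window + 1)
    ]
    if any(s >= burst_total for s in window_sums):
        alerts.append(("window", "possible_structuring"))
    return alerts

txns = [
    {"id": "t1", "amount": 4_000},
    {"id": "t2", "amount": 3_500},
    {"id": "t3", "amount": 3_000},
    {"id": "t4", "amount": 12_000},
]
print(suspicious(txns))  # [('t4', 'over_limit'), ('window', 'possible_structuring')]
```

What the cloud adds is running such rules continuously over centralized data, so an investigator sees the alert and the full transaction history in one place.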


15 ways to grow as an IT leader in 2024

Di Maria says having a group of trusted advisors can help CIOs — or any professional — identify and correct deficits as well as hone and build up strengths. She advises CIOs to tap several executives from outside their current organization, including those from other functional areas and industries, so that CIOs can gain from their diverse experiences and perspectives. ... Di Maria also recommends CIOs create an executive brand this year, if they haven’t done so already. “This helps you be a better leader and helps you advance, because it has you focus on what you stand for,” she explains. “It helps you focus on how you show up and what you do so you’re more effective in your job. It helps you figure out what you should be doing, what your priorities are, and how what you’re doing provides value in your workplace.” ... As tech leaders, CIOs are instrumental in leading people through that change — and they must be better at it than they’ve been in the past, says Jason Pyle, president and managing director of Harvey Nash US and Canada, an IT recruitment and consultancy firm. “It will come down to navigating all the human elements,” he says.


Flipping the BEC funnel: Phishing in the age of GenAI

Unfortunately, a significant majority of organizations appear ill-prepared to counter these emerging phishing threats. Chief among the concerns facing most organizations today is the record-high cybersecurity workforce gap, with an estimated need for an additional 4 million professionals worldwide to protect digital assets, as reported by ISC2. The same report reveals that nearly half (48%) of organizations today lack the tools and talent to respond to cyber incidents effectively. Furthermore, the ISC2 study shows that today’s cybersecurity professionals are feeling less than confident about the current threat landscape. A staggering 75% of them assert that the present threat landscape is the most formidable they’ve encountered in the past five years, and 45% anticipate that artificial intelligence (AI) will pose their greatest challenge in the next two years. This outlook underscores the urgency for organizations to fortify their cybersecurity defenses and adapt to the rapidly evolving nature of cyber threats. Our analysis found that over 8 million phishing attempts successfully evaded native defenses in 2022 alone.


Eye on the Event Horizon

While multifactor authentication is crucial for securing online accounts, SMS OTP is not the most secure form of MFA. Other, more secure methods are more difficult to hack or replicate, making them a safer option for high-risk transactions. Using WhatsApp OTP to address SMS OTP security issues could be a simple but effective solution, as it offers end-to-end encryption and is cheaper than SMS. Single sign-on via social login is a good option for nonfinancial applications. ... It is important to choose the most secure and reliable authentication method to protect against fraud and financial losses. While hardware-based tokens are the most secure option, they can be inconvenient to carry. There are better alternatives available, such as biometric authentication, mobile authentication apps and FIDO standards. An authenticator app - a mobile application - provides an extra layer of security to your online accounts by generating time-based one-time passwords (TOTPs). These passwords are used for two-factor authentication and help protect your accounts from unauthorized access.
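The TOTP mechanism behind authenticator apps is standardized in RFC 6238 and small enough to sketch with the Python standard library. This is an illustration, not a production implementation; real deployments must also handle secret storage, clock drift, and rate limiting:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, digits=6, period=30, ts=None):
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if ts is None else ts) // period)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890", t = 59 s.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", digits=8, ts=59))  # 94287082
```

Because both the server and the app derive the same code from a shared secret and the current time, no code ever travels over SMS, which is what makes TOTP resistant to SIM-swap interception.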


5 ways QA will evaluate the impact of new generative AI testing tools

Several experts weighed in, and the consensus is that generative AI can augment QA best practices, but not replace them. “When it comes to QA, the art is in the precision and predictability of tests, which AI, with its varying responses to identical prompts, has yet to master,” says Alex Martins, VP of strategy at Katalon. “AI offers an alluring promise of increased testing productivity, but the reality is that testers face a trade-off between spending valuable time refining LLM outputs rather than executing tests. This dichotomy between the potential and practical use of AI tools underscores the need for a balanced approach that harnesses AI assistance without forgoing human expertise.” Copado’s Hannula adds, “Human creativity may still be better than AI figuring out what might break the system. Therefore, fully autonomous testing—although possible—may not yet be the most desired way.” Marko Anastasov, co-founder of Semaphore CI/CD, says, “While AI can boost developer productivity, it’s not a substitute for evaluating quality. Combining automation with strong testing practices gives us confidence that AI outputs high-quality, production-ready code.”



Quote for the day:

"Success does not consist in never making mistakes but in never making the same one a second time." -- George Bernard Shaw

Daily Tech Digest - January 14, 2024

Quantum mechanics uncovers hidden patterns in the stock market

What does this mean for the stock market? It implies that higher volatility and a slower reversion to equilibrium amplify herding behavior among investors, especially during times of uncertainty and information asymmetry. The study goes further by testing this model with empirical data from the U.S. stock market. Using the growth rate of gross domestic product (GDP) and forecaster uncertainty as indicators for business cycles and economic uncertainty, respectively, the researchers found a positive correlation between the power law exponent and the GDP growth rate, and a negative correlation with forecaster uncertainty. This confirms their theoretical predictions and highlights the role of economic uncertainty in linking business cycles with herding behavior in stock returns. ... “Our study shows that quantum mechanics can be a useful tool to understand the stock market, a complex system with many interacting agents. We hope that our study can inspire more interdisciplinary research that combines physics and finance to explore the hidden patterns and mechanisms of the stock market,” he states.


'We Never Upskill Fast Enough': NTT DATA Services CEO Bob Pryor on mastering change

It's always a challenge, and to be honest, we never upskill fast enough given the myriad of options available. However, we're heavily investing in training, development, and skilling across all levels. Retaining talent involves helping them acquire more advanced technologies and skills in high-demand disciplines. Individuals tend to find greater satisfaction in roles that require complexity over those that are simpler to master. Constantly evolving the mix of skills, technology, and labour is crucial. Take AI, for example—it doesn't eliminate labour; it enhances people's efficacy when working with AI. In healthcare, top oncologists use advanced AI algorithms for diagnosis, medical devices, and treatment. The challenge isn't whether they are displaced by technology but whether we're scaling them fast enough to use the advanced technologies we're investing in and developing. Working effectively with AI involves having people smart enough to ask the right questions—what to create, what questions to ask, and how to interpret language models. 


5 Ways To Upskill As A Leader And Gain Respect From Your Team

Leadership is about building relationships, not task lists. This year, upskill yourself by building these skills to develop a leadership style that inspires cooperation and motivation, not fear. ... Being polite shows the people around you that you respect them, and they are more likely to return the favor. It costs you nothing to be kind. A basic greeting can go a long way, as can asking about your employees’ weekends, family, etc. Remember to say please and thank you. Never interrupt when your employees are talking, and show that you respect their time, work, and ideas. ... Bossing people around doesn’t feel great long term. You know when there’s tension in your office and when people aren’t glad to see you. It’s not good for your mental health to spend nine hours a day (or more) with people who resent your presence. When you tap into your humanity to create better relationships with your employees and become a leader people enjoy working with, not only will you feel more respected as a person, but you’ll likely also enjoy the benefits of a happier workforce, such as higher productivity, better work and even higher profits.


Yes, We're Still Messing Up Hybrid Work. Here's Where Exactly We're Going Wrong.

Hybrid work environments are dynamic, and what works one day may not be effective the next. Managers must be trained to be flexible in their leadership approach, adapting to the varying needs of their team members. This adaptability also means being open to feedback and willing to continuously learn and evolve their management style. It involves understanding the unique challenges and opportunities of managing remote and in-office team members and being adept at creating a cohesive team culture that bridges the physical divide. Honing communication skills is another key focus. In a hybrid setup, clear and inclusive communication is paramount. Managers need to be adept at conveying their messages effectively across various digital platforms, ensuring that every team member, whether remote or in-office, feels equally involved and informed. ... Developing strategies for remote team building is equally important. Hybrid work models can lead to a sense of disconnection among team members.


It’s time to fix flaky tests in software development

Not only do flaky tests threaten the quality and speed of software delivery, they pose a very real threat to the happiness and satisfaction of software developers. Similar to other bottlenecks in the software development process, flaky tests take developers out of their creative flow and prevent them from doing what they love: creating software. Imagine a test passes on one run and fails on the next, with no relevant changes made to the codebase in the interim. This inconsistent behavior can create a fog of confusion, and lead developers down demoralizing rabbit holes to figure out what’s gone wrong. It’s a huge waste of time and energy. By addressing flaky tests, technology leaders can directly improve the developer experience. Instead of getting tangled up in a web of phantom problems that drain their time and energy, developers are able to spend more time on fulfilling tasks like creating new features or refining existing code. When erratic tests are eliminated, the development process runs much more smoothly, resulting in a more motivated and happier team.
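The rerun-and-compare idea behind flaky-test detection is simple enough to sketch: execute the same test repeatedly with no code changes and see whether the verdict is stable. The timing-sensitive test below is a stand-in for a real race condition or timeout:

```python
import random

def is_flaky(test_fn, runs=20):
    """Run a test repeatedly with no code changes; mixed pass/fail
    results across identical runs mark it as flaky."""
    results = {test_fn() for _ in range(runs)}
    return results == {True, False}

def stable_test():
    return True

def timing_sensitive_test():
    # Stand-in for a race condition: passes about 70% of the time.
    return random.random() < 0.7

random.seed(0)
print(is_flaky(stable_test))            # False
print(is_flaky(timing_sensitive_test))  # True (with this seed)
```

CI systems apply the same principle at scale, quarantining tests whose verdicts flip so developers stop chasing phantom failures.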


Building Cybersecurity Resilience With the Power of Habit

James Clear's principles and philosophy, advocating for small yet consistent changes, should resonate deeply with cybersecurity professionals. These principles, while not originally intended for use in the cybersecurity realm, can be creatively applied to construct a robust framework for a resilient cybersecurity culture. Clear's principles can be adapted to the cultivation of cybersecurity habits. ... The journey can begin with the fundamentals, for example, the management of cloud access rights. This involves regularly reviewing who has access to what information or resources and why, revoking access rights when an employee changes roles or leaves the organization, and implementing the principle of least privilege, wherein users are given the minimum levels of access necessary to perform their jobs. These minor changes, when consistently applied, can become the building blocks of an enterprise’s cybersecurity framework. The cumulative effect of such microchanges can be surprising.
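The access-right review habit described above can be sketched as a periodic job. The user, role, and resource names here are made up for illustration:

```python
from datetime import date, timedelta

def stale_grants(grants, active_roles, today, max_age_days=90):
    """Flag access rights whose holder has changed roles or whose
    last review is older than max_age_days."""
    flagged = []
    for g in grants:
        role_changed = active_roles.get(g["user"]) != g["granted_for_role"]
        overdue = (today - g["last_reviewed"]) > timedelta(days=max_age_days)
        if role_changed or overdue:
            flagged.append((g["user"], g["resource"]))
    return flagged

grants = [
    {"user": "ana", "resource": "billing-db", "granted_for_role": "finance",
     "last_reviewed": date(2024, 1, 2)},
    {"user": "bo", "resource": "prod-bucket", "granted_for_role": "sre",
     "last_reviewed": date(2023, 6, 1)},
]
active_roles = {"ana": "finance", "bo": "support"}  # bo moved off the SRE team
print(stale_grants(grants, active_roles, date(2024, 1, 14)))  # [('bo', 'prod-bucket')]
```

Run consistently, a small check like this is exactly the kind of habitual microchange the article argues compounds into resilience.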


Customer Experience Is King, but CIOs Could Do More to Help

The very nature of how customer experience projects get defined and shepherded places IT at the back of the room, as an executor of tasks but not as a strategic leader. Is this bad? Not necessarily, considering that the end business units interacting with the customer ostensibly have expertise in dealing with customers, and are in the best position to know what customers want. However, as technology becomes a more integral element of the selling, informing, fulfillment and servicing of customers, there also is unique expertise that IT brings to the table. It can be invaluable in improving the customer experience, and it can also avert disaster. Being able to sell non-stop, 24/7 to worldwide customers is a major driver of e-commerce, as is the ability to provide customers with self-service options that can reduce internal operational costs for companies. Analytics, which can assess an individual customer's or a demographic's buying habits and anticipate what customers will want to buy next, is seen as beneficial.


Leveraging Chaos Engineering To Test The Resilience Of Distributed Computing Systems

It helps build the resilience of distributed computing systems and improves their ability to withstand unexpected disruptions. Read on to learn how. Chaos engineering leverages chaos theory to achieve this, introducing random and unexpected behavior in a controlled manner to identify system weaknesses. How does it benefit organizations? By enabling them to identify system vulnerabilities before failures actually occur. As a result, an organization can proactively adopt measures to plug potential vulnerabilities and improve system stability. However, developers associated with a premier software development company use an innovative approach to chaos engineering. ... The concept might look similar to stress testing, but they are not the same. There are some key differences. For one, chaos engineering leverages chaos theory to proactively identify system or network issues and correct them. It also tests and corrects all components at the same time. Here, developers associated with a software development company in New York tend to look beyond possible causes and obvious issues.
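A minimal sketch of the controlled fault injection described above: wrap a dependency so it fails at a configurable rate, then check that the caller's retry and fallback logic keeps the system functioning. The fetch_price dependency and the TimeoutError failure mode are illustrative, not from any particular tool:

```python
import random

def chaotic(fn, failure_rate=0.2, rng=random):
    """Wrap a dependency call so it fails randomly: controlled fault injection."""
    def wrapper(*args, **kwargs):
        if rng.random() < failure_rate:
            raise TimeoutError("injected fault")  # simulate a flaky dependency
        return fn(*args, **kwargs)
    return wrapper

def fetch_price(item):
    return {"widget": 9.99}[item]

def fetch_price_with_fallback(call, item, retries=3, default=None):
    """The system under test: does it stay up when the dependency misbehaves?"""
    for _ in range(retries):
        try:
            return call(item)
        except TimeoutError:
            continue
    return default

random.seed(1)
flaky_call = chaotic(fetch_price, failure_rate=0.5)
print(fetch_price_with_fallback(flaky_call, "widget", default=-1.0))  # 9.99
```

Production chaos tools (e.g. Netflix's Chaos Monkey) apply the same idea at the infrastructure level, injecting failures into live systems under controlled conditions rather than wrapping individual function calls.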


Neither ‘Agile’ nor Architecture are Going Anywhere

Want to move the enterprise to little-a or big-A agile? Want to modernize the technology stack? Implement flex points in subsystems? Integration effectiveness? Harness information for outcomes? Deliver technology services? Event-Driven Architecture? Customer-Centric Design? Manage cross-system compatibility and quality attributes? Handle mergers and acquisitions well? Project/team thinking does not account for these outcomes. The product owner doesn’t understand them and the development lead is focused on speed, simplicity and delivery. They may not understand them either. Architecture connects big outcomes to little decisions. I have seen huge objectives brought low by simple development decisions. ... From the board room to the basement. From idea to outcome. In between operating responsibilities. In between competing business objectives. With partners. With vendors. With an ever-changing technology adoption cycle. From finance to legal to customer impacts, it takes a LOT of facilitation, discussion, decision making and prioritization to deliver a balanced, advantageous technology strategy.


Demystifying Cloud Trends: Statistics and Strategies for Robust Security

The Shared Responsibility Model is a security and compliance framework that defines the responsibilities of cloud service providers (CSPs) and cloud customers for securing every aspect of the cloud environment, including hardware, infrastructure, endpoints, data, configurations, settings, operating system (OS), network controls and access rights. In basic terms, this model helps clarify who is responsible for securing various aspects of the cloud infrastructure, services, and data. The division of responsibilities varies depending on the cloud deployment model. ... Implementing strong IAM practices, enforcing the principle of least privilege to restrict access rights for users and systems, and regularly reviewing and updating access permissions can have a major positive impact on an organization’s cloud security posture. It’s as simple as granting users and other cloud resources access only to the resources they require, and only to the extent required. Multi-factor authentication (MFA) adds an additional layer of security, ensuring that only authorized users have access to resources and data.



Quote for the day:

"We become what we think about most of the time, and that's the strangest secret." -- Earl Nightingale