Daily Tech Digest - July 04, 2023

Banking Tech Forecast: Cloudy, With a Chance of Cyber Risk

The breakneck pace of adoption has also resulted in a shortage of security experts who understand the overlapping yet unique needs of the two industries. The cybersecurity sector has faced a shortage of skilled security professionals for most of its existence. Cloud solutions help mitigate this, because security can be integrated into the infrastructure and managed in a centralized place, Betz said. "Even then, financial institutions are still expected to conduct due diligence and oversight of third parties. This ability to evaluate security in a complex environment requires a high level of skill, which will continue to be highly sought-after," she said. Hiring and retraining existing staff to meet the volume of needed workers is a challenge too, as is a regulatory landscape that pushes for multi-cloud infrastructure for resiliency, Leach said. This means that a financial institution may be required to support multiple cloud service providers that operate differently and take different approaches to securing assets.


Data Mesh: A Pit Stop on the Road to a Data-Centric Culture

The process behind these benefits is also noteworthy. Most importantly, the notion of data decentralization is deceptively simple, and potentially revolutionary. Think of how IT consumerization has upended traditional technology implementation: Where IT specialists once made all the decisions on which tools to buy for business professionals and dictated how all that hardware and software was to be used, those end users now call the shots. They freely buy the devices they want and download the apps they like, then wait for IT to catch up. This provides enormous benefits. With data mesh we’re seeing a similar movement toward data democratization. When line-of-business teams and other constituencies within the enterprise gain unprecedented access to, and even ownership of, business data that was previously guarded, it accelerates collaboration and enables custom strategies to solve specific business problems. Data access also becomes simpler when interfaces and navigation are not just user-friendly but attuned to the priorities of specific functions, rather than taking a more generic, enterprise-wide approach.


5 key mistakes IT leaders make at board meetings

It’s important to avoid speaking in technical jargon, but sometimes you’re asked to define a technical term or explain a technology. One approach both Puglisi and I recommend is to answer technical questions with analogies from your industry. We both worked in the construction industry, so, for example, we might help these executives understand Scrum in software development by comparing it to design-build and agile construction project methodologies. ... Sometimes you need a spark to create a sense of urgency, but don’t take this approach too far. I once heard a CISO say, “If you can’t convince the board, then scare them,” which might get a CISO a yes to an investment, but costs credibility over time. CISOs who are natural presenters and storytellers can connect with the board using these skills, but only if given sufficient time to use this approach. If presenting isn’t your best skill, or you only have a few minutes to present, storytelling may confuse directors, says Tony Pietrocola, president and co-founder of AgileBlue.


Beyond Browsers: The Longterm Future of JavaScript Standards

Developers don’t want to write their code multiple times to run on different server-side runtimes, which they have to do today. It slows down development, increases the maintenance load and may put developers off supporting more platforms like Workers, Snell noted. Library and framework creators in particular are unhappy with that extra work. “They don’t want to go through all that trouble and deal with these different development life cycles on these different runtimes with different schedules and having to maintain these different modules.” That’s a disadvantage for the runtimes and platforms as well as for individual developers wanting to use tools like database drivers that are specifically written for one runtime or another, he pointed out. “We talk to developers who are creating Postgres drivers or MongoDB drivers and they don’t want to rewrite their code to fit our specific API. It would be great if it was an API that worked on all the different platforms.” WinterCG is trying to coordinate what Ehrenberg calls “a version of fetch that makes sense on servers” as well as supporting Web APIs like TextEncoder and setTimeout in a compatible way across runtimes.


EU judgment sinks Meta’s argument for targeted ads

Article 6(1)(b) of the GDPR establishes that the practice can only be justified on condition that, if the data is not processed, the contract between the user and the service operator cannot be fulfilled. This contractual necessity for data processing is usually understood rather narrowly. For example, it enables an online retailer to provide a customer’s address to a courier, which is clearly necessary data processing under the terms of the contract between the store and the customer. Meta had relied on Article 6(1)(b) as its main justification for data processing for targeted advertising, claiming that targeted advertising was part of the service it contractually owes its users – a clause it introduced as part of a change to its terms of service (ToS) made at the stroke of midnight on 25 May 2018, the precise moment the GDPR came into force. Max Schrems, founder of Austria-based data protection campaign group NOYB, argued that Meta seemed to have taken the view that it could just “add random elements” to the contract, i.e. its ToS, covering personalised advertising, to avoid offering users a yes or no consent option.


Cloud Equity Group’s Sean Frank Talks AI Mingling with the Cloud

What AI is going to allow cloud to do is really kind of predict those changes that are needed. So instead of reacting to slowdowns because you’re running out of memory, you’re running out of processing power -- it could predict when those things are going to happen based on analyzing historical data, and then it can allocate the resources dynamically. For a small organization or a small environment, it may not sound like a big task, but as you’re thinking about these larger enterprises that might be running hundreds or thousands of servers that are all running different programs and interacting, if one thing goes down it can affect the rest of the ecosystem. It really is a very important aspect in being able to help make sure that everything is up and can be maintained. From a maintenance standpoint as well, it delivers a very significant benefit. Right now, maintenance in general, in IT and cloud-based infrastructure, is largely reactive. There’s a problem, the user reports the problem, and someone from the IT department will then investigate the problem.
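The reactive-versus-predictive distinction Frank describes can be sketched in a few lines. This is an illustrative toy, not any vendor's API: it fits a least-squares trend to recent usage samples and flags a scale-up before the predicted exhaustion point, rather than after the slowdown has already happened.

```python
# Toy predictive-scaling sketch: extrapolate memory usage from history
# and act before capacity is exhausted (all names here are invented).

def predict_exhaustion(samples, capacity):
    """samples: list of (timestamp_seconds, used_units) observations.
    Returns the extrapolated time at which usage reaches capacity,
    or None if usage is flat or falling."""
    n = len(samples)
    mean_t = sum(t for t, _ in samples) / n
    mean_u = sum(u for _, u in samples) / n
    cov = sum((t - mean_t) * (u - mean_u) for t, u in samples)
    var = sum((t - mean_t) ** 2 for t, _ in samples)
    slope = cov / var  # usage growth per second (least-squares fit)
    if slope <= 0:
        return None
    intercept = mean_u - slope * mean_t
    return (capacity - intercept) / slope

def should_scale(samples, capacity, lead_time, now):
    """Scale proactively if exhaustion is predicted within lead_time."""
    eta = predict_exhaustion(samples, capacity)
    return eta is not None and eta - now <= lead_time

# Memory climbing roughly one unit per minute toward a capacity of 100:
history = [(0, 50), (60, 51), (120, 52), (180, 53)]
```

A real system would of course use far richer models and signals; the point is only that the trigger moves from "usage is already at the limit" to "usage is predicted to hit the limit soon".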


Is Zero Trust Achievable?

Microsegmentation is often the biggest hurdle businesses will face when implementing Zero Trust, and many decide to forgo it. Because the architecture is now based on security need, it can be complex to implement, particularly over on-premises private networks, which tend to be flat and have high levels of implicit trust. Microsegmentation projects can run on for months and failure rates are high. Forrester’s Best Practices for Zero Trust Microsegmentation found that of the 14 vendors who attempted to microsegment their private networks, 11 failed, and concluded that success requires senior management buy-in to oversee the removal of implicit trust between identities. However, there are other issues that can see projects flounder. Many organisations have legacy systems in situ, for instance, that were designed to function on a perimeterised network and so assume trust. Realistically, the business may have to build the ZTNA around these until they are retired or replaced by cloud-based SaaS alternatives, suggests the National Cyber Security Centre in its Zero Trust Architecture Principles.


Data Governance: The Oft-Overlooked Pillar of Strong Data Quality

Ungoverned data typically originates because someone in the organization, such as a data analyst, produces data without establishing strong governance policies to control how the data will be shared, secured, and maintained over time. Data producers may also lack an understanding of the underlying data they are working with and the business rules for aggregating or disseminating various KPIs related to the data. The decision to share data without first sorting out these issues is not usually the result of malfeasance. On the contrary, ungoverned data often emerges because someone in the organization creates data that other people need, and in the rush to ensure that they can start using the data, sharing begins before anyone puts a plan in place for governing the data over the long term. It doesn’t help, either, that it’s common for businesses to have IT solutions in place that make it easy to share data, but not to govern it. The typical IT organization implements software that can store and distribute information across an organization, but the IT department has little or no knowledge of how different business units will use the data.


Why cyberpsychology is such an important part of effective cybersecurity

Cyberpsychologists and enterprise cybersecurity practitioners both stress the need to better understand how people interact with technology to create a stronger cybersecurity posture. They point to statistics showing that most breaches involve some sort of human misstep. Verizon's 2023 Data Breach Investigations Report, for example, found that "74% of all breaches include the human element, with people being involved either via error, privilege misuse, use of stolen credentials or social engineering." As Huffman says, hackers "don't want to go toe-to-toe with your firewall. They don't want to challenge your antivirus, because that's very difficult, not when they can exploit the largest vulnerability on every network on the planet right now -- that's us, people. Cybercriminals are not just hacking computers; they are hacking humans. Because ... unlike computers, we actually respond to propaganda." Psychology gets at why humans do what they do, says Huffman, founder of cybersecurity services firm Handshake Leadership. 


CISA's New 'CyberSentry' Program to Tighten ICS Security

The CyberSentry program is one of several cybersecurity provisions in the National Defense Authorization Act signed by U.S. President Joe Biden in December 2021. The act provided $768 billion in defense spending, including the cybersecurity components of various national and federal agencies. The program aims to support CISA's efforts to defend U.S. critical infrastructure network operators that support national critical functions, such as power and water supply, banks and financial institutions, and healthcare, by monitoring for both known and unknown malicious activity affecting IT and OT networks. The CyberSentry program is based on a mutual agreement between CISA and participating critical infrastructure partners. The program is voluntary and is provided at no additional fee or equipment cost to the participating partners. Under the program, CISA harnesses sensitive government information to provide visibility into, and mitigation of, cyber threats targeting critical infrastructure.



Quote for the day:

"You can only lead others where you yourself are willing to go." -- Lachlan McLean

Daily Tech Digest - July 03, 2023

5 Enterprise Architecture Principles For Digital Transformation

This involves creating the Business Architecture, which includes the Business Domains (BD) and Business Capabilities (BC). To begin, it's important to create a Business Capability Map (BC Map). This map helps assess the organization's current business capabilities and identifies the new BCs needed for the target BC map. The BC Map serves as a foundation for defining strategic areas of focus, such as improving skills, refining products, optimizing services, innovating business models, streamlining processes, setting key performance indicators (KPIs), and estimating implementation costs for Business Capabilities. ... In the realm of digital architecture, where data and application security reign supreme, the inclusion of comprehensive security measures from the very outset is imperative. Such measures need to span the entire architecture, encompassing a multitude of facets including authentication, multi-factor authentication, key management, single sign-on (SSO), authorization, auditing, logging, as well as the encryption of both data in transit and data at rest.
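As a rough illustration of the gap analysis a BC Map supports (the capability names and the 1-5 maturity scale below are invented for the example), comparing current against target maturity per capability is enough to surface where the strategic focus areas, skills, KPIs and cost estimates should concentrate:

```python
# Toy Business Capability Map: each capability carries its assessed
# current maturity and the maturity required by the target BC map.
bc_map = {
    "Customer Management": {"current": 3, "target": 4},
    "Digital Payments":    {"current": 1, "target": 4},
    "Data Analytics":      {"current": 2, "target": 3},
    "Order Fulfilment":    {"current": 4, "target": 4},
}

def capability_gaps(bc_map):
    """Return (gap, capability) pairs ranked largest gap first;
    capabilities already at target are omitted."""
    gaps = {name: levels["target"] - levels["current"]
            for name, levels in bc_map.items()}
    return sorted(((g, n) for n, g in gaps.items() if g > 0), reverse=True)
```

Here "Digital Payments" (gap of 3) would top the investment list, while "Order Fulfilment" needs no new spend; real BC Maps add owners, processes and cost estimates per capability.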


How CISOs can make themselves ready to serve on the board

There’s growing global momentum for public company and corporate boards to also start recruiting directors with relevant cybersecurity expertise, which is part of a broader trend of boards seeking recruits with any kind of technical expertise. A recent study from leadership consulting firm Spencer Stuart reported that about a third of ‘next-gen’ directors aged 50 and younger have technology backgrounds. Chenxi Wang, a longtime cybersecurity expert and venture capitalist, is part of that wave. She was recruited to the board of MDU Resources Group, a US-based energy and construction materials firm, back in 2019. She says the company was attracted both by her cybersecurity acumen and her connections with the high-tech industry in general. “As a pretty traditional and longstanding company, they are actually very conscientious about bringing diverse mindsets and backgrounds to the boardroom,” she tells CSO. “And they wanted someone who’s connected with the West Coast high-tech industry to give them that perspective that a Midwest energy company may or may not have exposure to.”


Mobile Cyberattacks Soar, Especially Against Android Users

In terms of the types of mobile malware circulating out there, Kaspersky saw fewer mobile malware installers and less ransomware in the past year, but more banking Trojans, it stated in "The Mobile Malware Threat Landscape in 2022" report. "Cybercriminals are still working on improving both malware functionality and spread vectors," according to the report. "Malware is increasingly spreading through legitimate channels, such as official marketplaces and ads in popular apps. This is true for both scam apps and dangerous mobile banking malware." To put all of this into perspective, it should be noted that traditional computing platforms still attract the lion's share of the cybercrime pie. Kaspersky, for example, blocked more than 20 million malicious installer, spyware, and adware attacks on mobile devices over the last four quarters, but blocked more than 20 times that number against more common work platforms, such as Windows. However, the mobile threat vector is not as well protected. "In most cases, mobile devices represent a significant, unaddressed attack surface for enterprises," Zimperium's Keating says.


Cloud security: Sometimes the risks may outweigh the rewards

The UK National Cyber Security Centre set out 14 cloud security principles to help businesses of different sizes balance their needs when configuring cloud services securely. Vendor lock-in can be a common issue for businesses. Cloud vendors will give you all the tools to make your life easier, but getting out of their systems if you decide to stop working with them will be really difficult if you rely too much on their infrastructure. Beyond the technological risks, another deciding factor is the general trust toward cloud providers and the “hyperscalers” such as AWS or Google, which can provide public and hybrid cloud services to large enterprise networks. Flexibility and the ability to configure your set-up to your specific needs can be lost on the cloud. If you are running on-premises, you have more flexibility to reconfigure things. Your commercial relationship with your cloud provider will dictate how flexible you can be with your cloud infrastructure, which can prevent you from fixing security issues as they arise. How much control and flexibility you want over your data storage should inform your cloud set-up.


The time to implement an internal AI usage policy is now

Due to the rapid adoption of AI, there is still a lack of national and global legislation, governance, and guidance on using it, particularly in a professional environment. And while common sense and a general understanding of IT security are good places to start, they are not sufficient to rely on. It is increasingly important, therefore, that organisations devote time and resources to developing internal use and misuse policies for employees using AI in the workplace, so that information and its integrity are protected. These policies are only possible with a commitment to staying informed through ongoing research and continuous knowledge sharing; collaboration between AI experts and cybersecurity professionals at other organisations is vital for a comprehensive and proactive approach to identifying and mitigating AI-related risks. Policies need to be reinforced with good training. This starts with regular sessions and skill development for cyber professionals on the current risks of AI and those that may emerge in the future. It is backed up by role-appropriate education for employees throughout the organisation.


5 ways to step outside your comfort zone at work, according to business leaders

Simon Langthorne, head of customer relationship management at Virgin Atlantic, says the biggest transformational steps he's taken during his career are when he's stepped outside his comfort zone to embrace new opportunities. "Every time there's been something new, it's been about, 'How do you go out and find exciting things?'" he says. "It's about looking for opportunities that take you outside the space of your expertise and take you into a place where actually you can lead and grow." Langthorne says one of the big challenges that comes from moving outside your comfort zone is thinking about how you'll deal with senior manager-level conversations. "They're not your peers; they're the guys that you really look up to. But you're also trying to guide and steer them. They're excited by things like artificial intelligence, but they'll look to you as the subject matter expert," he says. Embracing new challenges -- such as taking the lead on emerging technology -- will mean you suddenly have new gravitas in an organization, even if it feels uneasy at first.


How new AI tools like ChatGPT can transform human productivity in the enterprise

Data serves as the lifeblood of modern enterprises, yet extracting meaningful insights from vast amounts of data can be a daunting task. Here, AI technologies such as machine learning and natural language processing come into play, enabling the analysis of data at scale, uncovering valuable patterns and providing actionable insights. For example, AI-powered analytics platforms can process customer data to identify trends, preferences and purchasing patterns, allowing businesses to deliver personalized experiences. ... AI has the potential to augment human decision-making by offering real-time, data-driven recommendations. Business leaders can use AI-powered predictive analytics models to forecast market trends, optimize inventory management and enhance supply chain efficiency. ... AI technologies play a vital role in facilitating seamless collaboration and knowledge sharing among employees, transcending geographical boundaries. For instance, AI-powered virtual assistants can schedule meetings, transcribe conversations and facilitate information retrieval, thereby enhancing teamwork and productivity.
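As a deliberately minimal illustration of the pattern-finding described here (the field names and records are invented, and real analytics platforms layer ML models over far richer features), even a simple aggregation of purchase data starts to surface segment preferences:

```python
# Toy example: which product does each customer segment buy most often?
from collections import Counter, defaultdict

purchases = [
    {"segment": "student", "product": "laptop"},
    {"segment": "student", "product": "headphones"},
    {"segment": "student", "product": "laptop"},
    {"segment": "family",  "product": "tablet"},
    {"segment": "family",  "product": "tablet"},
    {"segment": "family",  "product": "laptop"},
]

def top_product_by_segment(purchases):
    """Most frequently bought product per customer segment."""
    counts = defaultdict(Counter)
    for p in purchases:
        counts[p["segment"]][p["product"]] += 1
    return {seg: c.most_common(1)[0][0] for seg, c in counts.items()}
```

Aggregates like this are the raw material for the personalized experiences the paragraph describes; the AI value comes from doing it at scale and predicting the next preference rather than merely counting past ones.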


8 problematic IT team members — and how to deal with them

When advising companies on digital transformations, Maura Charles, a product and leadership coach, often encounters managers who are unwilling to address toxic employees — and without intervention, the problem grows. “I see bad behavior swept under the rug regularly,” Charles says. “In technology, this often happens because companies hire solely on the basis of technical skills and experience and don’t consider the importance of communication, emotional intelligence, and growth mindset. These so-called soft skills are what make technology initiatives and products successful, though.” Her advice? Seek some outside perspective from another capable manager or other trusted source. “Seeing the impact of all of these types of behaviors on team productivity and morale, I often find that team or leadership retrospectives can help shine a light on challenges,” she says. “When leaders ignore these issues, they tend to fester, and the teams and outcomes suffer. By tackling the issues, you may avoid losing talented employees by showing that the people and their work environment matter.”


Will Your Business Soar With an Industry Cloud?

Campbell advises enterprises to begin their cloud journey by carefully studying their business strategy. “Understand where the most critical differentiation is necessary,” he suggests. “Industry clouds offer an opportunity to build differentiation around the most important capabilities.” An industry cloud can also provide a way to achieve faster time to value in less differentiation-critical capability areas while reducing the amount of focus they require. “It’s important to remember that the evaluation is iterative as the strategy and the market evolves, so continuous monitoring is important,” Campbell warns. Singh recommends looking for an industry cloud that’s tailored to a specific field from a provider with deep area expertise. “Find a provider that not only offers a cutting-edge functional footprint in your industry, but also supports components and best practices from other industries to help create true differentiation,” he suggests. Besides enterprise-wide support, an industry cloud can also be used to address a specific need.


Why companies should consider having a chief trust officer

Should an internal review detail the absence of trust in engagements (internal or external), then perhaps boldness is needed to effect change from within. CISOs and CSOs need to ask the questions posited by Stewart and Harkins, both of whom were at the vanguard of the chief trust officer evolution. Hitch up those pants and either work to expand your own role as necessary or push for the creation of the role for someone else. Regardless of the path taken internally, the chief trust officer's role must be sufficiently broad in scope to encompass the necessary visibility across the entire entity's landscape. In a recent MIT Technology Review piece, Elena Kvochko, chief trust officer at SAP, highlighted that "trust is a clear competitive differentiator -- having a recognized awareness that this is an important function and this is an important direction for the company -- it was critical for our success." In closing, and as noted by Stewart, the role must include demonstrable responsibilities, accountabilities and, above all, the necessary authorities.



Quote for the day:

"The greatest good you can do for another is not just share your riches, but reveal to them their own." -- Benjamin Disraeli

Daily Tech Digest - July 02, 2023

OpenAI, others pushing false narratives about LLMs, says Databricks CTO

“There are definitely the larger providers, like OpenAI, Google, and so on; they have this narrative – and they’re talking in a lot of places about how – first of all, this stuff is super dangerous, not in the sense of a disruptive technology, but even in the sense of ‘it might be evil and whatever’,” Zaharia told ITPro during an interview at Databricks AI and Data Summit 2023. “It’s very sci-fi.” “OpenAI – that’s exactly the narrative they’re pushing – but others as well. “Anytime someone talks about AI alignment or whatever, it’s often from this angle: Watch out, it might be evil. They’re also saying how it’s a huge amount of work to train [models]: It’s super expensive – don’t even try it. “I’m not sure either of those things are true.” Zaharia cited MosaicML – the startup Databricks recently acquired for $1.3 billion – as having trained a large language model (LLM) with 30 billion parameters that’s competitive with GPT-3, and “probably cost like ten to 20 times less” to train.


Ransomware: recovering from the inevitable

There’s no doubt that businesses’ cybersecurity teams are under an immense amount of pressure in the battle against ransomware, but they can only go so far alone. There must be an awareness that it simply can’t be stopped at the source, and that defending against ransomware takes a combination of people, processes and technology. The digital world can appear complex – especially in the case of large enterprise structures – so it can be helpful to stress that the digital world and the real world are not that different. Digital protections such as patching systems, multi-factor authentication, data protection and the risk of insider threats all have real-world counterparts: open windows that need to be locked at night, double-locking your front door, locking away vital items in a safe, and opportunistic break-ins through unlocked windows or doors. However, whilst using a combination of people, processes and technology to minimise attacks is key, some will inevitably slip through the cracks, which is where recovery comes into play.


AI Foundation launches AI.XYZ to give people their own AI assistants

The platform enables users to design their own AI assistants that can safely support them in both personal and professional settings. Each AI is unique to its creator and can assist with tasks such as note-taking, email writing, brainstorming, and offering personalized advice and perspectives. Unlike generic AI assistants from companies like Amazon, Google, Apple, or ChatGPT, each AI assistant designed on AI.XYZ belongs exclusively to its creator, knows the person’s values and goals, and provides more personalized help. The company sees a significant opportunity for workplaces and enterprises to provide each of their employees with their own AIs. ... AI.XYZ is available in public beta and can be accessed on the web with an invitation code. Creators can interact with their AIs through text, voice, and video. A free subscription to AI.XYZ allows users to get started creating their own AI, while a premium subscription for $20 per month allows additional capabilities and customization options. The AI Foundation has collaborated with top research institutions like the Technical University of Munich to create “sustainable AI” for everyone.


TDD and the Impact on Security

Outside-In Test-Driven Development (TDD) is an approach to software development that starts by creating high-level acceptance tests, or end-to-end tests, that demonstrate the desired behaviour of the system from the point of view of its users or external interfaces. It is also commonly referred to as behaviour-driven development (BDD). With Outside-In TDD, the development process begins with writing a failing acceptance test that describes the desired behaviour of the system. This test is usually written from a user's perspective, or that of a high-level component interacting with the system. The test is expected to fail initially, as the system does not yet have the required functionality. Once the first acceptance test is in place, the next step is to write a failing unit test for the smallest possible unit of code needed to make the acceptance test pass. This unit test defines the desired behaviour of a specific module or component within the system. The unit test fails because the corresponding code has not yet been implemented.
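A minimal Python sketch of that outside-in flow, using an invented "checkout" example. In a real session the acceptance test is written first and fails, the unit test is written next and fails, and only then is the implementation added to make both pass; the finished state is shown here for compactness.

```python
# Step 3 (written last): the smallest implementation that satisfies the tests.
def apply_discount(total, rate):
    """Unit under test: discount a basket total by a fractional rate."""
    return round(total * (1 - rate), 2)

def checkout(prices, discount_rate=0.0):
    """High-level behaviour exercised by the acceptance test."""
    return apply_discount(sum(prices), discount_rate)

# Step 1 (written first): acceptance test, from the user's perspective.
# It initially fails because checkout() does not yet exist.
def test_checkout_applies_member_discount():
    assert checkout([10.0, 20.0], discount_rate=0.1) == 27.0

# Step 2 (written second): unit test for the smallest unit behind
# that behaviour, which also fails until apply_discount() is written.
def test_apply_discount():
    assert apply_discount(30.0, 0.1) == 27.0
```

The outer test keeps the work anchored to user-visible behaviour while the inner red-green cycles drive out the individual components.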


Wasm: 5 things developers should be tracking

One of Wasm’s biggest draws is its cross-platform portability. Wasm is a neutral binary format that can be shoved in a container and run anywhere. This is key in our increasingly polyglot hardware and software world. Developers hate compiling to multiple different formats because every additional architecture (x86, Arm, Z, Power, etc.) adds to your test matrix, and an exploding test matrix is a very expensive problem. QE is the bottleneck for many development teams. With Wasm, you have the potential to write applications, compile them once, test them once, and deploy them on any number of hardware and software platforms that span the hybrid cloud, from the edge to your data center to public clouds. A developer on a Mac could compile a program into a Wasm binary, test it locally, and then confidently push it out to all of the different machines that it’s going to be deployed on. All of these machines will already have a Wasm runtime installed on them, one that is battle-tested for that particular platform, thereby making Wasm binaries extremely portable, much like Java.


Getting Started with Data Literacy: Two Tips for Success

How should an enterprise get started? Langer says he “came to the inescapable conclusion that data literacy must start with leaders. Data literacy isn't just for the rank-and-file.” As a litmus test when he starts talking to organizations, he asks about their leader's commitment to data literacy. “I ask them, ‘Is your organization willing to send your leaders to training -- managers, executives, the C-suite, all of them?’ If not, which is often the case, that probably tells you everything that you need to know, because data literacy is very much a cultural transformation. If your leaders aren't all in, then there's almost no point in getting started, to be frank. If employees see their managers not exhibiting a data literacy mindset and data literacy behaviors, they will revert to business as usual.” Langer admits to receiving pushback; executives wonder if data literacy is needed because newer technology such as no-code/low-code or generative AI already make it easier to gain insights.


How Data Observability Helps Shift Left Your Data Reliability

When you consider data observability, the term “shift left” refers to a proactive strategy that involves incorporating observability practices at the early stages of the data lifecycle. This concept draws inspiration from software development methodologies and emphasizes the importance of addressing potential issues and ensuring high quality right from the start. When applied to data observability, shifting left entails integrating observability practices and tools into the data pipeline and infrastructure right from the outset. This approach avoids treating observability as an afterthought or implementing it only in later stages. The primary goal is to identify and resolve data quality, integrity, and performance issues as early as possible, thereby minimizing the likelihood of problems propagating downstream. ... Taking a proactive approach to address data incidents early on enables organizations to mitigate the potential impact and cost associated with data issues. 
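A minimal sketch of what shifting observability left can look like in code (the check names and record fields here are invented): validation runs inside the ingestion step itself, so bad records are quarantined with a reason before they can propagate downstream.

```python
# Quality checks embedded at the start of the pipeline, not bolted on
# after the data has already landed in downstream tables.

def check_record(record, required=("id", "amount")):
    """Return a list of issues found in a single record."""
    issues = [f"missing {f}" for f in required if record.get(f) is None]
    if record.get("amount") is not None and record["amount"] < 0:
        issues.append("negative amount")
    return issues

def ingest(records):
    """Split the stream into clean rows and quarantined rows with reasons."""
    clean, quarantined = [], []
    for r in records:
        issues = check_record(r)
        if issues:
            quarantined.append({"record": r, "issues": issues})
        else:
            clean.append(r)
    return clean, quarantined
```

The same checks could equally run as late-stage audits, but placing them at ingestion is what keeps a single bad upstream batch from silently contaminating every dashboard and model that consumes it.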


Architecting Real-Time Analytics for Speed and Scale

Apache Druid has emerged as the preferred database for real-time analytics applications due to its high performance and ability to handle streaming data. With its support for true stream ingestion and efficient processing of large data volumes in sub-second timeframes, even under heavy loads, Apache Druid excels in delivering fast insights on fresh data. Its seamless integration with Apache Kafka and Amazon Kinesis further solidifies its position as the go-to choice for real-time analytics. When choosing an analytics database for streaming data, considerations such as scale, latency, and data quality are crucial. The ability to handle the full-scale of event streaming, ingest and correlate multiple Kafka topics or Kinesis shards, support event-based ingestion, and ensure data integrity during disruptions are key requirements. Apache Druid not only meets these criteria but goes above and beyond to deliver on these expectations and provide additional capabilities.


Why business leaders must tackle ethical considerations as AI becomes ubiquitous

When it comes to ethical AI, there is a true balancing act. The industry as a whole has differing views on what is deemed ethical, making it unclear who should make the executive decision on whose ethics are the right ethics. However, perhaps the question to ask is whether companies are being transparent about how they are building these systems. This is the main issue we are facing today. Ultimately, although supporting regulation and legislation may seem like a good solution, even the best efforts can be thwarted in the face of fast-paced technological advancements. The future is uncertain, and it is very possible that in the next few years, a loophole or an ethical quagmire may surface that we could not foresee. This is why transparency and competition are the ultimate solutions to ethical AI today. Currently, companies compete to provide a comprehensive and seamless user experience. For example, people may choose Instagram over Facebook, Google over Bing, or Slack over Microsoft Teams based on the quality of experience. 


ChatGPT, compliance, and the impending wave of AI-fuelled content

Despite its convincing rhetoric, ChatGPT is, at times, deeply flawed. Quite simply, its statements can’t always be trusted. This is a reasonably devastating indictment for a tool which invites such vehement scrutiny, and has been acknowledged by OpenAI, who admit that “ChatGPT sometimes writes plausible-sounding but incorrect or nonsensical answers.” ChatGPT has a vast wealth of knowledge because it was trained on all manner of web content, from books and academic articles to blog posts and Wikipedia entries. Alas, the internet is not a domain renowned for its factual integrity. Furthermore, ChatGPT doesn’t actually connect to the internet to track down the information it needs to respond. Instead, it simply repeats patterns it has seen in its training data. In other words, ChatGPT arrives at an answer by making a series of guesses, which is part of the reason it can argue wrong answers as if they were completely true, and give different (incorrect) answers to the same questions.



Quote for the day:

"The mediocre leader tells. The good leader explains. The superior leader demonstrates. The great leader inspires." -- Buchholz and Roth

Daily Tech Digest - July 01, 2023

CERT-In cyber security norms bar use of Anydesk, Teamviewer by govt dept

Cyber security watchdog CERT-In has barred the use of remote desktop software such as Anydesk and Teamviewer in government departments under new security guidelines released on Friday. The guidelines prescribe that government departments use virtual private networks (VPN) for accessing network resources from remote locations and enable multi-factor authentication (MFA) for VPN accounts. "Ensure to block access to any remote desktop applications, such as Anydesk, Teamviewer, Ammyy admin etc," the Guidelines on Information Security Practices for Government Entities said. CERT-In (Indian Computer Emergency Response Team) said the purpose of these guidelines is to establish a prioritised baseline for cyber security measures and controls within government organisations and their associated organisations. Minister of State for Electronics and IT Rajeev Chandrasekhar said in an official statement that the government has taken several initiatives to ensure an open, safe, trusted and accountable digital space.


Navigating Product Owner Accountability in Scrum: Debunking Myths and Overcoming Anti-Patterns

In a misguided attempt to ‘help’ Product Owners with their important responsibilities, some organizations establish two Product Owners for a single Product. However, while this may at first seem helpful, it actually causes a lot of problems for both Product Owners involved. When multiple Product Owners exist, conflicting ideas and visions may arise, diluting the product's direction and impeding progress. ... Instead, the Product Owner can delegate tasks such as creating Product Backlog items, maintaining the roadmap, or gathering metrics to Developers on the Scrum team. However, it is important to note that while the Product Owner may delegate as needed, the Product Owner ultimately remains accountable for items in the Product Backlog as well as the product forecast or roadmap, thus ensuring that there is a single, unifying vision and goal for the product and that the Product Backlog is in alignment with that vision. If the Product Owner is delegating the creation of Product Backlog items to Developers, what does that mean? 


Cisco firewall upgrade boosts visibility into encrypted traffic

“What our competitors are saying is ‘just decrypt everything.’ But we know in the real world, customers refrain from doing that due to data privacy concerns and to meet legal/compliance requirements. Furthermore, decrypting and re-encrypting data requires technical prowess not everyone has, increases the attack surface, and also causes severe performance challenges,” Miles said. EVE works by extracting two primary types of data features from the initial packet of a network connection, according to a blog written by Blake Anderson, a software engineer in Cisco’s advanced security research group. First, information about the client is represented by the Network Protocol Fingerprint (NPF), which extracts sequences of bytes from the initial packet and is indicative of the process, library, and/or operating system that initiated the connection. Second, it extracts information about the server such as its IP address, port, and domain name (for example a TLS server_name or HTTP Host). 
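EVE's actual feature extraction is proprietary, but the core idea, deriving a compact fingerprint from byte sequences in a connection's initial packet, can be illustrated with a toy sketch. The byte offsets and hashing choice here are invented purely for illustration and bear no relation to Cisco's real NPF format.

```python
import hashlib

# Toy illustration of fingerprinting a connection's initial packet:
# select a few byte ranges (offsets invented for illustration) and
# hash them into a compact identifier. Real systems such as Cisco's
# EVE use far more sophisticated, protocol-aware feature extraction.
def toy_fingerprint(initial_packet: bytes) -> str:
    features = initial_packet[0:2] + initial_packet[5:9]  # selected byte ranges
    return hashlib.sha256(features).hexdigest()[:16]

# The first bytes of a TLS handshake record (0x16 0x03 0x01 ...):
# clients built on the same library tend to share these sequences,
# which is what makes the fingerprint indicative of the client.
pkt = bytes.fromhex("16030100f8010000f4")
fp = toy_fingerprint(pkt)
```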


Scrum vs. Kanban vs. Lean: Choosing Your Path in Agile Development

While Scrum is commonly associated with software development teams, its principles and lessons have broad applicability across various domains of teamwork. This versatility is one of the key factors contributing to the widespread popularity of Scrum. Scrum is founded upon the concept of time-boxed iterations called sprints, which are designed to enhance team efficiency within cyclical development cycles. ... Kanban is well-suited for organizations seeking to embrace the benefits of agility while minimizing drastic workflow changes. It is particularly suitable for projects where priorities frequently shift, and ad hoc tasks can arise anytime. Kanban is a flexible methodology that can be applied to various domains and teams beyond software development. ... Lean methodology strongly emphasizes market validation and creating successful products that provide value to users. It is particularly well-suited for new product development teams or startups operating in emerging niches where a finished product may not yet exist, and resources are limited.


3 Ways to Build a More Skilled Cybersecurity Workforce

In addition to insights around highly sought-after skill sets and job titles, OECD's report also reveals that demand for cybersecurity professionals has spread beyond the confines of major urban centers. It calls for a more decentralized workforce to meet demand in underserved areas. ... If companies are to close the skills gap and meet the current demand for cybersecurity workers, they will need to broaden their horizons to account for more nontraditional cybersecurity career paths. In doing so, they will enhance the industry with a broader range of unique experiences and life skills. Recruiting more diverse candidates also allows companies to approach security challenges from different angles and identify solutions that may not have been considered otherwise. When a workforce is as diverse as the cybersecurity threats an organization faces, it can pull from a broader range of professional and personal experiences to more effectively and inclusively protect themselves and their end users.


AI's Teachable Moment: How ChatGPT Is Transforming the Classroom

"Teachers could say, 'Hey my students are really interested in TikTok,' then feed that to the AI," says Liu. "An AI could come up with three analogies related to TikTok that connect students to their needs and interests." Liu believes we absolutely need to acknowledge the immediate threats surrounding AI and its initial impact on teachers, particularly around skills assessments and cheating. One approach he takes is to speak openly with students and acknowledge that AI is the new thing and that we're all learning about it – what it can do, where it might lead. The more open conversations educators have with students, he says, the better. In the near term, students are going to cheat. That's impossible to avoid. YouTube and TikTok are bulging at the seams with tricks to help students avoid plagiarism trackers. In the medium term, Liu believes, we need to reevaluate what it means to grade students. Does that mean allowing students to use AI in assessments? Or changing how to teach topics? Liu isn't 100% sure.


Top 5 Benefits Of Blockchain Technology

Transparency within the Blockchain ecosystem refers to the open visibility of transactions, enabling all participants to validate and verify the recorded data. Unlike traditional systems that rely on centralized authorities, Blockchain operates on a decentralized network, where each transaction is recorded on a public ledger known as the Blockchain. ... Immutability is a cornerstone of Blockchain technology. It guarantees that once a transaction is recorded on the Blockchain, it becomes virtually impossible to alter or tamper with the data. This is achieved through a combination of cryptographic techniques and consensus mechanisms. Blockchain achieves data immutability by using cryptographic hashing. Each transaction is assigned a unique cryptographic hash, which is essentially a digital fingerprint. This hash is created by applying complex mathematical algorithms to the transaction data, resulting in a fixed-length string of characters. Furthermore, Blockchain relies on consensus mechanisms such as Proof of Work (PoW) or Proof of Stake (PoS) to validate and verify transactions.
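The hashing mechanism described above is easy to demonstrate: any change to the transaction data yields a completely different fixed-length hash, which is what makes tampering evident. A minimal sketch using SHA-256 (the transaction fields are made up):

```python
import hashlib
import json

def tx_hash(tx: dict) -> str:
    """Deterministic SHA-256 'digital fingerprint' of a transaction."""
    # Serialize with sorted keys so the same data always hashes the same.
    canonical = json.dumps(tx, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()  # fixed length: 64 hex chars

tx = {"from": "alice", "to": "bob", "amount": 10}
tampered = {"from": "alice", "to": "bob", "amount": 1000}

h1, h2 = tx_hash(tx), tx_hash(tampered)
assert len(h1) == 64 and len(h2) == 64  # fixed-length regardless of input
assert h1 != h2                         # any edit changes the fingerprint
```

In an actual blockchain, each block also embeds the previous block's hash, so altering one historical transaction would invalidate every block that follows it.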


Technical Debt tracking supports projects to “do it right”

For decades, there have been logs of outstanding bugs found in testing but not corrected before the project is implemented. The term technical debt adds the concept that there are consequences to those decisions, and that there are strong reasons to prioritize the follow-up to fix things and clear that list. Most of us are aware of workarounds that were left in place permanently and eventually cost too much. We may have seen a system with poor performance that slowed the work of key workers and/or was missing functionality that impacted the customer experience. All of these are important reasons that technical debt should be cleared up. There are other reasons too. Generally, people do not purposefully create poor designs or code with bugs. ... One of the interesting concepts that has been offered by Martin Fowler is the Technical Debt Quadrant that talks about the prudent but inadvertent technical debt that is created as we learn during a project and realize how the project should have been done.


Successful digital transformation requires simplistic thinking

While organizations are aggressively pursuing transformation goals, Chaudhry warned that antiquated mindsets and a range of internal factors can seriously inhibit innovation and prevent businesses from achieving their goals. Most notable among these is a complacent culture among some IT leaders who are stuck in a loop of traditional, outdated practices. “IT plays the most important role in driving transformation. You play the most important role, but you also need to act fast to drive change,” he said. “You can’t sit back and say ‘this is how things have been done for the last 30 years, so let’s keep doing so’.” ... Inertia, as he puts it, is a powerful inhibitor that locks IT leaders and organizations into an outdated mindset which prevents them from embracing change. “Inertia is powerful, and it holds you back because we are comfortable with what we’ve been doing for the last 10, 20 years or so,” he said. Research has often identified inertia as a common inhibitor in digital transformation, whereby teams are reluctant - or unwilling - to accept change.


Strategies to drive the Data Mesh cultural transformation

It’s important to have consistent and clear communication to ensure that everyone understands the reasons and the effects of change. Leaders must communicate the vision and benefits of Data Mesh. They also need to guide on how the new ways of working are going to be adopted through well-defined structures, roles and responsibilities for the new data product teams. To ensure data product ownership and accountability, defining clear KPIs and metrics for each data product team to measure success and track progress is critical. ... Rather than trying to adopt Data Mesh all at once, organizations can start with small pilot projects and gradually expand. This approach can help understand how processes defined in vitro work in real life. It also comes with lessons learned which help followers avoid the initial mistakes. ... This ensures that everyone in the organization understands the new concepts and ways of working. It could include training sessions and coaching on Data Mesh, product thinking, design, user research, agile methodologies, cross-functional team collaboration, and data product ownership.



Quote for the day:

"Leaders need to strike a balance between action and patience." -- Doug Smith

Daily Tech Digest - June 30, 2023

3 things that make a CIO-CFO dream team

A study conducted by Gartner, detailed in the report “CIOs: Improve How You Collaborate With Your CFO,” found that when CFOs are asked how well their most senior IT executive understands the impact of technology on finance, more than half indicate that their IT counterparts are lacking in this area. But surprisingly few companies choose CIOs for their financial skills. “Financial knowledge is not something clients typically ask for when recruiting a CIO,” says Thistle. “However, the CIO will be expected to understand and manage IT costs and budgets, both Capex and Opex.” ... “Even when there is a CFO of IT, the person at the top, the CIO, still needs to understand finance,” he says. “Most CIOs don’t have the benefit of a background in finance. I’ve never met a CIO who went into IT to manage money, yet that’s what they have to do. They have to run IT like a business within a business.” The biggest challenge is not getting a handle on cost, but on value. CIOs can easily show cost on a general ledger. But estimating the future value of technology is more art than science. Investment decisions need to be driven by business outcomes that can be measured and, ideally, monetized. 


‘Shadow’ AI use becoming a driver of insider cyber risk

“People don’t need to have malicious intent to cause a data breach,” said Ray. “Most of the time, they are just trying to be more efficient in doing their jobs. But if companies are blind to LLMs accessing their back-end code or sensitive data stores, it’s just a matter of time before it blows up in their faces.” Insider threats are thought to be the underlying reason for almost 60% of data breaches, according to Imperva’s own data, but many are still not properly prioritised by organisations since a not insignificant number of them are simply cases of human error – a recent study by the firm found that 33% of organisations don’t perceive insiders as a significant threat. Ray said trying to restrict AI usage inside the organisation now was very much a case of shutting the stable door after the horse had bolted. “Forbidding employees from using generative AI is futile,” said Ray. “We’ve seen this with so many other technologies – people are inevitably able to find their way around such restrictions and so prohibitions just create an endless game of whack-a-mole for security teams, without keeping the enterprise meaningfully safer.”


Generative AI may help make 'low-code' more 'no-code' - but with important caveats

AI will ultimately serve "as a way to enable low-code and no-code environments," says Leon Kallikkadan, vice president of technology at Atrium. "I also think that as other partnerships can come onboard it will make low-code and no-code more of a possibility. I believe it will be a phased approach whereby as you, the human developer builds it, an AI component will start creating a vision or future step. The long-term possibilities depend on how deep the integration is, but yes, it can go that far to become a low-code, no-code environment." No- and low-code solutions may be a good fit for non-technical users. "Low code is more geared towards non-coders," says Jesse Reiss, CTO of Hummingbird. "It provides organizations with the ability to reimagine business processes without obtaining steep IT expertise. This is crucial for small- to medium-sized businesses, especially during the ongoing labor challenge where they can be short-staffed or do not have the resources to support business operations." Generative AI is more suitable for development work requiring high-level expertise, experts state.


Top Issues Architecture Leaders Need to Address in 2023

Over the next five years, leaders need to be aware that the architect resource shortage will not improve. Resources may be unavailable in the marketplace as you look to refill your bench. Today, there are 10 to 20 open positions for every available architect looking for a job, and the future job market looks bleak. This resource shortage means architecture leaders will either need to develop the skills and experiences internally or they will need to look at how they utilize technology to do more with fewer people, and most probably a combination of both. Whether you're looking to do more with less or to train new architects, determine now how to maintain the tribal knowledge of your senior architects. ... Most of today’s architects analyze in Excel or the standalone modeling tools they work in. When architects are only looking at a minimal set of information, they are missing the broad operational data available across the organization, which is found in systems like CMDBs, CRMs, ERPs, HR solutions, and facility management systems, and which provides critical insight into manufacturing processes, business processes, org structures, costs, and more.


SEC notice to SolarWinds CISO and CFO roils cybersecurity industry

The move by the SEC will make CSOs more individually accountable for cybersecurity, said Agnidipta Sarkar, a former CISO of pharmaceuticals company Biocon. "Though it doesn’t mean that the CISO has been charged, it is a new milestone. From today onwards, CISOs will increasingly be made accountable for the decisions they take or did not take," Sarkar said. However, attributing blame solely to the CISO or CFO might not always be fair or accurate, said Ruby Mishra, CISO at KPMG India. "In order to manage cybersecurity effectively, the organization adopts a multilayered approach involving various stakeholders and departments. Holding the CISO or CFO solely responsible for a cyberattack may overlook the collective responsibility," Mishra said. ... "Before issuing the notice, the SEC may have considered a variety of factors, including specific circumstances, and legal frameworks, or may have demonstrated negligence if CISO failed to implement adequate security measures, neglected SEC policies, guidelines, and practices, or ignored known vulnerabilities," Mishra said.


3 Initiatives CISOs Can Take for Their Security and Resilience Journey

Businesses can help reduce the risk of a data breach by creating the right cyber defense and recovery plans. This comprehensive strategy should include the following: A risk assessment of the IT environment’s threat landscape; An incident response plan that defines in detail the procedures to follow after a breach; A business continuity plan that outlines how to recover from a breach as quickly and gracefully as possible. According to the U.S. Department of Defense, “zero trust” means that organizations should “never trust, always verify” (DOD CIO, 2022). Rather than granting indiscriminate access to applications, devices, and other IT assets, businesses should give users only the resources they need when they need them. In a zero-trust approach, all users, devices, and applications are treated as potentially compromised, with the organization’s defenses locked down accordingly. Techniques may include strict access controls, multifactor authentication (MFA), and monitoring user activities. Certified CISOs should act to define a zero-trust strategy that aligns with the organization’s IT governance and compliance requirements.
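The "never trust, always verify" posture described above can be sketched as a per-request decision: identity, MFA, and device health are checked on every access rather than once at a perimeter. All policy rules and names in this sketch are illustrative.

```python
from dataclasses import dataclass

# Illustrative zero-trust access check: every request is evaluated
# against identity, MFA, and device posture, with no implicit trust
# carried over from earlier requests. Policy rules are made up.
@dataclass
class AccessRequest:
    user: str
    mfa_passed: bool
    device_compliant: bool
    resource: str

def authorize(req: AccessRequest, entitlements: dict) -> bool:
    if not req.mfa_passed:        # verify the user, every time
        return False
    if not req.device_compliant:  # verify the device, every time
        return False
    # Least privilege: only resources the user is explicitly granted.
    return req.resource in entitlements.get(req.user, set())

entitlements = {"alice": {"payroll-db"}}
assert authorize(AccessRequest("alice", True, True, "payroll-db"), entitlements)
assert not authorize(AccessRequest("alice", False, True, "payroll-db"), entitlements)
```

Real deployments fold in many more signals (location, time, behavioral anomalies), but the shape is the same: deny by default, grant narrowly, re-evaluate continuously.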


Proxmox 8: New Features and Home Lab Upgrade Instructions

Proxmox VE (Proxmox Virtual Environment) is an open-source server virtualization management solution allowing users to manage virtual machines, such as Windows or Linux machines and Linux containers. It’s based on the Debian Linux distribution and combines two virtualization technologies, KVM (Kernel-based Virtual Machine) and LXC (Linux Containers), managed through a web-based interface. The Proxmox platform is used in virtual environments to improve efficiency and ease management tasks. It allows users to deploy, manage, and monitor virtual machines (VMs) and containers, network settings, storage systems, and more, all from a single, integrated platform. Proxmox also provides high-level features like live migrations of VMs without downtime, high availability, or automated backups, making it a robust choice for managing virtual environments, whether for small businesses or larger enterprises. Its open-source nature allows for active community involvement and provides a cost-effective solution for virtualization needs.


Secret CSO: Dan Garcia, EnterpriseDB

It’s important to surround yourself with people who are there to support you and push you to be the best that you can. Having a strong support system is vital. Along the way I had many mentors, some of whom played an important role for where I was at the time. Mandy Andress, who is the CISO at Elastic, provided me with my opportunity within Security Operations at MassMutual and I’ll always be grateful for that chance. ... Balance. Information security is one of the few areas within the business that cuts through multiple departments, functions, skill sets, and problems that need attention. Finding the balance of where to spend your time and resources can be challenging, but it’s an important thing to get right in order to most effectively solve organisational problems. ... Hiring within information security will always be challenging. We’re not just looking for technical skills, but also an individual’s experience, their past organisations’ security posture, and how those companies approached processes and program structure.


Inside the race to build an ‘operating system’ for generative AI

The operating-system analogy helps to illustrate the magnitude of the change that generative AI is bringing to enterprises. It is not just about adding a new layer of software tools and frameworks on top of existing systems. It is also about giving the system the authority and agency to run its own process, for example deciding which LLM to use in real time to answer a user’s question, and when to hand off the conversation to a human expert. In other words, an AI managing an AI, according to Intuit’s Srivastava. Finally, it’s about allowing developers to leverage LLMs to rapidly build generative AI applications. This is similar to the way operating systems revolutionized computing by abstracting away the low-level details and enabling users to perform complex tasks with ease. Enterprises need to do the same for generative AI app development. Microsoft CEO Satya Nadella recently compared this transition to the shift from steam engines to electric power. “You couldn’t just put the electric motor where the steam engine was and leave everything else the same, you had to rewire the entire factory,” he told Wired.
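The "AI managing an AI" idea, picking a model per request and escalating to a human when confidence is low, can be sketched as a thin routing layer. The model names, thresholds, and confidence signal here are all hypothetical.

```python
# Sketch of a routing layer in the spirit of a generative-AI
# "operating system": choose a model per request and hand off to a
# human when confidence is low. Names and thresholds are hypothetical.
def route(question: str, confidence: float) -> str:
    if confidence < 0.5:
        return "human-expert"  # hand off the conversation
    if len(question) > 200:
        return "large-model"   # complex query: route to a bigger LLM
    return "small-model"       # cheap, fast default

assert route("What's my balance?", 0.9) == "small-model"
assert route("x" * 300, 0.9) == "large-model"
assert route("Dispute this charge", 0.2) == "human-expert"
```

Production routers score requests with a classifier rather than string length, but the abstraction is the point: callers ask a question, and the layer decides which engine answers it.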


A Perfect Wave: Event Driven Business Architecture

In general, in traditional IT, data used to be hidden behind fortified castle walls: access was difficult, and the main purpose was to store the data securely. This is changing. Modern IT has started to act as a nervous system, ensuring that data is made available as soon as possible where it is needed, and that it can be used immediately to gain an advantage based on fully up-to-date information. Let’s have a quick look at three customer citations that describe very well why customers move to Event Driven Business Architecture: “We need to move at the speed of business“, Scott, IT, Fortune 500 customer, translating to: everything has become so much faster and we need to be able to support our business; “We want our ERP to be a team player“, Derrick, Fortune 500 customer, translating to: player skills don’t just add up in a team sport, they multiply. This is why your ERP talking to your SuccessFactors talking to your Ariba in real time is so important. It adds lots of value; “It’s a sin“, Alex, Automotive Supplier, translating to: it is a sin not to use your business data. Don’t just hide it and lock it away so that nobody can use it, as is often still done.
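The "nervous system" behaviour described above, where data is pushed to every interested system the moment it changes, is the essence of publish/subscribe. A minimal in-process sketch (the topic and consumer names are invented):

```python
from collections import defaultdict
from typing import Callable

# Minimal in-process publish/subscribe bus: producers emit events, and
# every subscribed system reacts immediately instead of polling a
# locked-away data store. Topic and consumer names are invented.
class EventBus:
    def __init__(self):
        self._subscribers: dict = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        for handler in self._subscribers[topic]:
            handler(event)  # each consumer sees the event as it happens

bus = EventBus()
received = []
bus.subscribe("order.created", lambda e: received.append(("erp", e)))
bus.subscribe("order.created", lambda e: received.append(("procurement", e)))
bus.publish("order.created", {"order_id": 42})
```

In an enterprise setting the bus would be an external broker rather than an in-process object, but the multiplying effect the quotes describe is the same: one event, every interested system notified at once.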



Quote for the day:

"There is no substitute for knowledge." -- W. Edwards Deming