Daily Tech Digest - March 13, 2023

CFO Cybersecurity Strategies: How to Protect Against the Rising Storm

Think of cybersecurity as an investment in resiliency. Taking a comprehensive approach to cybersecurity increases the odds that your organization will not only identify malicious activity and successfully deflect attackers, but also respond effectively and recover with minimal impact if a worst-case scenario unfolds. However, you need to proactively validate that your company's approach is truly comprehensive. Historically, cybersecurity has been treated as the purview of IT, but the reality is far more complex and pervasive. While IT can manage and mitigate many risks, every leader in an organization has a role to play, from governance and legal to compliance, public relations, and human resources. So does every third party, including your vendors, suppliers, contractors, service providers, and customers. So it's not only about technology, but about people and processes as well. Simply put, cybersecurity is like a tree with a complex root system.


The problem with development speed

Less code, but more impact. That's the formula for success. But it's not what many development teams do. For too many, as Gilad details, "a product group that has two-thirds of the output really [can] create four times the impact." The key, he stresses, is that "most of what we create is a waste, [so] chasing output is actually creating more waste, faster." All of which sounds great, but telling developers to "do more good things and fewer bad things" is hardly actionable. The trick, Gilad outlines, is to introduce more research and testing earlier in the development process, coupled with a willingness to scrap incomplete projects that aren't on track for success. That doesn't mean developers should sit around thinking about success but never shipping. Rather, "you should increase throughput, but not of launches." Instead, focus on running more "tests and experiments." By doing so, you'll end up with fewer projects, but ones with a higher impact. This willingness to shed bad code early can make a huge difference.


Tapping AI to Alleviate Cloud Spend

Vladimirskiy says CIOs and executives overseeing an organization's IT strategy are responsible for evaluating and implementing effective AI-based cloud optimization solutions. Because the efficacy of an AI-based cloud optimization system depends on how well trained the model managing the corresponding workload is, it's not advisable for organizations to start from scratch. "Vendors who focus specifically on this type of optimization will have access to more in-depth data across multiple organizations to train these models and ultimately create successful AI cloud optimization solutions," he says. Diaz agrees that the key stakeholders in implementing AI to manage cloud spending and control costs are primarily in IT management, but adds that finance plays a key role. ... Finance is involved as the final stop when it comes to paying for cloud resources, controlling what portion of the organization's budget goes into both the cloud resources and the AI technology used to help manage the cloud.


Contract-Driven Development – A Real-World Adoption Journey

In our search for an alternate solution, we wanted to try contract-driven development because it seemed to satisfy our initial criteria: parallel development of provider and consumer applications; the API specification is the API contract (instead of two separate artifacts); an automated technique, other than code generation, to ensure providers adhere to the contract; and more emphasis on API design, promoting collaboration among teams in the process. However, the teams were also skeptical about contract-driven development because it again involved API specifications and contract testing, both of which they had already tried and had seen a low return on investment. Still, addressing these concerns was a great starting point for getting the teams started on contract-driven development. We felt the most convincing way of achieving this would be through a real-world example in their context, taken all the way to production. ... To gain more buy-in, we set out to introduce contract-driven development to just a handful of teams working on a feature that cut across two or three microservices and frontend components.
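As a concrete illustration of "the API specification is the contract," here is a minimal provider-side check sketched in Python: it pulls the expected response schema for a hypothetical GET /orders/{id} endpoint out of an OpenAPI file and validates a live response against it. The endpoint, file name, and port are assumptions made for this example, not details from the article, and real contract-testing tools automate far more of this.

```python
# A minimal sketch of verifying a provider against its OpenAPI contract.
# The endpoint (/orders/{id}), file name (openapi.yaml), and port are
# assumptions made for illustration.
import yaml
import requests
from jsonschema import validate

with open("openapi.yaml") as f:
    spec = yaml.safe_load(f)

# The specification itself is the contract: pull the expected response
# schema for GET /orders/{id} straight out of the spec (assumes the schema
# is inlined rather than referenced via $ref).
schema = (spec["paths"]["/orders/{id}"]["get"]["responses"]["200"]
              ["content"]["application/json"]["schema"])

resp = requests.get("http://localhost:8080/orders/42", timeout=5)
assert resp.status_code == 200
validate(instance=resp.json(), schema=schema)  # raises if the provider drifts from the contract
```

Because the schema is read straight from the specification, any drift between the spec and the provider's actual responses fails the check, which is the core feedback loop contract-driven development relies on.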


The importance of measurement in closing the sustainability gap

Good engineering practices, such as edge-caching, optimised data storage, reusability and code efficiency, can almost always have a positive impact on sustainability. Applications that require less compute power use less electricity, which ultimately leads to a net reduction in CO2-equivalent emissions. It is important to take these factors into account when choosing architectural options and following green engineering best practices. The gains may be small at the level of the individual developer but become clearly significant when scaled up to production levels. Quantitative measurement is essential to evidence that improvement. Sustainability has also become part of the DevOps vocabulary as GreenOps, which focuses on improving continuous integration and delivery from the perspective of reducing emissions. A critical part of this role is adding sustainability reports to existing dashboarding approaches, giving organisations a real-time window on that closing sustainability gap. The key is managing customer and organisational objectives throughout, and treating sustainability like a transformation programme.
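To make the call for quantitative measurement concrete, the sketch below estimates CO2-equivalent emissions for a workload as energy consumed multiplied by grid carbon intensity. The per-vCPU power draw, PUE and carbon-intensity figures are illustrative assumptions, not numbers from the article.

```python
# A rough CO2e estimate for a workload: energy (kWh) x grid carbon intensity (gCO2e/kWh).
# All default figures below are illustrative assumptions, not measured values.
def estimate_co2e_grams(vcpu_hours: float,
                        watts_per_vcpu: float = 10.0,            # assumed average draw per vCPU
                        pue: float = 1.5,                         # assumed data-centre power usage effectiveness
                        grid_intensity_g_per_kwh: float = 400.0   # assumed grid carbon intensity
                        ) -> float:
    energy_kwh = vcpu_hours * watts_per_vcpu * pue / 1000.0
    return energy_kwh * grid_intensity_g_per_kwh

# Example: a nightly batch job consuming 120 vCPU-hours
print(f"{estimate_co2e_grams(120):.0f} gCO2e")  # ~720 gCO2e under these assumptions
```

Even a crude figure like this, emitted per pipeline run and pushed to an existing dashboard, is enough to show whether an optimisation such as edge-caching or a storage change is actually closing the gap.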


Fighting financial fraud through fusion centers

The boundaries between cybersecurity and fraud/financial crime have blurred in recent years. Indeed, cyberattacks on financial services are often the first stage of fraud taking place. Take common attacks like phishing or account takeovers, for example. Are these cyberattacks, fraud, or both? And fraud isn't always an immediate process; some fraudulent schemes go on for years. Who has responsibility for what, and when? The truth is that cyberattacks and fraud are now too closely linked to be considered separately. But many firms still have investigative fraud teams and cybersecurity teams operating independently, along with the systems and processes that support them. As a result, these teams have different levels of access to various data repositories, and do not necessarily use the same toolsets to analyze them. That data is arriving at fluctuating speeds, in multiple formats, and in huge volumes. Some firms may have to navigate a complex legacy technology environment to access that data. In short, there is no consistent context within which a unified decision can be made.


Schneider Electric CIO talks IT staffing, sustainability, and digital transformation

“All workers, including IT workers, must have a connection to their company’s mission, and ownership over what their company’s goals and values are. At Schneider Electric, it is important that IT workers understand what we do as a business, in addition to our overarching mission of creating and offering solutions to help our customers. This attitude creates awareness, as well as dedication to their role within IT and the broader company.” “As for the specific traits of these workers, one that is learned is what we call the power couple model — a domain and digital leader — when the business leaders and technology leaders complement each other by playing different roles in solutioning. The domain, or business leaders, are responsible for the 'what' and the 'why,' while the digital leaders are responsible for the 'how' and the 'when.' They do this through leveraging new technology to offer the most efficient solutions to customers and create a beneficial partnership.”


Tech purchasing decisions – how many involved?

The key to good decision-making is involving the department that best understands how the technology will be used, while also ensuring that the leaders who really understand technology are in the room. That means understanding, from the beginning, who needs to be involved in the decision and ensuring they are in the conversation. For those selling the technology, whether through brand awareness or lead generation campaigns, it means realising that targeting a single decision-maker who works in the specific function where their technology is used is the wrong strategy. Yes, those selling, for example, martech need to have the marketing function on board, but they also need buy-in from IT, procurement, sales, finance and HR. ... The business world is becoming increasingly interconnected, meaning the impacts of the decisions leaders make are far more wide-reaching. Whether companies are considering their sustainability strategy, navigating supply-chain risk or purchasing a new CRM system, leaders increasingly need to understand what is happening outside their function and how it impacts them.


Can AI solve IT’s eternal data problem?

Most enterprises today maintain a vast expanse of data stores, each one associated with its own applications and use cases—a proliferation that cloud computing has exacerbated, as business units quickly spin up cloud applications with their own data silos. Some of those data stores may be used for transactions or other operational activities, while others (mainly data warehouses) serve those engaged in analytics or business intelligence. To further complicate matters, “every organization on the planet has more than two dozen data management tools,” says Noel Yuhanna, a VP and principal analyst at Forrester Research. “None of those tools talk to each other.” These tools handle everything from data cataloging to MDM (master data management) to data governance to data observability and more. Some vendors have infused their wares with AI/ML capabilities, while others have yet to do so. At a basic level, the primary purpose of data integration is to map the schema of various data sources so that different systems can share, sync, and/or enrich data. The latter is a must-have for developing a 360-degree view of customers, for example.
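As a toy illustration of schema mapping, the sketch below takes records from two hypothetical silos, a CRM export and a billing system, maps them onto one canonical customer schema, and merges them into a single profile of the kind a 360-degree customer view is built from. All field names and values are invented for the example.

```python
# A minimal sketch of schema mapping across two hypothetical data silos.
crm_record = {"cust_id": "C-1001", "full_name": "Ada Lovelace", "email": "ada@example.com"}
billing_record = {"customer_ref": "C-1001", "emailAddress": "ada@example.com", "mrr_usd": 120.0}

# Field-level mapping from each source schema onto a shared canonical schema.
FIELD_MAPPINGS = {
    "crm":     {"cust_id": "customer_id", "full_name": "name", "email": "email"},
    "billing": {"customer_ref": "customer_id", "emailAddress": "email", "mrr_usd": "monthly_revenue"},
}

def to_canonical(record: dict, source: str) -> dict:
    """Rename a source record's fields to the canonical schema, dropping unmapped fields."""
    mapping = FIELD_MAPPINGS[source]
    return {mapping[key]: value for key, value in record.items() if key in mapping}

# Merge both views, keyed on the shared customer_id, to enrich a single profile.
unified = {**to_canonical(crm_record, "crm"), **to_canonical(billing_record, "billing")}
print(unified)
# {'customer_id': 'C-1001', 'name': 'Ada Lovelace', 'email': 'ada@example.com', 'monthly_revenue': 120.0}
```

Real integration tools do this mapping declaratively and add conflict resolution, lineage and governance on top, but the core act of reconciling divergent schemas into one shared model is the same.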


Navigating Your Data Science Career

Similar to finding your passion as a data scientist, finding new opportunities to diversify your skill set and experience is extremely helpful when trying to grow your career. There are many business sectors that require data science. Many of my retail coworkers have gone on to have great careers in media, finance, supply chain, social platforms, banking, and many other industries. Having a diverse background can open many more opportunities in the future. Not only can a robust background be more attractive to recruiters, it can also be helpful if a market downturn in a given business sector limits future career opportunities. Although exploring different data science fields can be beneficial, as a developing data scientist you often have many opportunities to expand your skill set within your current company. Take retail, for example; data science expertise is required in areas such as marketing, pricing, logistics, and merchandising. Being open to new positions provides the opportunity to gain new industry knowledge and become a more valuable and well-rounded employee.



Quote for the day:

"Leadership is intangible, and therefore no weapon ever designed can replace it." -- Omar N. Bradley
