Daily Tech Digest - October 31, 2022

Vaadin CEO: Developers are the architects of the future

Developer passion for a project is the best barometer of its utility and potential. Developers love that Vaadin provides components and tools that make it faster and easier to build modern web apps with a great UX. End users in the enterprise now expect an intuitive and pleasing user experience, just like the one they are used to in their personal lives as consumers. I’m excited that we are making it easy for developers to deliver a compelling UX for the Java-based applications powering the enterprise. ... Last year, the competition for tech talent was fierce, and many companies learned the hard way that if you’re not intentional about your culture, your talent can quickly find a new workplace that better meets their needs. Part of that is investing in understanding what developers do, understanding the technologies they use, listening to pain points, and smoothing the day-to-day path so they can do what they do best. You need to offer flexibility—not just in terms of work/life balance and autonomy, but also maintaining an agile enough environment to account for new tool preferences and process efficiencies as determined by the developers.

Raising the resilience of your organization

Rather than continually tell teams what to do, leaders in resilient organizations minimize bureaucracies and foster entrepreneurship among and within teams. They nearly always put decision making in the hands of small cross-functional teams, as far from the center and as close to the customer as possible. They clarify the team’s and the organization’s purpose, provide some guardrails, and ensure accountability and alignment—but then they step back and let employees take the lead. The Disney theme parks provide a good example: every employee is dubbed a cast member, and their clear objective is to create “amazing guest experiences” within a set of guardrails that includes, among other responsibilities, ensuring visitor safety and fostering family-friendly behaviors. ... Another characteristic of resilient organizations is their ability to break down silos and use “tiger teams” to tackle big business problems. These are groups of experts from various parts of an organization who come together temporarily to focus on a specific issue and then, once the issue is addressed, go back to their respective domains.

Government ups cyber support for elderly, vulnerable web users

The Department for Digital, Culture, Media and Sport (DCMS) said research had shown many people struggle to engage with and benefit from the range of digital media literacy education that is available, for reasons such as limited experience or lack of confidence in going online, lack of awareness of how to access such education, and a lack of available provision. It created the Media Literacy Taskforce Fund earlier this year as one of two funding schemes pitched at targeting hard-to-reach or vulnerable groups through community-led projects. The other scheme, the Media Literacy Programme Fund, is set to deliver training courses, online learning, tech solutions and mentoring schemes to vulnerable web users. Grant recipients from the first fund include: Fresherb, a social enterprise working with young people to develop podcasts – aired on local radio stations – that explore issues around online dis- and misinformation; and Internet Matters, a Manchester-based charity providing media literacy training for care workers and school leavers.

Dell, AMD, IBM, and Strangeworks Dig into Quantum’s Future

Besides simply getting quantum computers to work, there are many questions about what financial applications will actually be able to do better on quantum systems than on classical resources. Optimization is often touted, but perhaps wrongly so, said IBM’s Prabhakar. “There are basically three areas which are interesting: there’s simulating nature, there is optimization, and then there is machine learning,” said Prabhakar. “Initially, when we started working with banks, right – JPMC is a client of ours as is Goldman Sachs and others – they started looking at optimization. But it was very clear that there are classical methods which are actually advancing just as quickly in optimization. So, if you’d asked me three years ago, my answer would have been optimization. Now we are realizing it’s not optimization, it’s probably machine learning-based analysis which is going to have the first early use cases. If you look at the work our clients are doing, you will see that a lot of them are actually not making the decision right now on which of these three areas they want to work on.”

Tech in turmoil: Talent disruption in India’s IT sector and the ‘M’ word

The issue of ‘moonlighting’ has been doing the rounds of late. It got wider attention after Wipro chief Rishad Premji equated it to cheating. Several other companies have also raised their concerns, with a few of them even sacking employees for moonlighting. Wipro recently fired 300 employees for moonlighting. Additionally, it has announced that its offices will be open four days a week, with employees needing to attend in person at least three days a week. This was done to adopt a flexible approach while helping teams build meaningful relationships at work. CN Ashwath Narayan, the IT minister of Karnataka, asked those who moonlight to leave the state, saying freelancing beyond office hours is “literally cheating”. IT and tech giant IBM, too, sent a strong note to its employees over moonlighting. In a note to employees, Sandip Patel, India and South Asia head of IBM, wrote: “A second job could be full time, part time or contractual in nature but at its core is a failure to comply with employment obligations and a potential conflict of interest with IBM’s interest.”

Cyber Ratings as Measures of Digital Trust

Companies have added cyber reputation management practices to their cybersecurity organizations to manage these public cyber ratings. Several firms provide services in the security rating category, each with its own models and algorithms. Cybersecurity teams subscribe to one or more of these services to manage the data used in such ratings. They also subscribe to monitor their ever-growing list of third parties and look for weaknesses that might bring about cyber incidents. Proxy advisors use these same rating services to supplement financial data in annual proxy statements. Other use cases include cyber insurance underwriters looking for evidence-based reasons to reject applications and assist clients in improving their control environments. At the core, such practices increase trust in the digital economy. Credit ratings are an apt model for implementing trust models. Like corporate credit ratings, they can be done with or without involvement from the rated entity. They can also be made available to the public (like a security rating). In-depth analyses can also be done and shared privately (think of this as a pen test or security assessment).

Introduction to Cloud Custodian

By automating a lot of the tedious policy management away, Cloud Custodian could reduce risk and accidents through more streamlined cloud governance. “It solves the natural problems when infrastructure is in everyone’s head,” says Thangavelu. By aggregating ad-hoc scripts and unifying policies across an organization, you could immediately instigate new rules without manually reminding all members of an organization, which could take years. For those familiar with Open Policy Agent (OPA), you may notice some overlap in the objectives, as both are engines for enacting cloud-native policies. Compared to OPA, Cloud Custodian has some developer experience perks. For one, you don’t have to use Rego, as the policies are written in YAML, which is a familiar configuration language for DevOps engineers. Cloud Custodian uses abstractions on event runtimes for each cloud provider. Furthermore, compared to OPA, you don’t need to bind the engine for a particular problem domain, says Thangavelu, as Cloud Custodian is specifically bounded to cloud governance and management.
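To make the YAML point concrete, here is a minimal sketch of what a Custodian policy can look like. The resource type, tag name, and action below are illustrative choices, not taken from the article:

```yaml
policies:
  - name: stop-untagged-ec2        # hypothetical policy name
    resource: aws.ec2
    filters:
      - "tag:owner": absent        # flag instances missing an owner tag
    actions:
      - stop                       # enforce the rule automatically
```

A policy like this replaces the ad-hoc "remember to tag your instances" reminders with a rule the engine enforces across the whole organization.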

Data Security Takes Front Seat In Industrial IoT Design

Even apparently mundane applications, such as smart home utility meters, are targets of opportunity for thieves looking to steal power from the grid. Thankfully, data breaches have appeared in the headlines often enough that customers are awakening to the need for security as a core component of their technology solutions. Increasingly, my group is engaging earlier in the design process to help customers better understand how to adequately provision and scale their devices with a combination of hardware-accelerated cryptography, secure key storage, and some form of physical protection. ... It is important to understand which tools to use and when, starting with the basic building blocks and moving on to complete solutions. It’s the equivalent of buying a four-digit combination lock from the hardware store. They all come set to “zero,” and we help customers find the best way to program the lock. As the world’s devices, systems, appliances and IIoT networks adopt more technology layers, it’s crucial to ensure that, as these building blocks are assembled, their attack surface is as small as possible.

The Challenges of Ethical Data Stewardship

A real challenge for businesses is how to manage third-party data: a company often has no control over how that data was gathered or how its partners use the data it provides. Another challenge that is often overlooked is data that comes as part of an acquisition of a company. How do you extend ethical controls to that? Langhorne believes that Precisely is aligned in thinking about these different aspects, needs and requirements. Although she is new to the company, she sees Precisely’s privacy journey continuing to mature. In the short term, she says, “we have principles that we follow and compliance expectations. In considering the suppliers of our data, we do due diligence. We also have contracts with obligations, and we are very concerned about the quality of our data because that is part of our brand promise. It is about data integrity, and that means having data you can trust. These same principles extend to our M&A activity, an area we have a lot of experience in having acquired seven businesses in the past two years!”

Technology Feeds Sustainable Agriculture

For now, Ganesan says, the biggest problem is the lack of a holistic loop to address sustainability. While farmers are open to innovation, particularly through technology, they often have difficulty embracing sustainability because their suppliers -- which sell seeds, chemicals, and other products -- are slow to adopt lower carbon practices. “We’re seeing progress but it’s still a bit of the wild, wild west,” he says. “In many cases, there’s a lack of coordination within industries and by governments.” There are some encouraging signs, however. For instance, in February 2022, the United States Department of Agriculture announced that it is investing US $1 billion in companies in order to reduce greenhouse gas emissions and fuel innovation related to climate-based technologies. Ganesan believes that the agricultural industry has only scratched the surface about what’s possible with precision technology and analytics. “A key is to develop technologies that not only drive improvements but also create value for farmers. This will accelerate sustainability,” he notes.

Quote for the day:

"Don't be buffaloed by experts and elites. Experts often possess more data than judgement." -- Colin Powell

Daily Tech Digest - October 30, 2022

FCA examining big tech disruption of financial services

Warnings of financial services sector competition being harmed have been expressed by the UK regulator, as Amazon, Apple, Google and Meta look to continue innovating in the industry, reports the Financial Times. The FCA will ask the corporations — which all hold FCA permits for payment processing in the UK — for their perspectives on how Silicon Valley could expand into payments, deposits, credit and insurance. All four companies hold payment permissions, with Amazon and Apple also having some permissions regarding consumer credit and insurance. While the watchdog acknowledges that big tech involvement in financial services would bring “increased efficiency” and “healthy competition” in the short term, it states that this could lead to longer-term exploitation of ecosystems and data stores to “lock consumers in”. The body has also suggested that tech companies generally should share customer data with traditional financial service institutions.

The ins and outs of migrating SQL Server to the cloud

A homogeneous migration between an on-prem version of SQL Server and the RDS equivalent can actually be relatively easy, says Ayodele. The only change necessary is an alteration of the system schema. RDS has built-in stored procedures for management purposes that are not in the on-prem SQL Server engine. So, customers can simply migrate the database itself to avoid corrupting the RDS system schema. The next step will be to use native tools or AWS Database Migration Service (AWS DMS) to port the data across from the source to the destination. With AWS DMS, the source database remains operational during this process to minimize downtime. DMS can use change data capture (CDC) technology to keep track of ongoing changes in the source database during migration. Once the migration is done, the final steps will be to run the RDS version as a replica and then switch over to the RDS primary database instance when ready. There are some best practices that customers should follow when migrating, Ayodele says. 

Big 3 Public Cloud Providers Highlight Cost Control Capabilities

"The long-term trends that are driving cloud adoption continue to play an even stronger role during uncertain macroeconomic times," Pichai said. "As companies globally are looking to drive efficiencies, Google Cloud's open infrastructure creates a valuable pathway to reduce IT costs and modernize." Using the cloud to do more with less was also a theme echoed by Microsoft CEO Satya Nadella during his company's earnings call. Moving to the cloud helps organizations align their spend with demand and mitigate risk around increasing energy costs and supply chain constraints, Nadella said. Microsoft is also very optimistic about the growth of hybrid cloud in addition to public cloud services. Nadella said that Microsoft now has more than 8,500 customers for its Azure Arc technology, which is more than double the number a year ago. "We're also seeing more customers turn to us to build and innovate with infrastructure they already have," he said. "With Azure Arc, organizations like Wells Fargo can run Azure services, including containerized applications across on-premises, edge, and multicloud environments."

Delivering visibility requires a new approach for SecOps

The biggest challenge most organizations face when operationalizing these frameworks is the fact that the data/information they need is siloed across multiple data systems and cybersecurity tools. Security data lives in multiple places, with organizations using a variety of data logging systems like Splunk, Snowflake, or other data lakes as the foundation for threat hunting and research. The security operations center (SOC) will layer platforms for Security Information and Event Management (SIEM), Extended Detection and Response (XDR), and other tools on top of these data lakes to help analyze data and correlate events (e.g., CrowdStrike Falcon Data Replicator or email security systems, such as Proofpoint or Tessian). Security analysts can spend hours exporting data from these systems, tagging and normalizing the information, and ingesting that data into their SIEMs and SOARs before they can begin to detect, hunt, triage, and respond to threats. But next-gen tools are being developed that address this exact issue with automation and machine learning.
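The normalization step described above can be sketched in a few lines. The raw event shapes and field names below are hypothetical stand-ins, not the vendors' actual schemas; the point is only the pattern of mapping tool-specific fields onto one common schema before ingestion:

```python
from datetime import datetime, timezone

# Hypothetical raw events as they might arrive from two different tools;
# the field names are illustrative, not the vendors' real log formats.
crowdstrike_event = {"DeviceId": "host-01", "Timestamp": 1667185200, "Action": "blocked"}
proofpoint_event = {"host": "mail-gw", "ts": "2022-10-31T03:00:00Z", "disposition": "quarantined"}

def normalize(event):
    """Map tool-specific fields onto one common schema for the SIEM."""
    if "DeviceId" in event:  # endpoint-style event with an epoch timestamp
        return {
            "host": event["DeviceId"],
            "time": datetime.fromtimestamp(event["Timestamp"], tz=timezone.utc).isoformat(),
            "outcome": event["Action"],
        }
    # email-style event with an ISO 8601 timestamp
    return {
        "host": event["host"],
        "time": event["ts"].replace("Z", "+00:00"),
        "outcome": event["disposition"],
    }

for e in (crowdstrike_event, proofpoint_event):
    print(normalize(e))
```

Once both events share the same `host`/`time`/`outcome` fields, correlation queries no longer need per-tool logic.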

The growing role of audit in ESG information integrity and assurance

Like audits of financial statements and the internal control over financial reporting, third-party assurance enhances the reliability of ESG information and builds confidence among stakeholders. To do this, auditors conduct attestation engagements to provide assurance that ESG information is presented in accordance with certain criteria. “We help management and the board feel confident in the reported ESG information, which is important given the increased focus and attention from external stakeholders,” explains Whittaker. More specifically, ESG assurance obtained from a certified public accountant, “involves the evaluation of processes, systems, and data, as appropriate, and then assessing the findings in order to support an opinion based on an examination [reasonable assurance] or conclusion based on a review [limited assurance],” according to the Center for Audit Quality. Because companies are at different stages of their sustainability journeys, the breadth of ESG assurance engagements is vast.

If you’re going to build something from scratch, this might be as good a time as in a decade

The environment for launching a start-up was really crazy the past five years. And the truth is that if you’re going to build something from scratch, this might be as good a time as you’ve had in a decade. Real estate? You can get all the real estate you want. People used to fret about lease cost, but that’s all gone. And while people get caught up on whether the money’s cheap or not, getting rid of the distraction of all that cheap money may be a good thing. That whole mentality of, oh, your competitor raised $100 million, now you have to raise $100 million. All those things have evaporated—for the better, I’d say. A huge thing is that your access to talent is way better. It was so hard to get, but now it’s a lot cheaper than it was. There are layoffs happening. And then hybrid has opened up the people you can get. I’ve heard some pretty amazing stories. Jennifer Tejada, who runs PagerDuty, says they went into the pandemic at 85 percent Bay Area employees and came out at 25 percent. 

Good Governance: 9 Principles to Set Your Organization up for Success

Good corporate governance requires that records and processes are transparent and available to shareholders and stakeholders. Financial records should not be inflated or exaggerated. Reporting should be presented to shareholders and stakeholders in ways that enable them to understand and interpret the findings. Transparency means that stakeholders should be informed of key corporate contacts and told who can answer questions and explain reports, if necessary. Corporations should provide enough information in their reports so that readers get a complete view of the issues. ... All too often, the corporate world’s focus can be taken up by sudden crises and controversies. A timely response to the unexpected is crucial, with corporations that practice good governance usually able to prioritize swift and honest communication with shareholders and stakeholders. ... Many corporations also consider the environmental impact as they perform their duties and responsibilities. 

How a Marketing Tool is Becoming the Healthcare Industry’s Security Nightmare

“Consumer activity tracking for the purpose of marketing is not a fit for the health sector,” Mike Hamilton, CISO of cybersecurity firm Critical Insight and former CISO for the city of Seattle, argues. “Because of regulatory oversight by the [US] Department of Health and Human Services, as well as the privacy statutes coming out of states, like the California Consumer Privacy Act, this is not information that is germane to the health sector mission, and its possession creates significant liability.” ... “This could have been prevented by the use of other analytic tools to understand patient usage rather than a marketing technique that is designed to gather and share so much information that is outside the scope of the intended purpose.” “At least dozens of the nation's top hospitals use tracking pixels for millions of patients. That may be changing fast due to new laws and lawsuits that will force organizations to change course drastically,” Paul Innella, CEO of TDI, a global cybersecurity company in the banking and healthcare spaces, tells InformationWeek.

Cranefly Cyberspy Group Spawns Unique IIS Technique

IIS logs record data such as webpages visited and apps used. The Cranefly attackers are sending commands to a compromised Web server by disguising them as Web access requests; IIS logs them as normal traffic, but the dropper can read them as commands if they contain the strings Wrde, Exco, or Cllo, which don't normally appear in IIS log files. "These appear to be used for malicious HTTP request parsing by Geppei — the presence of these strings prompts the dropper to carry out activity on a machine," Gorman notes. "It is a very stealthy way for attackers to send these commands." The commands contain malicious encoded .ashx files; these files are saved to an arbitrary folder determined by the command parameter and run as backdoors (i.e., ReGeorg or Danfuan). Gorman explains that the technique of reading commands from IIS logs could in theory be used to deliver different types of malware if leveraged by threat actors with different goals.
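Because those three marker strings do not normally appear in IIS logs, defenders can hunt for them directly. A minimal detection sketch, with illustrative log lines (the marker strings come from the article; everything else is hypothetical):

```python
# Marker strings the Geppei dropper parses out of IIS logs, per the article.
MARKERS = ("Wrde", "Exco", "Cllo")

# Illustrative IIS-style log lines, not real captured traffic.
sample_log = [
    "2022-10-28 10:01:02 GET /index.html - 200",
    "2022-10-28 10:01:05 GET /Wrde/payload.ashx - 200",
    "2022-10-28 10:01:09 GET /about.html - 200",
]

def suspicious_lines(lines):
    """Return log lines containing any marker string, for analyst triage."""
    return [ln for ln in lines if any(m in ln for m in MARKERS)]

print(suspicious_lines(sample_log))
```

A scheduled scan like this over live IIS logs would surface the disguised command traffic that otherwise blends in as normal requests.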

Mitigating the risks of artificial intelligence compromise

The fundamental actions required from any security approach are to protect, detect, attest, and recover from any modifications to code, whether malicious or otherwise. The best way to secure an AI system against compromise is applying a “trusted computing” model that covers all four AI elements. Starting with the data set aspect of a system, a component such as a Trusted Platform Module (TPM) is able to sign and verify that any data provided to the machine has been communicated from a reliable source. A TPM can ensure the safeguarding of any algorithms used within an AI system. The TPM provides hardened storage for platform or software keys. These keys can then be used to protect and attest the algorithms. Furthermore, any deviations of the model, if bad or inaccurate data is supplied, can be prevented through applying trusted principles focusing on cyber resiliency, network security, sensor attestation, and identity.
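The sign-then-verify pattern at the heart of that attestation flow can be illustrated without TPM hardware. In the sketch below a software key stands in for a TPM-protected key, and the key and data are hypothetical; a real TPM would keep the key in hardened storage and expose it only through its own APIs:

```python
import hashlib
import hmac

# Illustrative only: a real TPM never exposes the key; this shared secret
# stands in for a TPM-protected signing key.
KEY = b"tpm-protected-key"  # hypothetical

def sign(data: bytes) -> str:
    """Produce an authentication tag over the data."""
    return hmac.new(KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, tag: str) -> bool:
    """Accept the data only if the tag matches (constant-time compare)."""
    return hmac.compare_digest(sign(data), tag)

sample = b"sensor-reading:42"  # hypothetical sensor payload
tag = sign(sample)
print(verify(sample, tag))                 # unmodified data is accepted
print(verify(b"sensor-reading:41", tag))   # tampered data is rejected
```

The same pattern is what lets the AI pipeline trust that a data set really came from the claimed source and was not altered in transit.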

Quote for the day:

"The very essence of leadership is that you have to have vision. You can't blow an uncertain trumpet." -- Theodore Hesburgh

Daily Tech Digest - October 29, 2022

7 ways to ruin your IT leadership reputation

Be mindful of the decisions you make. “One careless choice can ruin your reputation and your career,” warns Jim Durham, CIO of Solar Panels Network USA, a national solar panel installation company. “By being aware of the risks and taking responsibility for your actions, you can minimize the damage and learn from your mistakes,” he advises. A careless decision can be anything from selecting the wrong technology to mishandling sensitive data. “Not only are these actions career-destructive, but they can also have lasting negative effects on your enterprise,” Durham notes. CIOs are always pressured by management to make the right decision. It’s important to remember, however, that even the best strategies and intentions can sometimes lead to disastrous results. “If you’re unsure about a decision, it’s always better to err on the side of caution and consult with your team before making a final call,” Durham suggests. Failure is never an option, particularly a major one. “It shows that you’re not capable of handling important tasks,” Durham states. 

Cyber Skills Shortage is Caused by Analyst Burnout

Data shows skilled and experienced professionals are leaving the industry due to burnout and disillusionment. In the UK, the cybersecurity workforce reportedly shrank by 65,000 last year, and according to a recent study, one in three current cybersecurity professionals are planning to change professions. According to ISACA’s State of Cybersecurity 2022 report, the top reasons for cybersecurity professionals leaving include being recruited by other companies (59%), poor financial incentives (48%), limited promotion and development opportunities (47%), high levels of work-related stress (45%) and lack of management support (34%). When discussing the skills shortage, many, by default, think of businesses struggling to recruit for their internal cybersecurity vacancies. However, this is equally challenging for specialist providers of consulting and managed cybersecurity services. Businesses increasingly rely on third-party managed services, particularly mid-size organizations, where outsourcing to a Managed Security Service Provider (MSSP) represents a much more commercially viable solution with considerably less up-front investment.

Data privacy is expensive — here’s how to manage costs

“The true cost of data privacy, broadly, is their trust with their customers,” said Akbar Mohammed, lead data scientist, Fractal AI. “In this era of customers increasingly becoming tech-savvy, as soon as they realize that their data isn’t secure, the company will risk loss of trust from consumers. This eventually results in a lot of business disruption.” Almost all companies that need to collect data for their operations should have a data privacy infrastructure in place. Companies should also set up dedicated security and compliance teams surveying data and technology assets along with maintaining an aggressive threat detection policy. It’s imperative for companies today to have a data strategy and have policy and procedures governed by a data governance entity. “For large organizations, it’s best to have regular audits or assessments and get privacy-related certifications,” Mohammed said. “Lastly, train your people and make the entire organization aware of your activities, your policies.”

Architectural Patterns for Microservices With Kubernetes

Kubernetes provides many constructs and abstractions to support service and application deployment. While applications differ, there are foundational concepts that help drive a well-defined microservices deployment strategy. Well-designed microservices deployment patterns play into an often-overlooked Kubernetes strength. Kubernetes is independent of runtime environments. Runtime environments include Kubernetes clusters running on cloud providers, in-house, bare metal, virtual machines, and developer workstations. When Kubernetes Deployments are designed properly, deploying to each of these and other environments is accomplished with the same exact configuration. In grasping the platform independence offered by Kubernetes, developing and testing the deployment of microservices can begin with the development team and evolve through to production. Each iteration contributes to the overall deployment pattern. A production deployment definition is no different from a developer's workstation configuration. 
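That platform independence is easiest to see in a manifest. The minimal Deployment below (service name, image, and port are illustrative, not from the article) can be applied unchanged to a developer's local cluster, a bare-metal cluster, or a cloud provider:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orders-service            # hypothetical service name
spec:
  replicas: 3
  selector:
    matchLabels:
      app: orders
  template:
    metadata:
      labels:
        app: orders
    spec:
      containers:
        - name: orders
          image: registry.example.com/orders:1.0   # hypothetical image
          ports:
            - containerPort: 8080
```

Nothing in the manifest names the environment it runs in; only the cluster's context changes between workstation and production.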

High data quality key to reducing supply chain disruption

With so many obstacles to overcome, the supply chain needs a saviour – and many experts are pointing to big data to fill the role. Prince believes data will become more important in this new era. He says that after Brexit, “there is uniquely new importance placed on master data, given the customs and other regulatory impacts of moving goods between the two markets”. Also, the greater risks posed in global trade and the need to be resilient mean that the predictive capabilities of data could be crucial. ... The potential of big data is clear – but to get the best results, the data involved needs to be accurate. “Data quality takes on many forms, including accuracy, completeness, timeliness, precision, and granularity,” says Laney. He points out that most organisations don’t have n-tier visibility in their supply chain, which means they don’t understand what is happening beyond the first tier of suppliers in the chain. They may also have incomplete data on where items are in the supply chain or when disruptions will happen.

Privacy assembly in Istanbul calls for adaptation to new necessities

Explaining that new challenges and needs emerged with the development of artificial intelligence (AI) and the metaverse, Koç underlined that protecting personal data should be a requirement. "Unfortunately, we pay the price for the comfort and efficiency provided by technology in the age of data, with privacy," he said. KVKK Chair Faruk Bilir, for his part, said that since the foundation's membership to the assembly, Türkiye has given significant importance to international efforts in the field. KVKK leads initiatives for the protection and awareness of personal data and privacy in line with the laws and regulations adopted since 2016, he added. "The protection of individuals' privacy emerges as an unchanging fact of the changing world," Bilir said. The protection of privacy is an indicator of civilization, Bilir said underlining the importance of a human-oriented approach. Law and ethics are complementary elements to the human-oriented approach, he added. "Technology is indispensable for us, our privacy is our priority," Bilir said.

CISA Releases Performance Goals for Critical Infrastructure

Among the newly recommended measures are implementation of multifactor authentication, making sure to revoke the login credentials of former employees, disabling Microsoft Office macros and prohibiting the connection of unauthorized devices, perhaps by disabling AutoRun. The document also recommends that the operational technology side have a single leader responsible for cybersecurity and that OT and IT staff work to improve their relationship. Organizations should "sponsor at least one 'pizza party' or equivalent social gathering per year" to be attended by the two cybersecurity teams. DHS says it will actively solicit feedback about the goals in the coming months and has set up a GitHub discussions page. The department's next plan is to roll out cybersecurity goals tailored to each sector of critical infrastructure in conjunction with the agencies closest to each sector, such as the Environmental Protection Agency for water systems.

Will Twitter Sink or Swim Under Elon Musk's Direction?

Musk's accompanying "let that sink in" tweet could be, in terms of bang for the buck, the most groan-inducing dad joke of all time. But it shouldn't hide the very real business and security challenges facing Musk, who's already CEO of Tesla and SpaceX, as he takes the helm of a social network sporting 230 million customers. "The bird is freed," Musk tweeted late Thursday, before the $44 billion deal closed Friday, and Twitter filed for delisting from the New York Stock Exchange as it goes private. Like so much with Musk, commentators have been attempting to deduce his planned intentions on numerous fronts. Musk styles himself as a showman, having once tweeted - apparently about nothing in particular - that "the most entertaining outcome is the most likely." ... What state Twitter might be in once Musk is done with it remains unclear. Then again, when you're the richest person in the world, what's a few billion here or there, especially if it keeps people talking about you and guessing at your next move?

How to turbocharge collaboration in innovation ecosystems

Whether you call it socialization or use any other term, the human dimension of innovation is often overlooked or obscured. In part, this is because technology and the covid-19–induced migration to online platforms have garnered a great deal of attention. It’s important to remind managers that innovating as a special form of problem-solving is best tackled by empowering the workforce. Collaboration can be jump-started from many directions, but it can be only as vibrant as the company’s underlying culture of curiosity, learning, and continuous adaptation. In the Veezoo–AXA story, the formal process failed to reach a breakthrough. It was the involvement of specific individuals who were keen to see the collaboration through—often on their own terms—that led to success in building an innovation ecosystem. In fact, it is often through the behaviors and work of certain people that effective structure and discipline emerge across an ecosystem. 

Data Quality as the Centerpiece of Data Mesh

After all, data quality is always context-dependent, and the domain teams know the business context of the data best. From a data quality perspective, data mesh makes good sense because it allows data quality to be defined in a context-specific way: the same data point can be considered "good" for one team but "bad" for another, depending on the context. As an example, take a subscription price column with a fair number of anomalies in it. Team A is working on cost optimization while Team B is working on churn prediction, so the price anomalies are a more serious data quality issue for Team B than for Team A. To formalize ownership across data products (which can be database tables, views, streams, CSV files, visualization dashboards, etc.), the data mesh framework suggests each product should have a Service Level Objective. This acts as a data contract, establishing and enforcing the quality of the data the product provides: timeliness, error rates, data types, etc.
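To make the idea concrete, here is a minimal sketch of what checking such a data contract might look like. The `DataProductSLO` fields, thresholds, and column names are illustrative assumptions, not part of any specific data mesh framework or product.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class DataProductSLO:
    """Illustrative Service Level Objective for a data product (a data contract)."""
    max_staleness: timedelta  # timeliness: how old may the newest record be?
    max_error_rate: float     # fraction of rows allowed to fail validation
    required_columns: dict    # column name -> expected Python type

def check_slo(rows: list, last_updated: datetime, slo: DataProductSLO) -> list:
    """Return a list of SLO violations (an empty list means the contract holds)."""
    violations = []
    if datetime.now(timezone.utc) - last_updated > slo.max_staleness:
        violations.append("timeliness: data is staler than the SLO allows")
    bad = sum(
        1 for r in rows
        if any(not isinstance(r.get(c), t) for c, t in slo.required_columns.items())
    )
    if rows and bad / len(rows) > slo.max_error_rate:
        violations.append(f"error rate: {bad}/{len(rows)} rows fail type checks")
    return violations

# Team B (churn prediction) demands a stricter contract on `price` than Team A.
slo_team_b = DataProductSLO(
    max_staleness=timedelta(hours=1),
    max_error_rate=0.01,
    required_columns={"subscription_id": str, "price": float},
)
rows = [{"subscription_id": "s1", "price": 9.99},
        {"subscription_id": "s2", "price": None}]  # anomaly: missing price
print(check_slo(rows, datetime.now(timezone.utc), slo_team_b))
```

The same rows could pass Team A's looser contract (say, `max_error_rate=0.5`) while failing Team B's, which is exactly the context-dependence the data mesh framing captures.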

Quote for the day:

"Humility is a great quality of leadership which derives respect and not just fear or hatred." -- Yousef Munayyer

Daily Tech Digest - October 28, 2022

Why Phishing-Resistant MFA Is on US Government Fast Track

Many government agencies employ some type of MFA. But the Biden administration's guidelines call for all agencies to implement stronger security. While legacy MFA is more secure than using a username and password, it assumes that using a second device and adding a second factor improves security. It's not that simple. Most legacy MFA uses a combination of a password and a "something you have" factor. That "something you have" comes into play as the second factor: a one-time code delivered by a physical token, a text message, or an email sent to the user. But adding a secondary device or channel is, at best, much harder to secure and, at worst, impossible to secure. Phishing campaigns can often capture the additional codes or conduct a man-in-the-middle attack on the authentication sequence, as recent breaches at Uber and Cisco made clear. The biggest issue, however, is that most MFA solutions rely on shared secrets, like passwords, and provide no security context that ties back to the end user and their device.
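The "shared secret" weakness is easy to see in code. The sketch below implements the standard HOTP/TOTP one-time-code algorithms (RFC 4226/6238): the server and the user's device must hold the same secret, and the resulting six-digit code is bound to neither a device nor a website origin, which is why a phishing page can simply relay it in real time.

```python
import hmac, hashlib, struct, time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30) -> str:
    """RFC 6238: the counter is just the current 30-second time window."""
    return hotp(secret, int(time.time()) // step)

# The same secret must live on BOTH the server and the device (a shared secret),
# and nothing in the code ties it to the site the user thinks they're logging into.
shared_secret = b"12345678901234567890"
print(hotp(shared_secret, 0))  # prints 755224, the RFC 4226 test vector
```

Nothing here stops an attacker who has phished the password from also phishing this code and replaying it within the same time window, which is the gap phishing-resistant MFA closes.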

AI's true goal may no longer be intelligence

To be sure, the question of genuine intelligence does still matter to a handful of thinkers. In the past month, ZDNET has interviewed two prominent scholars who are very much concerned with that question. Yann LeCun, chief AI scientist at Facebook owner Meta Platforms, spoke at length with ZDNET about a paper he put out this summer as a kind of think piece on where AI needs to go. LeCun expressed concern that the dominant work of deep learning today, if it simply pursues its present course, will not achieve what he refers to as "true" intelligence, which includes things such as an ability for a computer system to plan a course of action using common sense. LeCun expresses an engineer's concern that without true intelligence, such programs will ultimately prove brittle, meaning they could break before they ever do what we want them to do. ... The field of AI is undergoing a shift in attitude. It used to be the case that every achievement of an AI program, no matter how good, would be received with the skeptical remark, "Well, but that doesn't mean it's intelligent." 

Building the Future of TensorFlow

We see the growth of TensorFlow not just as an achievement to celebrate, but as an opportunity to go further and deliver more value for the machine learning community. Our goal is to provide the best machine learning platform on the planet. Software that will become a new superpower in the toolbox of every developer. Software that will turn machine learning from a niche craft into an industry as mature as web development. To achieve this, we listen to the needs of our users, anticipate new industry trends, iterate on our APIs, and work to make it increasingly easy for you to innovate at scale. In the same way that TensorFlow originally helped the rise of deep learning, we want to continue to facilitate the evolution of machine learning by giving you the platform that lets you push the boundaries of what's possible. Machine learning is evolving rapidly, and so is TensorFlow. Today, we're excited to announce we've started working on the next iteration of TensorFlow that will enable the next decade of machine learning development. We are building on TensorFlow's class-leading capabilities, and focusing on four pillars.

Europe Prepares to Rewrite the Rules of the Internet

Next week, a law takes effect that will change the internet forever—and make it much more difficult to be a tech giant. On November 1, the European Union’s Digital Markets Act comes into force, starting the clock on a process expected to force Amazon, Google, and Meta to make their platforms more open and interoperable in 2023. That could bring major changes to what people can do with their devices and apps, in a new reminder that Europe has regulated tech companies much more actively than the US. “We expect the consequences to be significant,” says Gerard de Graaf, a veteran EU official who helped pass the DMA early this year. Last month, he became director of a new EU office in San Francisco, established in part to explain the law’s consequences to big tech companies. De Graaf says they will be forced to break open their walled gardens. “If you have an iPhone, you should be able to download apps not just from the App Store [but] from other app stores or from the internet,” de Graaf says, in a conference room with emerald green accents at the Irish consulate in San Francisco where the EU’s office is initially located. 

Data analytics pipeline best practices: Data governance

It's not surprising that all-in-one pipeline automation has become a holy grail for some platform providers. Many enterprises share the same cloud providers, the same department-level SaaSes, and the same types of de facto-standard databases. The clear logic behind an all-in-one platform like Gathr, for example, is that companies will often need the same connectors or "operators," much of the same drag-and-drop machine learning process assembly, and the same sorts of choices between ETL, ELT, and ingestion capabilities. Unifying all this functionality could mean less work for data and analytics teams. But enterprises should remember that the compulsion to subscribe to yet another SaaS extends to these platforms. Engineers in one business unit might gravitate to a Gathr, while others might favor an Alteryx to map together sources a BI platform might need, or a super SaaS like OneSaaS that allows simplified mixing and matching within the OneSaaS environment.

Study Shows Cybersecurity Hype Complicates the Security Stack, Expands the Attack Surface

According to the cybersecurity hype report, vendors' unclear marketing strategies left most security leaders struggling to evaluate offerings: 91% of decision-makers found it difficult to select cybersecurity vendors due to unclear marketing about their specific offerings. Additionally, 49% of security leaders said their organization suffers from vendor sprawl, resulting in an increased attack surface. Consequently, 92% of organizations implement a defense-in-depth strategy and have to manage between 10 and 30 different security products. Defense-in-depth aims to create more technological layers to detect, prevent, contain, remediate, and recover from attacks. In a noisy marketplace filled with unsubstantiated claims, users cannot accurately predict the effectiveness of the hyped solutions, nor do they have the time to do so. ...  "Buyers are faced with a crowded and complex market, needing to continually layer new security products into their environment to achieve defense-in-depth, assess new and emerging AI technologies, and continually re-invest in SA&T."

The Power of Independent Thinking in Leading

The first step in thinking for oneself is self-awareness. When you understand your values, motives, and aspirations, thinking becomes automatic. Knowing your strengths and weaknesses, you can selectively apply the knowledge you gained by reading or the wisdom of others. Thinking for oneself doesn’t mean you ignore all the knowledge you have gained on the subject. Instead, you question what your current knowledge tells you. Cultivate your thinking using mental models, which explain how things work. James Clear, the author of the best-seller Atomic Habits, describes many mental models in his blog post “Mental Models: Learn How to Think Better and Gain a Mental Edge.” One of these mental models is inversion. To apply inversion, assume your most crucial project has failed six months from now and ask yourself how it could have failed. Such an exercise surfaces the risks you need to watch for and plan to mitigate for the project’s success. Thinking and doing go hand in hand. Put your thinking into action. Take the learning and refine your knowledge.

Keeping the cloud secure with sovereignty in mind

Being able to secure your cloud service supply not only requires data controls, but also access to legal controls. As such, hyperscalers have started adapting how they deploy cloud services to give nation states assurance — essentially meaning that cloud services are deployed in partnership with a local organisation. This has given rise to sovereign partnerships that license the hyperscaler technology, and are delivered by suppliers under the local legal framework. This pragmatic approach has slowly become more common in recent months, and helps overcome many of the risks associated with using cloud, particularly its assurance of service supply. Despite this, one of the biggest barriers to cloud is the current regulatory landscape surrounding how certain sectors need to control data sovereignty and how that data is securely processed. This often requires a long list of requirements that must be fulfilled to shift services onto the cloud, which is unique for each industry.

The Arguments for Open Source in Mainframes

The arguments for OSS on the mainframe are in many cases the same as for OSS on any other platform -- more accessible, often more secure, easier to develop. “These arguments are from the same development teams who push for OSS elsewhere in the environment,” says Mike Parkin, senior technical engineer at Vulcan Cyber. “The major differences are when the implementation is specific to the mainframe environment.” ... Parkin adds there has been a trend to use mainframe platforms for virtualization, essentially replacing a rack of commodity class servers with a single Big Iron machine that can do the job more efficiently and effectively. “Those are ideal use cases for open-source software at multiple levels, from the guest operating systems to the application layers,” he says. Boris Cipot, senior security engineer at Synopsys Software Integrity Group, a provider of integrated software solutions, agrees that open source can bring fresher and better integrations into today’s working processes and tools, and enable companies to focus on their work and not re-create existing software functionality.

Why We Need A Cyber Intelligence Revolution

Unfortunately, the challenges many organizations face include narrowing down which intelligence sources they’re pulling from, how many can be leveraged at a time, and how they’re integrated into firewalls and other security solutions. No one source of threat intelligence or existing security control can successfully cover the entirety of the threat landscape. It is critical for organizations to deploy threat intelligence from multiple sources, even those that traditionally would compete with one another. These can include commercial providers, open source intelligence data, government agencies and industry sources—all working together to provide organizations with visibility into the traffic affecting their networks. The data is in and the results are clear: What we don't know in the cybersecurity world can hurt us. Thankfully, there are steps your organization—regardless of size—can take to help ensure your network, users and data are protected.

Quote for the day:

"You may be good. You may even be better than everyone else. But without a coach you will never be as good as you could be." -- Andy Stanley

Daily Tech Digest - October 27, 2022

Network observability: What it means to vendors and to you

Network observability represents an evolution of network monitoring. Network observability solutions should dive deeper into networks, collecting a more diverse and voluminous set of data to give network teams total end-to-end visibility into operations. Those solutions should broaden their scope, looking not just at network performance, but end-user experience, business impacts, and security. Finally, network observability should focus less on tinkering with how it presents data, an approach that ultimately forces network engineers to glean insights themselves and do too much of the heavy lifting in their heads. Instead, network observability should emphasize actionable insights derived in a variety of ways, including AI and machine learning and low-code scripted automation. The former relies on algorithms to make tools more intelligent. Many vendors are driving toward actionable insights with AIOps, and our research shows that NetOps pros see tremendous potential with these algorithms. 
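The distinction between presenting data and producing an actionable insight can be sketched very simply. The toy below flags metric samples that deviate sharply from a trailing baseline (a z-score test), standing in for the far heavier machine learning an AIOps product would use; the metric name, window size, and threshold are illustrative assumptions.

```python
from statistics import mean, stdev

def flag_anomalies(samples: list, window: int = 20, z_threshold: float = 3.0) -> list:
    """Flag points that deviate sharply from the trailing baseline.

    Returns (index, value) pairs rather than a raw chart: the kind of
    actionable output a team can alert or automate on, instead of
    eyeballing dashboards and doing the heavy lifting in their heads.
    """
    anomalies = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(samples[i] - mu) / sigma > z_threshold:
            anomalies.append((i, samples[i]))
    return anomalies

# Illustrative latency series (ms): steady around 10 ms, then one spike.
latency = [10.0, 10.2, 9.8, 10.1, 9.9] * 5 + [48.0] + [10.0] * 5
print(flag_anomalies(latency))
```

Here only the 48 ms spike is surfaced; the steady samples never reach the operator at all, which is the shift from "show everything" to "say what matters."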

Resume makeover: Transforming a career post-mortem into a C-suite future

One trap IT leaders often fall into when seeking a new job is viewing their resume as a historical document of their career. The reality is that your resume should paint a clear picture of your career’s future, detailing your past work experience as a roadmap that leads inevitably to your next leadership gig. But striking that balance between detailing the past and mapping toward the future can be challenging, especially while keeping your resume to the point. ... As a general rule, a professional resume should be a concise 1-2 pages when applying for corporate roles. Recruiters read through thousands of resumes, so they’re more likely to lose focus or abandon your resume altogether if they can’t get a sense of your qualifications within the first few minutes. ... Including executive summaries and a sidebar with your education, skills, and credentials is a great way to remove redundancies from your work experience, allowing you to focus on specific accomplishments in each role, while consolidating your evergreen skills, expertise, and knowledge into short and simple lists.

How to attract more women into software development

Regardless of gender, it boils down to competence and confidence, said Archana. “Building your competence is extremely important, and with that competence comes confidence. Keep learning, build your competence, be confident about yourself, and don’t be worried about too many setbacks,” she added. “When you are a subject matter expert, gender is almost invisible at the table because people are listening to you for your expert opinions, for your knowledge in the area. And you earn respect from that.” While more could be done to encourage gender diversity, Manjunatha called for women to upskill often. “Keep yourself updated,” added Manjunatha. “Technology is constantly evolving. What got you here is not going to get you there tomorrow, so always keep yourself updated. The growth mindset and that ability to want to keep learning, that’s very, very important if you’re in this space.” While upskilling, e-learning, or retraining can be achieved without going through a certification course, Kwong noted that certification is a means to benchmark one’s competency and skill sets.

Australia seeks stiffer penalty for data breaches

Following the update, companies found to have committed the breaches will be fined AU$50 million, or three times the value of any benefit obtained through the misuse of information, or 30% of the company's adjusted turnover in the relevant period, whichever is greater. The Bill also will afford the Australian Information Commissioner "greater power" to resolve privacy breaches, as well as strengthen the Notifiable Data Breaches scheme, which will provide the Commissioner with full knowledge of the information that was compromised in a breach so it can assess the risks of harm to affected individuals. In addition, the Commissioner and the Australian Communications and Media Authority will be better empowered to share information in the event of a data breach. Dreyfus said: "When Australians are asked to hand over their personal data they have a right to expect it will be protected. Unfortunately, significant privacy breaches in recent weeks have shown existing safeguards are inadequate. 

Does Your Database Really Need to Move to the Cloud?

When it comes to global-scale, multicloud and hybrid use cases, it’s important to consider how you ensure data remains consistent across regions while ensuring applications are running as quickly as possible, Powers added. Redis Enterprise offers Active-Active Geo Distribution, to allow local speed read and writes while ensuring consistent data is replicated across regions, with less than a millisecond of latency. So, even if the long-term goal is full application modernization, Powers said, “There are places where you can still use Oracle or MySQL, and patch us alongside, to fix it in the interim, while you’re making these transitions.” In these cases, he argued, “The modernization is around speed, it’s around scale, it’s around total cost of ownership.” So, the question of how to modernize your database becomes far more nuanced than whether you can afford the time and money to embark on a complete refactoring and re-platforming project.

What challenges are hardest to avoid when managing data in the cloud?

Data is moving to the cloud because it is an excellent place to store, manage, and analyze data. The cloud breaks down information silos that exist in on-premises computing, making it much easier to share data internally and with business partners and customers. However, when you put all your data in one place, you also must implement safeguards that govern the use of the data — most importantly data access control. This has proven to be a challenge for technology vendors and for the organizations that are managing their data in the cloud. The underlying problem is caused by SQL. The industry-standard database query language is a core element of the Modern Data Stack, which is the ecosystem of technologies that enable us to manage data in the cloud. But while SQL is great for business analytics, it cannot support the complex, graph-oriented relationships required for data governance. 

From zero to 10 million lines of Kotlin

Going into this migration, we had two options: we could make it possible to write new code at Meta using Kotlin but leave most of the existing code in Java, or we could attempt to convert almost all our in-house code into Kotlin. The advantage of the first option is clear: it’s much less work. But there are two notable disadvantages to this approach. First, enabling interoperability between Kotlin and Java code introduces the use of platform types in Kotlin. Platform types give rise to runtime null pointer dereferences that result in crashes instead of the static safety offered by pure Kotlin code. In some complicated cases, Kotlin’s null check elision can let nulls through and create surprising null pointer exceptions later. This could happen if, for example, Kotlin code calls a Kotlin interface implemented by a Java interface. Other issues include Java’s inability to tag type parameters as nullable (until recently), and Kotlin’s overloading rules taking nullability into account, while Java’s overloading rules do not.

Using Remote Agile Governance to Create the Culture Organisations Need

Agility is not a best practice, but a mindset for uncovering good and better practices. This requires an emergent, context specific good practice of governance. Remote:AF and Esther Derby have spent the last year working to generate a process for just this approach to governance. More and more, organisations are realising that if they truly want to change their culture, they must change their governance. Agility started with the scrum software team, spread through the IT department in the form of DevOps, and through the rest of the organisation in Business Agility - but governance has, until now, been a holdout from this evolution. As long as it remains so, it has the potential to have an out-sized impact, holding the business back from true agility. ... Firstly, let’s define what we mean by governance in the context of this article. Governance is not (just) forums, meetings, and reports. It is all the ways an organisation makes decisions to enact strategy.

Digital transformation: 4 questions to help drive momentum

Enterprises often get locked in a cycle, fixing various aspects of customer experience (CX), operational efficiency, business model innovation, etc. For example, the most commonly cited business driver for digital transformation is customer experience. But while implementing it – through mobility, front-end workflows, and chatbots, for instance – organizations realize that operational efficiency is equally important, if not more so. So they fix the backend, only to realize that they’re missing the bus on the business model. And the cycle continues, making it difficult for enterprises to scale beyond CX use cases in marketing and customer service. ... Addressing technical debt and legacy technologies is a difficult challenge. A well-defined architectural blueprint early on can enable a holistic digital transformation in the long term. It can help identify the best use cases while balancing quick wins with foundational elements. However, if you are in the middle of your journey, it’s essential to tackle the problem of technical debt to move forward.

Why Passkeys Are Better Than Passwords

With passkeys, passwords are simply no longer a threat vector. Passwords account for north of 80% of all security breaches. Passkeys mitigate this threat down to almost nothing. You can’t reuse your passkeys. You don’t have to remember them. They are generated and stored for you, so you don’t have to worry about creating and storing them yourself. You can’t be lured into giving them up because they are unique to a specific website and thus can’t be shared with a phishing website. Sensitive data associated with each passkey never leaves your device. The information is stored on your phone on a special chip (a Trusted Platform Module) that even the NSA might not be able to crack. If you register with a website using a passwordless solution like Passage, that site gets nothing but a public key, which is useless for cracking open your account. While Apple lets you share your account with others via AirDrop, you couldn’t even share the actual private key with a phishing site if you wanted to.
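The core property, that the server stores only a public key, can be illustrated with textbook RSA. This is a deliberately toy sketch of challenge-response sign-in, not the actual WebAuthn/passkey protocol, and the tiny key is insecure by construction; it exists only to show that a leaked server database yields nothing an attacker can sign with.

```python
import hashlib, secrets

# Textbook RSA toy keypair (insecure size, illustration only):
# n = 61 * 53 = 3233, public exponent e = 17, private exponent d = 2753.
n, e, d = 3233, 17, 2753

def sign(message: bytes) -> int:
    """Device side: sign with the PRIVATE key, which never leaves the device."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Server side: the site stores only (n, e), the PUBLIC key."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

# Sign-in: the server issues a fresh random challenge; the device signs it.
challenge = secrets.token_bytes(16)
assert verify(challenge, sign(challenge))
print("login ok")
```

In a real passkey flow the credential is additionally scoped to the site's origin, so a lookalike phishing domain cannot even ask the device for a usable signature; that origin binding is what the excerpt's "unique to a specific website" refers to.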

Quote for the day:

"One of the sad truths about leadership is that, the higher up the ladder you travel, the less you know." -- Margaret Heffernan

Daily Tech Digest - October 26, 2022

IT leaders aren't getting listened to, and now they're ready to walk away

Vijay Sundaram, chief strategy officer of Zoho Corporation, said even though IT teams have been "indispensable to business innovation and continuity" in recent years, senior management continue to overlook their input in larger business decisions. This is despite the fact that 88% of respondents believe IT is more responsible for business innovation than ever before, while 85% agree IT could drive even greater innovation in the business if they had a stronger leadership position. Sundaram noted that the role of IT within organizations would become increasingly important as hybrid working and decentralized teams became mainstream. Indeed, 99% of survey respondents said their organization had already moved to a hybrid model. "This will require the expertise and involvement of ITDMs to identify appropriate technologies and meet corporate guidelines in areas like compliance, privacy and security," he added.

How Will AI Technology Change Leadership In The Future?

Data-driven and AI-minded CFOs are already using AI technologies as they predict and report on financial performance, growth plans, fiscal compliance and operating expenses. Even Microsoft Excel spreadsheets are tapping into the power of AI for visualization, dynamic arrays and queries. New AI and machine learning modeling techniques like forecasting, budgeting and investing will shape the nature of finance and its structure. Finance and accounting teams may no longer need accounting clerks to scan invoices and do manual data entry due to intelligent document processing and RPA systems that can automate repetitive tasks. CFOs who incorporate AI in their work are in a powerful position to link predictive analytics with customer behavior. This can result in pricing changes and higher profitability as well as fraud prevention. EY reports that U.S. companies have about $100 billion of bad debt (customers who will be late in paying or will not pay at all) and that a variety of AI tools can be used to remedy that. 

How to Keep Distractions From Hampering IT Staff Performance

It’s important to remember that staff members can't always be totally focused. “Assume there will be times when teams will seek distractions,” Stockall says. Stockall suggests holding fireside chats that allow team members to ask questions within an informal setting. “By stepping out of the formalized communication channels, leaders can hear firsthand what their employees want to learn more about as well as hear direct feedback on how the business can do better.” Jabes believes that when team members are given a set amount of time to complete a task, they're less likely to get sidetracked. ... Breaking down tasks into smaller pieces can make it easier to stay focused on a task, Jabes adds. Enterprises are finally beginning to understand that they need to treat their employees as individuals first, Galperin says. “When employees feel empowered to contribute to the creation of their ideal work environment, they are more motivated and inspired to give the company their best,” she notes. “The right environment will foster focus and high performance.”

How to navigate the current 5G and IoT threat landscape

For IoT devices, Arora recommends the use of networking segmentation and slicing to keep devices segregated from potential threats. He also emphasized the criticality of a differentiated implementation plan, IPS/IDS systems designed to protect IoT devices and their respective networks, and a thorough and periodic risk review. I would also urge companies to routinely patch and update IoT devices, utilize strong password measures and avoid authenticating to company systems or transmitting data over public networks. Where possible, implement device tracking and monitoring, and always utilize an employee check-in and check-out process for handing out IoT devices and returning them. Be sure to confirm terminated employees have no such devices remaining in their possession as well. Any given information set is only going to be as valuable as when it was last released, updated or examined. Threat vectors continually evolve and new risk variants are inevitable, so make sure to subscribe to vendor alerts and newsletters and stay up on the latest developments and terms.

India's Revolutionary ONDC Policy On Hold - Thanks To Data Privacy Issues

The policy is still being developed, but it is anticipated to cover a number of important issues, such as data protection, utilizing platforms with foreign ownership, and supporting domestic e-commerce companies. An important step toward promoting open networks for all facets of trading products and services over digital or electronic networks is the Ministry of Commerce’s recent introduction of ONDC. To the detriment of vendors, the Indian government contends that foreign-funded private businesses Flipkart and Amazon currently control the majority of the country’s e-commerce market. ... On several topics, including worries about security and data privacy, various parties, including online retailers, have been looking for clarification. According to reports, the government is also considering whether to introduce legislation to control non-personal data. The action is being taken as the government works to encourage emerging technologies like artificial intelligence and data analytics and accelerate the expansion of the nation’s digital economy.

New Method Exposes How Artificial Intelligence Works

In a surprising discovery, Jones and his collaborators from Los Alamos, Jacob Springer and Garrett Kenyon, as well as Jones’ mentor Juston Moore, applied their new network similarity metric to adversarially trained neural networks. They discovered that as the severity of the attack increases, adversarial training causes neural networks in the computer vision domain to converge to very similar data representations, regardless of network architecture. “We found that when we train neural networks to be robust against adversarial attacks, they begin to do the same things,” Jones said. There has been an extensive effort in industry and in the academic community searching for the “right architecture” for neural networks, but the Los Alamos team’s findings indicate that the introduction of adversarial training narrows this search space substantially. As a result, the AI research community may not need to spend as much time exploring new architectures, knowing that adversarial training causes diverse architectures to converge to similar solutions.

Postgres is eating relational

Of course, for many enterprise workloads, the people doing the architectures actually aren’t employed by the enterprise but get engaged as consultants. Within the largest global system integrators, there’s that built-in relational experience and, from my conversations with folks in the industry, this tends to be their primary reason for pushing PostgreSQL. During and after the pandemic, there has been huge demand to modernize enterprise infrastructure to make enterprises more agile and responsive to rapidly evolving customer requirements. Those global system integrators take the modernization projects and often apply the technologies that are easiest for them to deploy, netting them the best margins on their services. We can argue about whether this is actually the best thing for customers wanting to modernize, but it’s not hard to understand the underlying logic. Now, if you’re me, working for a document database company, it’s fair to think this apparent overreliance on relational is more due to inertia than a concerted attempt to embrace modern data infrastructure.

Bernd Greifeneder – unifying data for maximised visibility and intelligence

Interestingly, the biggest challenge is not technology, despite building technology that no one has done before. The hardest challenge is always figuring out how to get the right talent in the right areas. This goes back to this need to change the organisation with every doubling in size, which has been key to retaining our entrepreneurial notion. I realised that as we grow towards around 300 people, this doubling in size was relatively easy, because it was the founding team and some of the first employees, and we all have this entrepreneurial attitude and desire to get things done and better than the competition. But then, as you keep hiring and hiring, and the new hires onboard newer hires, suddenly, with 300 people and beyond that becomes really hard. At 500 people, something hit me – the new guys had no clue anymore of who we are, why they come to the office every day in the morning, what motivates them. Not even the mentorship programs we had in place worked. So, I need to make sure that we are explicit. And then it took me a while to figure out how can I make it explicit? 

5 Ways Banks Can Use Blockchain To Improve ESG Efforts

For ESG monitoring purposes, one of blockchain’s primary uses is bringing a bird’s-eye view to supply chain management. A more sustainable, energy-efficient supply chain could deliver profound savings in transportation costs, along with the curtailed carbon emissions that a better-managed, more efficient system would bring. With distributed ledger technology, transactions at every step of the supply chain can be recorded and distributed. This brings an unprecedented level of transparency and traceability to the movement of goods around the globe. With automated IoT interfaces, data collection is seamless and less dependent on overworked individuals. This transparency also makes it more attainable to monitor ethical sourcing in industries that have long challenged regulators, such as seafood harvesting. Products, whether raw or processed, can be tracked from early in the production cycle, and the information is available to end users long before items are delivered—with the journey tracked in real time via blockchain.
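The tamper-evident traceability described above comes from chaining each recorded supply-chain event to the hash of the one before it. As a rough illustration only (not any specific platform's API; the class and field names here are invented for the sketch), a minimal hash-chained ledger might look like this:

```python
import hashlib
import json

class SupplyChainLedger:
    """Toy append-only ledger: each entry embeds the hash of the previous
    entry, so altering any earlier step is detectable on verification."""

    def __init__(self):
        self.entries = []

    def record(self, step, product_id, data):
        # Link this event to the previous one (or a zero hash at genesis).
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = {
            "step": step,            # e.g. "harvested", "processed", "shipped"
            "product_id": product_id,
            "data": data,            # e.g. readings from an IoT gateway
            "prev_hash": prev_hash,
        }
        digest = hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()
        ).hexdigest()
        payload["hash"] = digest
        self.entries.append(payload)
        return digest

    def verify(self):
        """Re-derive every hash; return False if any entry was altered."""
        prev = "0" * 64
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if body["prev_hash"] != prev:
                return False
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if recomputed != entry["hash"]:
                return False
            prev = entry["hash"]
        return True
```

A real deployment would replicate these entries across many parties' nodes and add consensus, but even this sketch shows why a regulator or end user can trust the recorded journey: changing one record breaks every hash after it.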

Edge and cloud: 4 reasons to adopt both

Edge and cloud computing options each have their unique advantages, and the ideal solution for your team will depend on factors that are pertinent to your industry and organization. It’s essential to weigh the pros and cons carefully and to be mindful of the implications for issues like data security and regulatory compliance, which can vary considerably by industry and the operations you support. CIOs who haven’t been involved in a hybrid computing strategy implementation before may want to consider working with a consultant or managed services provider who has experience with a project of this type. Someone who has handled similar implementations can provide insight and advice to help you realize the full benefits and avoid pitfalls. Remember that a hybrid strategy can allow your organization to achieve performance levels that drive innovation and attract and retain customers while bringing products to market more quickly, conserving resources, and adapting to changing workforce needs. 

Quote for the day:

"It is better to look ahead and prepare than to look back and regret." -- Jackie Joyner-Kersee