Daily Tech Digest - October 28, 2022

Why Phishing-Resistant MFA Is on US Government Fast Track

Many government agencies employ some type of MFA. But the Biden administration's guidelines call for all agencies to implement stronger security. While legacy MFA is more secure than using a username and password alone, it assumes that adding a second device or second factor improves security. It's not that simple. Most legacy MFA uses a combination of a password and a "something you have" factor. That "something you have" comes into play when implementing the second factor: a one-time code delivered via a physical token, a text message, or an email sent to the user. But adding a secondary device or channel is, at best, much harder to secure and, at worst, impossible to secure. Phishing campaigns can often capture these additional codes or conduct a man-in-the-middle attack on the authentication sequence, as recent breaches at Uber and Cisco made clear. The biggest issue, however, is that most MFA solutions rely on shared secrets, like passwords, and provide no security context that ties back to the end user and their device.
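The shared-secret weakness is easy to see in code. The one-time codes most legacy MFA relies on are typically TOTP values (RFC 6238), computed from a secret that both the user's device and the server hold. A minimal sketch using only Python's standard library (the secret here is illustrative):

```python
import hmac
import struct
import time

def totp(secret: bytes, at: int, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the number of 30-second steps elapsed."""
    counter = struct.pack(">Q", at // step)
    mac = hmac.new(secret, counter, "sha1").digest()
    offset = mac[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Illustrative secret, provisioned at enrollment and held by BOTH parties.
secret = b"shared-secret-provisioned-at-enrollment"
code = totp(secret, int(time.time()))
```

Because the code is a pure function of the shared secret and the clock, anyone who steals the secret, or simply relays a freshly phished code within its 30-second window, authenticates just as successfully as the legitimate user.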


AI's true goal may no longer be intelligence

To be sure, the question of genuine intelligence does still matter to a handful of thinkers. In the past month, ZDNET has interviewed two prominent scholars who are very much concerned with that question. Yann LeCun, chief AI scientist at Facebook owner Meta Platforms, spoke at length with ZDNET about a paper he put out this summer as a kind of think piece on where AI needs to go. LeCun expressed concern that the dominant work of deep learning today, if it simply pursues its present course, will not achieve what he refers to as "true" intelligence, which includes things such as the ability for a computer system to plan a course of action using common sense. LeCun expressed an engineer's concern that without true intelligence, such programs will ultimately prove brittle, meaning they could break before they ever do what we want them to do. ... The field of AI is undergoing a shift in attitude. It used to be the case that every achievement of an AI program, no matter how good, would be received with the skeptical remark, "Well, but that doesn't mean it's intelligent."


Building the Future of TensorFlow

We see the growth of TensorFlow not just as an achievement to celebrate, but as an opportunity to go further and deliver more value for the machine learning community. Our goal is to provide the best machine learning platform on the planet. Software that will become a new superpower in the toolbox of every developer. Software that will turn machine learning from a niche craft into an industry as mature as web development. To achieve this, we listen to the needs of our users, anticipate new industry trends, iterate on our APIs, and work to make it increasingly easy for you to innovate at scale. In the same way that TensorFlow originally helped the rise of deep learning, we want to continue to facilitate the evolution of machine learning by giving you the platform that lets you push the boundaries of what's possible. Machine learning is evolving rapidly, and so is TensorFlow. Today, we're excited to announce we've started working on the next iteration of TensorFlow that will enable the next decade of machine learning development. We are building on TensorFlow's class-leading capabilities, and focusing on four pillars.


Europe Prepares to Rewrite the Rules of the Internet

Next week, a law takes effect that will change the internet forever—and make it much more difficult to be a tech giant. On November 1, the European Union’s Digital Markets Act comes into force, starting the clock on a process expected to force Amazon, Google, and Meta to make their platforms more open and interoperable in 2023. That could bring major changes to what people can do with their devices and apps, in a new reminder that Europe has regulated tech companies much more actively than the US. “We expect the consequences to be significant,” says Gerard de Graaf, a veteran EU official who helped pass the DMA early this year. Last month, he became director of a new EU office in San Francisco, established in part to explain the law’s consequences to big tech companies. De Graaf says they will be forced to break open their walled gardens. “If you have an iPhone, you should be able to download apps not just from the App Store [but] from other app stores or from the internet,” de Graaf says, in a conference room with emerald green accents at the Irish consulate in San Francisco where the EU’s office is initially located. 


Data analytics pipeline best practices: Data governance

It's not surprising that all-in-one pipeline automation has become a holy grail for some platform providers. Many enterprises share the same cloud providers, the same department-level SaaSes, and the same types of de facto standard databases. The clear logic behind an all-in-one platform like Gathr, for example, is that companies will often need the same connectors or "operators," much of the same drag-and-drop machine learning process assembly, and the same sorts of choices among ETL, ELT, and ingestion capabilities. Unifying all this functionality could mean less work for data and analytics teams. But enterprises should remember that the compulsion to subscribe to yet another SaaS extends to these platforms. Engineers in one business unit might gravitate to a Gathr, while others might favor an Alteryx to map together sources a BI platform might need, or a super SaaS like OneSaaS that allows simplified mixing and matching within the OneSaaS environment.
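The ETL/ELT distinction mentioned above comes down to where the transformation runs. A hedged sketch using sqlite3 as a stand-in warehouse (the table names and feed are hypothetical):

```python
import sqlite3

rows = [("2022-10-27", "1,250.00"), ("2022-10-28", "980.50")]  # raw CSV-like feed

db = sqlite3.connect(":memory:")  # stand-in for a warehouse
db.execute("CREATE TABLE sales_etl (day TEXT, amount REAL)")
db.execute("CREATE TABLE sales_raw (day TEXT, amount TEXT)")

# ETL: transform before loading -- only cleaned, typed data reaches the warehouse.
db.executemany("INSERT INTO sales_etl VALUES (?, ?)",
               [(d, float(a.replace(",", ""))) for d, a in rows])

# ELT: load the raw feed as-is, then transform inside the warehouse with SQL.
db.executemany("INSERT INTO sales_raw VALUES (?, ?)", rows)
total = db.execute(
    "SELECT SUM(CAST(REPLACE(amount, ',', '') AS REAL)) FROM sales_raw"
).fetchone()[0]
```

Both paths produce the same answer; the operational difference is whether the cleaning logic lives in the pipeline tool or in the warehouse, which is exactly the choice these platforms bundle behind connectors and drag-and-drop assembly.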


Study Shows Cybersecurity Hype Complicates the Security Stack, Expands the Attack Surface

According to the cybersecurity hype report, vendors' confusing marketing strategies left most security leaders uncertain about what they were buying. Subsequently, 91% of decision-makers found it difficult to select cybersecurity vendors due to unclear marketing about their specific offerings. Additionally, 49% of security leaders said their organization suffers from vendor sprawl, resulting in an increased attack surface. Consequently, 92% of organizations implement a defense-in-depth strategy and have to manage between 10 and 30 different security products. Defense-in-depth aims to create more technological layers to detect, prevent, contain, remediate, and recover from attacks. In a noisy marketplace filled with unsubstantiated claims, users cannot accurately predict the effectiveness of the hyped solutions, nor do they have the time to do so. ... “Buyers are faced with a crowded and complex market, needing to continually layer new security products into their environment to achieve defense-in-depth, assess new and emerging AI technologies, and continually re-invest in SA&T.”


The Power of Independent Thinking in Leading

The first step in thinking for oneself is self-awareness. When you understand your values, motives, and aspirations, thinking becomes automatic. Knowing your strengths and weaknesses, you can selectively apply the knowledge you gained by reading or the wisdom of others. Thinking for oneself doesn’t mean you ignore all the knowledge you have gained on the subject. Instead, you question what your current knowledge tells you. Cultivate your thinking using mental models, which explain how things work. James Clear, the author of the best-seller, Atomic Habits, describes many mental models in his blog “Mental Models: Learn How to Think Better and Gain a Mental Edge.” One of these mental models is inversion. An example of the application of inversion is to assume your most crucial project has failed six months from now and ask yourself how it could have failed. Such an exercise gives you all the things you need to look out for and plan to mitigate them for the project’s success. Thinking and doing go hand in hand. Put your thinking into action. Take the learning and refine your knowledge.


Keeping the cloud secure with sovereignty in mind

Securing your cloud service supply requires not only data controls, but also access to legal controls. As such, hyperscalers have started adapting how they deploy cloud services to give nation states assurance — essentially meaning that cloud services are deployed in partnership with a local organisation. This has given rise to sovereign partnerships that license the hyperscaler technology and are delivered by suppliers under the local legal framework. This pragmatic approach has slowly become more common in recent months, and helps overcome many of the risks associated with using cloud, particularly around assurance of service supply. Despite this, one of the biggest barriers to cloud is the current regulatory landscape surrounding how certain sectors need to control data sovereignty and how that data is securely processed. This often translates into a long list of requirements, unique to each industry, that must be fulfilled before services can shift onto the cloud.


The Arguments for Open Source in Mainframes

The arguments for OSS on the mainframe are in many cases the same as for OSS on any other platform -- more accessible, often more secure, easier to develop. “These arguments are from the same development teams who push for OSS elsewhere in the environment,” says Mike Parkin, senior technical engineer at Vulcan Cyber. “The major differences are when the implementation is specific to the mainframe environment.” ... Parkin adds there has been a trend to use mainframe platforms for virtualization, essentially replacing a rack of commodity class servers with a single Big Iron machine that can do the job more efficiently and effectively. “Those are ideal use cases for open-source software at multiple levels, from the guest operating systems to the application layers,” he says. Boris Cipot, senior security engineer at Synopsys Software Integrity Group, a provider of integrated software solutions, agrees that open source can bring fresher and better integrations into today’s working processes and tools, and enable companies to focus on their work and not re-create existing software functionality.


Why We Need A Cyber Intelligence Revolution

Unfortunately, the challenges many organizations face include narrowing down which intelligence sources they’re pulling from, how many can be leveraged at a time, and how they’re integrated into firewalls and other security solutions. No one source of threat intelligence or existing security control can successfully cover the entirety of the threat landscape. It is critical for organizations to deploy threat intelligence from multiple sources, even those that traditionally would compete with one another. These can include commercial providers, open source intelligence data, government agencies and industry sources—all working together to provide organizations with visibility into the traffic affecting their networks. The data is in and the results are clear: What we don't know in the cybersecurity world can hurt us. Thankfully, there are steps your organization—regardless of size—can take to help ensure your network, users and data are protected.
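Combining feeds from sources that would normally compete can be as simple as a union that tracks corroboration. A toy sketch (the feed names and indicators are invented, drawn from documentation IP ranges):

```python
from collections import defaultdict

# Hypothetical indicator feeds: commercial, open source, and a government CERT.
feeds = {
    "commercial": {"203.0.113.7", "198.51.100.22"},
    "osint":      {"203.0.113.7", "192.0.2.99", "198.51.100.250"},
    "gov_cert":   {"198.51.100.22", "192.0.2.99"},
}

def merge_feeds(feeds):
    """Union the feeds, tracking which sources corroborate each indicator."""
    sources = defaultdict(set)
    for name, indicators in feeds.items():
        for ioc in indicators:
            sources[ioc].add(name)
    return sources

merged = merge_feeds(feeds)
# Indicators confirmed by two or more independent sources carry more weight.
high_confidence = {ioc for ioc, srcs in merged.items() if len(srcs) >= 2}
```

Even this trivial merge shows the payoff the passage describes: no single feed sees everything, but overlap between independent feeds becomes a confidence signal that can feed firewalls and other controls.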



Quote for the day:

"You may be good. You may even be better than everyone else. But without a coach you will never be as good as you could be." -- Andy Stanley

Daily Tech Digest - October 27, 2022

Network observability: What it means to vendors and to you

Network observability represents an evolution of network monitoring. Network observability solutions should dive deeper into networks, collecting a more diverse and voluminous set of data to give network teams total end-to-end visibility into operations. Those solutions should broaden their scope, looking not just at network performance, but at end-user experience, business impacts, and security. Finally, network observability should focus less on tinkering with how data is presented, which ultimately forces network engineers to glean insights themselves and do too much of the heavy lifting in their heads. Instead, network observability should emphasize actionable insights derived in a variety of ways, including AI and machine learning as well as low-code scripted automation. The former relies on algorithms to make tools more intelligent. Many vendors are driving toward actionable insights with AIOps, and our research shows that NetOps pros see tremendous potential with these algorithms.
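As a toy example of the "actionable insight" idea, even a simple z-score check over raw latency samples surfaces the anomaly directly instead of leaving a wall of numbers for an engineer to eyeball (the threshold and data are illustrative; production AIOps uses far richer models):

```python
import statistics

def anomalies(samples, threshold=2.5):
    """Flag samples more than `threshold` standard deviations from the mean."""
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:
        return []  # a perfectly flat series has no outliers
    return [x for x in samples if abs(x - mean) / stdev > threshold]

latency_ms = [12, 11, 13, 12, 14, 11, 12, 13, 12, 95]  # one obvious spike
spikes = anomalies(latency_ms)
```

The point is the shift in output: not a dashboard of 10 numbers to interpret, but a short list of the samples that actually need attention.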


Resume makeover: Transforming a career post-mortem into a C-suite future

One trap IT leaders often fall into when seeking a new job is viewing their resume as a historical document of their career. The reality is that your resume should paint a clear picture of your career’s future, detailing your past work experience as a roadmap that leads inevitably to your next leadership gig. But striking that balance between detailing the past and mapping toward the future can be challenging, especially while keeping your resume to the point. ... As a general rule, a professional resume should be a concise 1-2 pages when applying for corporate roles. Recruiters read through thousands of resumes, so they’re more likely to lose focus or abandon your resume altogether if they can’t get a sense of your qualifications within the first few minutes. ... Including an executive summary and a sidebar with your education, skills, and credentials is a great way to remove redundancies from your work experience, allowing you to focus on specific accomplishments in each role, while consolidating your evergreen skills, expertise, and knowledge into short and simple lists.


How to attract more women into software development

Regardless of gender, it boils down to competence and confidence, said Archana. “Building your competence is extremely important, and with that competence comes confidence. Keep learning, build your competence, be confident about yourself, and don’t be worried about too many setbacks,” she added. “When you are a subject matter expert, gender is almost invisible at the table because people are listening to you for your expert opinions, for your knowledge in the area. And you earn respect from that.” While more could be done to encourage gender diversity, Manjunatha called for women to upskill often. “Keep yourself updated,” added Manjunatha. “Technology is constantly evolving. What got you here is not going to get you there tomorrow, so always keep yourself updated. The growth mindset and that ability to want to keep learning, that’s very, very important if you’re in this space.” While upskilling, e-learning, or retraining can be achieved without going through a certification course, Kwong noted that certification is a means to benchmark one’s competency and skill sets.


Australia seeks stiffer penalty for data breaches

Following the update, companies found to have committed the breaches will be fined AU$50 million, or three times the value of any benefit obtained through the misuse of information, or 30% of the company's adjusted turnover in the relevant period, whichever is greater. The Bill will also afford the Australian Information Commissioner "greater power" to resolve privacy breaches as well as strengthen the Notifiable Data Breaches scheme, which will provide the Commissioner with full knowledge of the information that was compromised in a breach so it can assess the risks of harm to affected individuals. In addition, the Commissioner and the Australian Communications and Media Authority will be better empowered to share information in the event of a data breach. Dreyfus said: "When Australians are asked to hand over their personal data they have a right to expect it will be protected. Unfortunately, significant privacy breaches in recent weeks have shown existing safeguards are inadequate."
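The three-limbed penalty reads directly as a formula: the greatest of the three amounts. A quick sketch (the figures in the example are hypothetical):

```python
def max_penalty(benefit_aud: float, adjusted_turnover_aud: float) -> float:
    """Greatest of AU$50m, 3x the benefit obtained, or 30% of adjusted turnover."""
    return max(50_000_000, 3 * benefit_aud, 0.30 * adjusted_turnover_aud)

# For a firm that gained AU$10m from misused data on AU$400m adjusted turnover,
# the 30%-of-turnover limb dominates (AU$120m beats AU$50m and AU$30m).
penalty = max_penalty(10_000_000, 400_000_000)
```

The structure matters for large companies: the AU$50 million floor applies to everyone, but the turnover limb scales the exposure with the size of the business.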


Does Your Database Really Need to Move to the Cloud?

When it comes to global-scale, multicloud and hybrid use cases, it’s important to consider how you ensure data remains consistent across regions while ensuring applications are running as quickly as possible, Powers added. Redis Enterprise offers Active-Active Geo Distribution, to allow local speed read and writes while ensuring consistent data is replicated across regions, with less than a millisecond of latency. So, even if the long-term goal is full application modernization, Powers said, “There are places where you can still use Oracle or MySQL, and patch us alongside, to fix it in the interim, while you’re making these transitions.” In these cases, he argued, “The modernization is around speed, it’s around scale, it’s around total cost of ownership.” So, the question of how to modernize your database becomes far more nuanced than whether you can afford the time and money to embark on a complete refactoring and re-platforming project.


What challenges are hardest to avoid when managing data in the cloud?

Data is moving to the cloud because it is an excellent place to store, manage, and analyze data. The cloud breaks down information silos that exist in on-premises computing, making it much easier to share data internally and with business partners and customers. However, when you put all your data in one place, you also must implement safeguards that govern the use of the data — most importantly data access control. This has proven to be a challenge for technology vendors and for the organizations that are managing their data in the cloud. The underlying problem is caused by SQL. The industry-standard database query language is a core element of the Modern Data Stack, which is the ecosystem of technologies that enable us to manage data in the cloud. But while SQL is great for business analytics, it cannot support the complex, graph-oriented relationships required for data governance. 
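A common SQL-level control is a filtered view per role or region. It works, but every new rule, role, or relationship means another view or predicate, which hints at why flat SQL strains under richer, graph-shaped governance policies. A minimal sqlite3 sketch (the schema and names are invented):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE salaries (employee TEXT, region TEXT, amount INTEGER)")
db.executemany("INSERT INTO salaries VALUES (?, ?, ?)", [
    ("alice", "EU", 90000),
    ("bob",   "US", 85000),
])

# Row-level access control via a view: EU analysts query eu_salaries and
# never see US rows. Each additional rule needs yet another view/predicate.
db.execute("CREATE VIEW eu_salaries AS SELECT * FROM salaries WHERE region = 'EU'")
visible = db.execute("SELECT employee FROM eu_salaries").fetchall()
```

One region, one rule, one view is manageable; hundreds of roles with inherited, overlapping permissions is the graph-oriented problem the passage says SQL alone struggles to express.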


From zero to 10 million lines of Kotlin

Going into this migration, we had two options: We could make it possible to write new code at Meta using Kotlin but leave most of the existing code in Java; We could attempt to convert almost all our in-house code into Kotlin. The advantage of the first option is clear — it’s much less work. But there are two notable disadvantages to this approach. First, enabling interoperability between Kotlin and Java code introduces the use of platform types in Kotlin. Platform types give rise to runtime null pointer dereferences that result in crashes instead of the static safety offered by pure Kotlin code. In some complicated cases, Kotlin’s null check elision can let nulls through and create surprising null pointer exceptions later. This could happen if, for example, Kotlin code calls a Kotlin interface implemented in Java. Other issues include Java’s inability to tag type parameters as nullable (until recently), and Kotlin’s overloading rules taking nullability into account, while Java’s overloading rules do not.


Using Remote Agile Governance to Create the Culture Organisations Need

Agility is not a best practice, but a mindset for uncovering good and better practices. This requires an emergent, context specific good practice of governance. Remote:AF and Esther Derby have spent the last year working to generate a process for just this approach to governance. More and more, organisations are realising that if they truly want to change their culture, they must change their governance. Agility started with the scrum software team, spread through the IT department in the form of DevOps, and through the rest of the organisation in Business Agility - but governance has, until now, been a holdout from this evolution. As long as it remains so, it has the potential to have an out-sized impact, holding the business back from true agility. ... Firstly, let’s define what we mean by governance in the context of this article. Governance is not (just) forums, meetings, and reports. It is all the ways an organisation makes decisions to enact strategy.


Digital transformation: 4 questions to help drive momentum

Enterprises often get locked in a cycle, fixing various aspects of customer experience (CX), operational efficiency, business model innovation, etc. For example, the most commonly cited business driver for digital transformation is customer experience. But while implementing it – through mobility, front-end workflows, and chatbots, for instance – organizations realize that operational efficiency is equally important, if not more so. So they fix the backend, only to realize that they’re missing the bus on the business model. And the cycle continues, making it difficult for enterprises to scale beyond CX use cases in marketing and customer service. ... Addressing technical debt and legacy technologies is a difficult challenge. A well-defined architectural blueprint early on can enable a holistic digital transformation in the long term. It can help identify the best use cases while balancing quick wins with foundational elements. However, if you are in the middle of your journey, it’s essential to tackle the problem of technical debt to move forward.


Why Passkeys Are Better Than Passwords

With passkeys, passwords are simply no longer a threat vector. Passwords account for north of 80% of all security breaches. Passkeys mitigate this threat down to almost nothing. You can’t reuse your passkeys. You don’t have to remember them. They are generated and stored for you, so you don’t have to worry about creating and storing them yourself. You can’t be lured into giving them up because they are unique to a specific website and thus can’t be shared with a phishing website. Sensitive data associated with each passkey never leaves your device. The information is stored on your phone on a special chip (a Trusted Platform Module) that even the NSA might not be able to crack. If you register with a website using a passwordless solution like Passage, that site gets nothing but a public key, which is useless for cracking open your account. While Apple lets you share your account with others via AirDrop, you couldn’t even share the actual private key with a phishing site if you wanted to.
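Python's standard library has no asymmetric crypto, so the following is only a toy analogy for one property the passage describes: passkey credentials are bound to a specific origin, so a lookalike phishing site ends up with nothing usable. Real passkeys use per-site WebAuthn key pairs, not HMAC derivation:

```python
import hashlib
import hmac

DEVICE_SECRET = b"random-secret-sealed-in-the-device"  # never leaves the device

def credential_for(origin: str) -> bytes:
    """Derive a per-origin credential (toy stand-in for a per-site key pair)."""
    return hmac.new(DEVICE_SECRET, origin.encode(), hashlib.sha256).digest()

real = credential_for("https://example.com")
phish = credential_for("https://examp1e.com")  # lookalike phishing origin
# A different origin yields an entirely different credential: even a pixel-
# perfect phishing page receives nothing that works on the real site.
```

In the real protocol the analogy gets stronger still: the site only ever holds the public half of the pair, so even a breach of the site's database leaks nothing that opens accounts.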



Quote for the day:

"One of the sad truths about leadership is that, the higher up the ladder you travel, the less you know." -- Margaret Heffernan

Daily Tech Digest - October 26, 2022

IT leaders aren't getting listened to, and now they're ready to walk away

Vijay Sundaram, chief strategy officer of Zoho Corporation, said even though IT teams have been "indispensable to business innovation and continuity" in recent years, senior management continue to overlook their input in larger business decisions. This is despite the fact that 88% of respondents believe IT is more responsible for business innovation than ever before, while 85% agree IT could drive even greater innovation in the business if they had a stronger leadership position. Sundaram noted that the role of IT within organizations would become increasingly important as hybrid working and decentralized teams became mainstream. Indeed, 99% of survey respondents said their organization had already moved to a hybrid model. "This will require the expertise and involvement of ITDMs to identify appropriate technologies and meet corporate guidelines in areas like compliance, privacy and security," he added.


How Will AI Technology Change Leadership In The Future?

Data-driven and AI-minded CFOs are already using AI technologies as they predict and report on financial performance, growth plans, fiscal compliance and operating expenses. Even Microsoft Excel spreadsheets are tapping into the power of AI for visualization, dynamic arrays and queries. New AI and machine learning modeling techniques like forecasting, budgeting and investing will shape the nature of finance and its structure. Finance and accounting teams may no longer need accounting clerks to scan invoices and do manual data entry due to intelligent document processing and RPA systems that can automate repetitive tasks. CFOs who incorporate AI in their work are in a powerful position to link predictive analytics with customer behavior. This can result in pricing changes and higher profitability as well as fraud prevention. EY reports that U.S. companies have about $100 billion of bad debt (customers who will be late in paying or will not pay at all) and that a variety of AI tools can be used to remedy that. 


How to Keep Distractions From Hampering IT Staff Performance

It’s important to remember that staff members can't always be totally focused. “Assume there will be times when teams will seek distractions,” Stockall says. Stockall suggests holding fireside chats that allow team members to ask questions within an informal setting. “By stepping out of the formalized communication channels, leaders can hear firsthand what their employees want to learn more about as well as hear direct feedback on how the business can do better.” Jabes believes that when team members are given a set amount of time to complete a task, they're less likely to get sidetracked. ... Breaking down tasks into smaller pieces can make it easier to stay focused on a task, Jabes adds. Enterprises are finally beginning to understand that they need to treat their employees as individuals first, Galperin says. “When employees feel empowered to contribute to the creation of their ideal work environment, they are more motivated and inspired to give the company their best,” she notes. “The right environment will foster focus and high performance.”


How to navigate the current 5G and IoT threat landscape

For IoT devices, Arora recommends the use of networking segmentation and slicing to keep devices segregated from potential threats. He also emphasized the criticality of a differentiated implementation plan, IPS/IDS systems designed to protect IoT devices and their respective networks, and a thorough and periodic risk review. I would also urge companies to routinely patch and update IoT devices, utilize strong password measures and avoid authenticating to company systems or transmitting data over public networks. Where possible, implement device tracking and monitoring, and always utilize an employee check-in and check-out process for handing out IoT devices and returning them. Be sure to confirm terminated employees have no such devices remaining in their possession as well. Any given information set is only going to be as valuable as when it was last released, updated or examined. Threat vectors continually evolve and new risk variants are inevitable, so make sure to subscribe to vendor alerts and newsletters and stay up on the latest developments and terms.


India's Revolutionary ONDC Policy On Hold - Thanks To Data Privacy Issues

The policy is still being developed, but it is anticipated to cover a number of important issues, such as data protection, utilizing platforms with foreign ownership, and supporting domestic e-commerce companies. An important step toward promoting open networks for all facets of trading products and services over digital or electronic networks is the Ministry of Commerce’s recent introduction of ONDC. To the detriment of vendors, the Indian government contends that foreign-funded private businesses Flipkart and Amazon currently control the majority of the country’s e-commerce market. ... On several topics, including worries about security and data privacy, various parties, including online retailers, have been looking for clarification. According to reports, the government is also considering whether to introduce legislation to control non-personal data. The action is being taken as the government works to encourage emerging technologies like artificial intelligence and data analytics and accelerate the expansion of the nation’s digital economy.


New Method Exposes How Artificial Intelligence Works

In a surprising discovery, Jones and his collaborators from Los Alamos, Jacob Springer and Garrett Kenyon, as well as Jones’ mentor Juston Moore, applied their new network similarity metric to adversarially trained neural networks. They discovered that as the severity of the attack increases, adversarial training causes neural networks in the computer vision domain to converge to very similar data representations, regardless of network architecture. “We found that when we train neural networks to be robust against adversarial attacks, they begin to do the same things,” Jones said. There has been an extensive effort in industry and in the academic community searching for the “right architecture” for neural networks, but the Los Alamos team’s findings indicate that the introduction of adversarial training narrows this search space substantially. As a result, the AI research community may not need to spend as much time exploring new architectures, knowing that adversarial training causes diverse architectures to converge to similar solutions.
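The adversarial attacks referenced above are often demonstrated with the Fast Gradient Sign Method (FGSM): nudge each input feature in the sign of the loss gradient so the loss rises as fast as possible. A self-contained toy on a two-weight logistic model (the weights and data are invented; the study itself concerns deep vision networks):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss(w, x, y):
    """Binary cross-entropy of a logistic model on one example."""
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def fgsm(w, x, y, eps=0.25):
    """FGSM: step each input in the direction that increases the loss.
    For this model the input gradient is dL/dx_i = (p - y) * w_i."""
    p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
    return [xi + eps * math.copysign(1.0, (p - y) * wi) for wi, xi in zip(w, x)]

w = [2.0, -1.0]          # toy "trained" weights
x, y = [1.0, 0.5], 1     # correctly classified example
x_adv = fgsm(w, x, y)    # small perturbation, strictly higher loss
```

Adversarial training then folds such perturbed examples back into training; the Los Alamos result is that doing so pushes very different architectures toward similar internal representations.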


Postgres is eating relational

Of course, for many enterprise workloads, the people doing the architectures actually aren’t employed by the enterprise but get engaged as consultants. Within the largest global system integrators, there’s that built-in relational experience and, from my conversations with folks in the industry, this tends to be their primary reason for pushing PostgreSQL. During and after the pandemic, there has been huge demand to modernize enterprise infrastructure to make enterprises more agile and responsive to rapidly evolving customer requirements. Those global system integrators take the modernization projects and often apply the technologies that are easiest for them to deploy, netting them the best margins on their services. We can argue about whether this is actually the best thing for customers wanting to modernize, but it’s not hard to understand the underlying logic. Now, if you’re me, working for a document database company, it’s fair to think this apparent overreliance on relational is more due to inertia than a concerted attempt to embrace modern data infrastructure.


Bernd Greifeneder – unifying data for maximised visibility and intelligence

Interestingly, the biggest challenge is not technology, despite building technology that no one has done before. The hardest challenge is always figuring out how to get the right talent in the right areas. This goes back to this need to change the organisation with every doubling in size, which has been key to retaining our entrepreneurial notion. I realised that as we grow towards around 300 people, this doubling in size was relatively easy, because it was the founding team and some of the first employees, and we all have this entrepreneurial attitude and desire to get things done and better than the competition. But then, as you keep hiring and hiring, and the new hires onboard newer hires, suddenly, with 300 people and beyond that becomes really hard. At 500 people, something hit me – the new guys had no clue anymore of who we are, why they come to the office every day in the morning, what motivates them. Not even the mentorship programs we had in place worked. So, I need to make sure that we are explicit. And then it took me a while to figure out how can I make it explicit? 


5 Ways Banks Can Use Blockchain To Improve ESG Efforts

For ESG monitoring purposes, one of blockchain’s primary uses is bringing a bird’s-eye view to supply chain management. A more sustainable, energy-efficient supply chain could deliver profound savings in transportation costs and the concurrent curtailing of carbon emissions that a better managed, more efficient system would bring. With distributed ledger technology, transactions at every step of the supply chain can be recorded and distributed. This brings an unheard-of level of transparency and traceability to the movement of goods around the globe. With automated IoT interfaces, data collection is seamless and less contingent on overworked individuals. This transparency also makes monitoring ethical sourcing in industries that have long presented a challenge to regulators, such as seafood harvesting, more attainable. Products, whether raw or processed, can be tracked early in the production cycle, and the information is available to end users long before items are even delivered—with the journey tracked in real time via blockchain.
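The traceability claim rests on hash-chaining: each recorded step commits to the hash of the previous one, so tampering anywhere breaks every later link. A minimal sketch (the supply-chain records are invented, and a real distributed ledger adds consensus and replication on top):

```python
import hashlib
import json

def add_block(chain, record):
    """Append a record linked to the hash of the previous block."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"record": record, "prev": prev}, sort_keys=True)
    chain.append({"record": record, "prev": prev,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def verify(chain):
    """Recompute every link; any altered record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        body = json.dumps({"record": block["record"], "prev": prev}, sort_keys=True)
        if block["prev"] != prev or hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, {"step": "caught", "vessel": "F/V Example", "co2_kg": 120})
add_block(chain, {"step": "processed", "plant": "Plant-7", "co2_kg": 45})
```

Quietly editing an emissions figure or a sourcing claim after the fact invalidates every subsequent hash, which is what makes the audit trail credible to regulators and end users alike.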


Edge and cloud: 4 reasons to adopt both

Edge and cloud computing options each have their unique advantages, and the ideal solution for your team will depend on factors that are pertinent to your industry and organization. It’s essential to weigh the pros and cons carefully and to be mindful of the implications for issues like data security and regulatory compliance, which can vary considerably by industry and the operations you support. CIOs who haven’t been involved in a hybrid computing strategy implementation before may want to consider working with a consultant or managed services provider who has experience with a project of this type. Someone who has handled similar implementations can provide insight and advice to help you realize the full benefits and avoid pitfalls. Remember that a hybrid strategy can allow your organization to achieve performance levels that drive innovation and attract and retain customers while bringing products to market more quickly, conserving resources, and adapting to changing workforce needs. 



Quote for the day:

"It is better to look ahead and prepare than to look back and regret." -- Jackie Joyner-Kersee

Daily Tech Digest - October 25, 2022

Digital-first businesses more willing to accept some fraud

“For new companies, it’s about growth – fraud in that regard can be something like you have a promotion on and people are signing up for multiple accounts. “Digital transformers are also trying to compete, so accessibility, speed, low friction and completion rates rank above compliance. Ten years ago, compliance was higher, but for digital-first companies, user experience concerns are at the top of the agenda.” However, Li stressed that this was not to say that businesses are neglecting their legal compliance obligations, but more that they simply would not survive if they provided an archaic experience that caused friction for the potential customer, such as demanding they send notarised documents through the mail. Indeed, nearly half – 46% – of respondents did feel that their customer onboarding process was still too complex, rising to 55% in the UK. Frequent complaints were that it took too long to review and verify customers when onboarding them, leading to user drop-off, increased costs, and lost revenues.


Enhance Data Analytics with oneDAL

Intel® oneAPI Data Analytics Library (oneDAL) is a library with all the building blocks required to create distributed-data pipelines to transform, process, and model data, complete with all the architectural flexibility of oneAPI. This can be achieved using Intel® Distribution for Python*, C++, or Java APIs that can connect to familiar data sources such as Spark* and Hadoop*. ... oneDAL has tools for transferring out-of-memory data sources, such as databases and text files, into memory for use in analysis, training, or prediction stages. And if the data source cannot fit into memory, the algorithms in oneDAL also support streaming data into memory. Data scientists often spend large amounts of time preparing the data for analysis or machine learning (ML). This includes converting data to numeric representation, adding or removing data, normalizing it, or computing statistics. oneDAL offers algorithms that accelerate these preparation tasks, speeding the turnaround of steps that are often performed interactively.
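The streaming mode mentioned above works by computing partial results per chunk and then merging them, so the full dataset never has to fit in memory. A minimal pure-Python sketch of that pattern (illustrating the idea only, not using oneDAL itself) for a running mean and variance:

```python
def partial_stats(chunk):
    """Per-chunk partial results: count, sum, and sum of squares."""
    return len(chunk), sum(chunk), sum(x * x for x in chunk)


def combine(parts):
    """Merge partials into a global mean and (population) variance."""
    n = sum(p[0] for p in parts)
    s = sum(p[1] for p in parts)
    ss = sum(p[2] for p in parts)
    mean = s / n
    return mean, ss / n - mean * mean


# Data arrives in chunks, as if streamed from a file too large for memory.
chunks = [[1.0, 2.0], [3.0, 4.0], [5.0]]
mean, var = combine([partial_stats(c) for c in chunks])
assert mean == 3.0 and var == 2.0
```

Because the partials are small and mergeable, the same decomposition also lets the work be distributed across nodes, which is what enables oneDAL's distributed pipelines.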


Google Unveils Its Latest Voice Innovations

Since releasing its first speech patent in 2001, Google has led the way in voice innovation. From interacting with Google Assistant to live captioning in Google Meet, it now boasts an extensive suite of voice tools. Within this are two core innovations: its Speech-to-Text and Text-to-Speech APIs. The Speech-to-Text API supports short and long form speech in over 75 languages and 120+ locales – out-of-the-box – without the need for training and customization. Of course, for some use cases, businesses may demand customization. As such, the API is flexible, allowing users to harness it across various audio channels. It also detects multiple speakers in the same channel, with the solution recognizing their unique voices. ... Moreover, companies can create captions and subtitles for media content or build a virtual agent. Yet, it is also possible to use the technology for speech analysis, summarization, and extraction – each of which has significant potential for contact centers. In tandem, many businesses harness Google’s Text-to-Speech API to communicate with their users. It allows them to take text and synthesize it into audio in a single step.
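For the multi-speaker detection mentioned above, the Speech-to-Text REST API accepts a diarization setting in its recognition config. A sketch of building such a request body (field names follow the public v1 REST reference as I understand it – verify against current docs; the bucket path is hypothetical):

```python
import json


def build_recognize_request(gcs_uri, language="en-US", diarize=False):
    """Build the JSON body for a Speech-to-Text v1 speech:recognize call."""
    config = {"languageCode": language}
    if diarize:
        # Ask the API to label distinct speakers within a single channel.
        config["diarizationConfig"] = {"enableSpeakerDiarization": True}
    return json.dumps({"config": config, "audio": {"uri": gcs_uri}})


body = build_recognize_request("gs://my-bucket/call.wav", diarize=True)
assert '"enableSpeakerDiarization": true' in body
```

In practice this body would be POSTed with an authenticated client (or built via the official google-cloud-speech library); the sketch only shows the shape of the request.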


Why Sensors Are Key to IoT Cybersecurity

Sensors enabled by the Internet of Things are network-connected smart devices that collect and transmit real-time data about their environment. The data they provide lets people make better-informed decisions. The use of IoT sensors has grown explosively in recent years because their increasing functionality, small size, and low power consumption allow designers to deploy them in new applications to increase productivity and lower costs. The sensors are being used in new ways to maximize the capabilities of network-connected systems and infrastructure. The sensors are poised for mass-scale adoption in fields including automotive, health care, industrial automation, energy, and smart cities. But the lack of standardization in IoT sensors, coupled with interoperability challenges, has made them vulnerable to cyberattacks—which creates barriers for their ubiquitous use. Hackers are targeting IoT sensors in greater numbers, in more industries, and with increased sophistication.


Transforming Observability

Digital transformation, product, and technology leaders see value in observability because of its potential to measure digital experiences and the performance of business and digital services. To do this, observability must meet three significant challenges. First, observability must effectively cross the complex boundaries of microservices, containers, cloud and traditional applications, multiple cloud providers, database sources, SaaS services, infrastructure, and internal and external APIs. Today’s challenge is far beyond the central aggregation of large volumes of log data and suppressing non-essential alerts. Most enterprise architectures look eerily similar to a breadboard wiring project, with applications, systems, and data sources crisscrossing each other, representing the various pathways and interfaces across systems. Virtually any of these elements could contribute to the degradation of a digital experience, and observability must operate across these elements whether they live in our tightly controlled data centers or are distributed in microservices, cloud services, or third-party interfaces.


Web 3.0 and the Crowdpoint Constellation

Web 3.0 is about the individual. The underlying technologies that will enable it are personal identification technologies (biometrics), the blockchain and distributed data technology. Let’s not worry about how, right now, let’s just paint the picture. Web 2.0 was all about exploiting data — a great deal of which was your data. The big web businesses mined it to their great enrichment, with the best AI tools known to man. However, it is equally possible for people to band together and mine their collective personal data to their own benefit. This has not yet happened, but the technologies mentioned above make it possible. Now if it were up to the individual to do this on their own initiative, of course, probably nothing would happen. ... If you’ve been tracking the evolution of the blockchain world you will realize that it has evolved a long way beyond the creation and marketing of cryptocurrencies. It is no longer all about speculation. It has stepped boldly into the financial sector, with the creation of services that are commonly described as OpenFi (open finance) or DeFi (decentralized finance).


Improving finance and accounting software with AI

Starting with audit analytics, auditors tend to spend too much time buried in compliance checklists and creating reports that few people read, with little time to seek anomalies in every transaction. Rather than manually sampling data points, Forrester says machine learning is being used for risk assessment of transactions. The member-based industry association American Institute of Certified Public Accountants (AICPA) is developing guidance for ML in the audit function. Mature audit support providers such as Thomson Reuters and Wolters Kluwer, as well as emerging companies like Caseworks Cloud and MindBridge, are embedding AI into their audit platforms.


Atlassian Vulnerabilities Highlight Criticality of Cloud Services

The combination of the two flaws could allow a significant attack, says Jake Shafer, a security consultant with Bishop Fox, who found the flaws. "Using the authorization finding would allow a low-privileged user to elevate their role to super admin which, in terms of information disclosure, would allow the attacker to gain access to everything the client of the SaaS had in their Jira deployment," he says. "From there, the attacker could then leverage the SSRF finding to go after the infrastructure of Atlassian themselves." Both vulnerabilities have been patched — the first within a week and the second within a month, according to the disclosure timeline published by Bishop Fox. However, companies should note that the increasing reliance on cloud applications has made attacks on cloud services and workloads much more common, so much so that the top class of vulnerability, according to the Open Web Application Security Project (OWASP), is broken authentication and access-control issues.


When CISOs are doomed to fail, and how to improve your chances of success

Sometimes, CISO candidates can spot a bad employer during the interview process. "You are not only trying to convince them that you are the person they should hire, but you are interviewing them," Callas says. The recruiting process is just like zero-knowledge proof, because neither side wants to be upfront about what is going on. One of Callas's priorities is to learn how much the company cares about security, and he does that by asking direct questions. One time, an executive he talked to admitted that management did not want better protection. A typical question potential CISOs are asked is what they might do in a difficult situation such as a breach. When Callas hears this, he smiles and says: “Has this actually happened?” "Sometimes they'll say, 'Oh, no, no, no,' in a way that you know means yes," he adds, "and every so often, you get the person who looks around and says: 'Let me tell you what's really going on.'" Another priority should be understanding to whom the CISO reports: the CEO, the CFO, the CTO, or even the legal department. “[This] tells you a little bit about what they expect you to do," says Chip Gibbons, CISO at Thrive.


Why Functional Programming Should Be The Future Of Software Development

Pure functional programming solves many of our industry’s biggest problems by removing dangerous features from the language, making it harder for developers to shoot themselves in the foot. At first, these limitations may seem drastic, as I’m sure the 1960s developers felt regarding the removal of GOTO. But the fact of the matter is that it’s both liberating and empowering to work in these languages—so much so that nearly all of today’s most popular languages have incorporated functional features, although they remain fundamentally imperative languages. The biggest problem with this hybrid approach is that it still allows developers to ignore the functional aspects of the language. Had we left GOTO as an option 50 years ago, we might still be struggling with spaghetti code today. To reap the full benefits of pure functional programming languages, you can’t compromise. You need to use languages that were designed with these principles from the start. Only by adopting them will you get the many benefits that I’ve outlined here. But functional programming isn’t a bed of roses. It comes at a cost.
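The contrast between the imperative and pure styles described above can be sketched in a few lines (the pricing functions are hypothetical examples, not from the article):

```python
from functools import reduce

# Impure style: mutates its argument, so every other holder of the list
# sees the change -- action at a distance that is hard to test and reason about.
def apply_discount_inplace(prices, pct):
    for i in range(len(prices)):
        prices[i] *= (1 - pct)

# Pure style: same inputs always yield the same output, and nothing is
# mutated, so the function is trivial to test, cache, and parallelize.
def apply_discount(prices, pct):
    return tuple(p * (1 - pct) for p in prices)

catalog = (100.0, 250.0)
sale = apply_discount(catalog, 0.5)
assert sale == (50.0, 125.0)
assert catalog == (100.0, 250.0)  # the original is untouched

# Composition without intermediate mutable state.
total = reduce(lambda acc, p: acc + p, sale, 0.0)
assert total == 175.0
```

Python here plays the role of a hybrid language: the pure style is available but not enforced, which is exactly the compromise the author argues against.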



Quote for the day:

"Make heroes out of the employees who personify what you want to see in the organization." -- Anita Roddick

Daily Tech Digest - October 21, 2022

3 ways to deter phishing attacks in 2023

Along with being commonplace, phishing attacks have become so profitable (to the attackers) that the biggest cybercriminals have largely moved beyond individual customers. Rather, they target enterprise employees who can be duped into revealing information that’s much more sensitive, on a much larger scale. ... Since phishing attacks overwhelmingly target the human element, cybersecurity experts agree that the best defense against this is providing security awareness training to enterprise employees. This helps in early identification of attacks and increasing overall security hygiene. ... In-house cybersecurity training is no longer a time- and skill-intensive process, given the prevalence of AI-based phishing awareness platforms. Today, ML enables gamified, personalized security training programs for each individual based on their current level of awareness, position in the organization, and browsing behavior. Further, AI is a potent tool in the arms of cybersecurity experts. It enhances the efficiency and effectiveness of security policies by improving and automating routine threat detection procedures.


How to Improve Your OAuth Developer Experience

DX is about providing the best ecosystem for developers to improve business agility. For difficult areas such as deployment, security and incident resolution, ensure that the options you have chosen also work well on local workstations, in addition to meeting your business requirements. You will then build secure and reliable software with simpler code. It is also a gradual journey, and technical goals must be traded against business priorities. It should be possible to articulate all DX objectives in terms of their business value, whether this is faster web development, better data protection or fewer incidents. This will help you to win support for technical initiatives and implement them gradually as part of a technical roadmap. At Curity, we realize the importance of DX to your business. We therefore continually improve our developer resources, including guides for web, mobile and API development. The OAuth family of specifications enables you to implement many security solutions, and we also ensure that our advanced options can be run end-to-end on a development computer.


Financial losses to synthetic identity-based fraud to double by 2024

Despite advances like the US government’s Electronic Consent-Based Verification Service, or eCBSV—which allows some entities to verify whether a given combination of Social Security number, name and date of birth match an existing Social Security record—the issue remains a serious one in large part because it’s still easy to create this type of manipulated identity, thanks in part to the lack of incentives for financial institutions to combat them. “Today, the positive value of a good banking relationship is roughly the same as the negative cost of a fraudulent account (roughly $250-$400, depending on the bank or fintech),” the report said. “When considering the downside cost of a fraudulent banking relationship, generally the dollars are very low because financial losses like Peer to Peer, or P2P, scams are absorbed largely by the consumer, and nefarious activities like human and drug trafficking, terrorism, fraudulent PPP [Paycheck Protection Program] and unemployment deposits and low levels of money laundering do not carry a financial loss for the bank.”


Polyglot Microservices Communication Using Dapr on AKS

Distributed Application Runtime (Dapr) is a CNCF project like Kubernetes. It is not just a framework or set of reusable libraries. It is rather a runtime that provides capabilities to solve many common concerns like service discovery, service-to-service communication, state management, configuration, secrets management, and others. Dapr exposes APIs for each of these capabilities which can be invoked from applications using HTTP or gRPC. With this approach, Dapr’s footprint in application code is limited to an API call, allowing Dapr to easily integrate with any language or runtime. The figure below depicts capabilities provided by Dapr, supported infrastructure platforms, and integration options using APIs. ... Dapr can be self-hosted as a process running on an operating system or it can be configured to run as a sidecar on Kubernetes. This allows all kinds of applications, whether they are containerized or not, running on cloud or on-premise or on edge infrastructure, deployed on physical or virtual machines, to make use of its capabilities.
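Because Dapr's capabilities are exposed as plain HTTP endpoints on the local sidecar, calling another service reduces to one request; a sketch using the service-invocation API (the "orders" app ID and "create" method are hypothetical; check the current Dapr reference for endpoint details):

```python
import json
import urllib.request

DAPR_PORT = 3500  # default HTTP port of the local Dapr sidecar


def invoke_url(app_id, method, port=DAPR_PORT):
    """URL for Dapr's service-invocation API: the sidecar resolves app_id
    and forwards the call, so the caller needs no service-discovery logic."""
    return f"http://localhost:{port}/v1.0/invoke/{app_id}/method/{method}"


def invoke(app_id, method, payload):
    """POST a JSON payload to another service through the local sidecar.
    Requires a running Dapr sidecar; shown here for shape only."""
    req = urllib.request.Request(
        invoke_url(app_id, method),
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)


assert invoke_url("orders", "create") == \
    "http://localhost:3500/v1.0/invoke/orders/method/create"
```

Since the application only ever talks to localhost, the same code works whether the caller is written in Python, Java, or Go, and whether it runs self-hosted or as a Kubernetes sidecar – which is the polyglot point the article makes.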


Why Your IT Organization Needs to Embrace Continuous Improvement

Maximum benefits can be realized when an organization not only employs continuous improvement, but measures and proves results to their staff and customers, says Chris Lepotakis, a senior associate at global cybersecurity assessor Schellman. “This provides a greater trust in service and products offered by an organization and fosters higher fidelity between the organization, employees, and customers,” he explains. “Being able to show your customers what you're doing to improve your business, and what it means to their benefit, shows care and transparency on how the organization has recognized and improved on weak points.” It also proves that the organization is always looking for ways to provide continued value and trust, Lepotakis adds. Organizations looking to develop a continuous improvement culture should begin by creating a framework to support delivery resources. Such a framework should include processes for identifying, assessing, and implementing changes, as well as metrics to measure service quality, McIntire advises. 


Using blockchain to grow your start-up

Public blockchain is highly secure and practically impossible to counterfeit due to the underlying cryptography used. While companies may attempt to manipulate private blockchain records, an independent auditor can monitor mischief in the network. Blockchain provides the trust and security needed for multiple organisations to connect on the shared ledger. This is because all users have access to a copy of the whole blockchain, meaning they can see if any meddling is going on. If there’s a hash match throughout the chain, the records are trustworthy. Furthermore, smart contracts represent the future of transactions and are interwoven with blockchain. Smart contracts are a piece of code that can outline each step of a transaction, with the ability to connect multiple blockchains and assets. When terms of the contract are met, they are automatically initiated. For start-ups, these coded contracts can revolutionise the way they conduct business. For example, smart contracts are perfectly suited for supply chain management.
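The self-executing behaviour of a smart contract can be sketched as a function that releases payment only when the agreed terms are met (a toy model in plain Python, not contract code for any real blockchain; the escrow terms are hypothetical):

```python
def make_escrow(amount, release_condition):
    """A toy 'smart contract': funds release automatically once the
    agreed condition reports True for the recorded shipment state."""
    def settle(state):
        if release_condition(state):
            return {"paid": amount, "status": "released"}
        return {"paid": 0, "status": "held"}
    return settle


# Hypothetical terms: pay on confirmed delivery with an unbroken cold chain.
contract = make_escrow(
    5000,
    lambda s: s["delivered"] and s["max_temp_c"] <= 4,
)

assert contract({"delivered": True, "max_temp_c": 3})["status"] == "released"
assert contract({"delivered": True, "max_temp_c": 7})["status"] == "held"
```

On a real platform the condition would be evaluated against sensor readings and delivery confirmations recorded on the ledger, which is why the article pairs smart contracts with supply chain management.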


Digital transformation: How manufacturers align maintenance and production

IT leaders are in the best position to offer ways for solving an age-old dilemma. For starters, a digital system alone can significantly improve scheduling processes to maximize the time for value-adding work. Teams at the forefront of maintenance can relate to how unplanned priorities can easily throw them off and put valuable wrench time at risk. Moreover, the downtime between work order execution, such as obtaining and preparing parts, provides opportunities to make better use of waiting time. Quantifying the sources of non-productive tasks can present ways to minimize them or, at least, present options for more efficient multi-tasking. In addition to the digitization of work schedules, a more digitally mature operation enables a facility to challenge the need for maintenance in the first place. Preventive maintenance activities conventionally require arbitrary criteria for frequencies and routines. These habits arise from a lack of better choices rather than purposeful measures. 


Building stakeholder trust in measures of sustainable performance

Regulators increasingly feel that some ESG metrics are useful indicators of good governance—that companies should know what their carbon footprint is and should ensure that they have a diverse and equitable workforce. Beyond that, companies still have a lot of wiggle room to determine what’s material, depending on the issue, the context, the time frame, and the stakeholder. That opens the door to subjectivity in definitions that can render reporting ambiguous and difficult for investors and others to interpret. Even regulations, where they exist, are usually quite broad. Their emphasis on disclosing the risks that are most important—the most material—to their stakeholders still leaves it to companies to decide what is material and who are their stakeholders. Many companies still prefer a traditional, quantifiable definition of materiality that elevates shareholders and enterprise value over a more sustainable vision that includes all stakeholders and the company’s impact on society and the environment.


DevOps Burnout? Try Platform Engineering

If cognitive load is the root of the problem, what is the solution? For many organizations, the key is platform engineering, designing and building toolchains and workflows that enable self-service capabilities for software engineering organizations in the cloud native era. Platform engineers build what is often called an internal developer platform, which covers the operational necessities of the entire life cycle of an application. Platform engineering tries to enable true DevOps by following a Platform as a Product approach to strike the right balance between maintaining developer freedom and finding the right level of abstraction. Platform teams pave paths of least resistance, called golden paths, for developers using the platform, drive standardization by design and connect various parts of the toolchain together to build a coherent and improved developer experience. This enables self-service capabilities for the organization while abstracting away the unnecessary complexity that contributes to cognitive load. Successful platforms result in less repetitive and manual work. They enable developers to do their jobs without having to learn all technologies in depth.


Reimagining the new future of work: The CXO’s transformation conundrum

Reimagining work begins with understanding the work that needs to get done in the context of where an organization aims to be in the future and how that vision aligns with the evolution of behaviors, expectations, and preferences of customers and employees. Even as some leaders strive for normalcy, people can’t unsee the freedoms, conveniences, and empowerment that come with connectedness. In fact, Salesforce research found that not only is the world forever changed, 76% of workers do not feel prepared for working in a digital-first world. In a separate Salesforce study, 54% of workers believe technology will advance faster than the skills of the workforce. The only way to bridge the gap is to assess the existing divide within the organization between current skills, skills in development, and the skills needed in the next five to ten years. The future of work is now about balance, inspired by empathy, and it lies beyond the mindsets still holding onto pre-2020 ideologies.



Quote for the day:

"If your actions inspire others to dream more, learn more, do more and become more, you are a leader." -- John Quincy Adams