Daily Tech Digest - October 29, 2022

7 ways to ruin your IT leadership reputation

Be mindful of the decisions you make. “One careless choice can ruin your reputation and your career,” warns Jim Durham, CIO of Solar Panels Network USA, a national solar panel installation company. “By being aware of the risks and taking responsibility for your actions, you can minimize the damage and learn from your mistakes,” he advises. A careless decision can be anything from selecting the wrong technology to mishandling sensitive data. “Not only are these actions career-destructive, but they can also have lasting negative effects on your enterprise,” Durham notes. CIOs are under constant pressure from management to make the right decision. It’s important to remember, however, that even the best strategies and intentions can sometimes lead to disastrous results. “If you’re unsure about a decision, it’s always better to err on the side of caution and consult with your team before making a final call,” Durham suggests. Failure, particularly a major failure, is never an option. “It shows that you’re not capable of handling important tasks,” Durham states.


Cyber Skills Shortage is Caused by Analyst Burnout

Data shows skilled and experienced professionals are leaving the industry due to burnout and disillusionment. In the UK, the cybersecurity workforce reportedly shrank by 65,000 last year, and according to a recent study, one in three current cybersecurity professionals are planning to change professions. According to ISACA’s State of Cybersecurity 2022 report, the top reasons for cybersecurity professionals leaving include being recruited by other companies (59%), poor financial incentives (48%), limited promotion and development opportunities (47%), high levels of work-related stress (45%) and lack of management support (34%). When discussing the skills shortage, many, by default, think of businesses struggling to recruit for their internal cybersecurity vacancies. However, this is equally challenging for specialist providers of consulting and managed cybersecurity services. Businesses increasingly rely on third-party managed services, particularly mid-size organizations, where outsourcing to a Managed Security Service Provider (MSSP) represents a much more commercially viable solution with considerably less up-front investment.


Data privacy is expensive — here’s how to manage costs

“The true cost of data privacy, broadly, is a company’s trust with its customers,” said Akbar Mohammed, lead data scientist, Fractal AI. “In this era of customers increasingly becoming tech-savvy, as soon as they realize that their data isn’t secure, the company will risk loss of trust from consumers. This eventually results in a lot of business disruption.” Almost all companies that need to collect data for their operations should have a data privacy infrastructure in place. Companies should also set up dedicated security and compliance teams that survey data and technology assets and maintain an aggressive threat detection policy. It’s imperative for companies today to have a data strategy, with policies and procedures governed by a data governance entity. “For large organizations, it’s best to have regular audits or assessments and get privacy-related certifications,” Mohammed said. “Lastly, train your people and make the entire organization aware of your activities, your policies.”


Architectural Patterns for Microservices With Kubernetes

Kubernetes provides many constructs and abstractions to support service and application deployment. While applications differ, there are foundational concepts that help drive a well-defined microservices deployment strategy. Well-designed microservices deployment patterns play into an often-overlooked Kubernetes strength: Kubernetes is independent of runtime environments. Runtime environments include Kubernetes clusters running on cloud providers, in-house, on bare metal, on virtual machines, and on developer workstations. When Kubernetes Deployments are designed properly, deploying to each of these and other environments is accomplished with the exact same configuration. By grasping the platform independence offered by Kubernetes, developing and testing the deployment of microservices can begin with the development team and evolve through to production. Each iteration contributes to the overall deployment pattern. A production deployment definition is no different than a developer's workstation configuration.
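The environment independence described above can be sketched in a few lines. This is a minimal illustration, not a recommended tooling choice: the manifest fields follow the standard `apps/v1` Deployment schema, but the service name, image, and context names are all hypothetical.

```python
# Minimal sketch: one Deployment definition reused unchanged across
# environments; only the cluster context differs, never the manifest.
import json

def make_deployment(name: str, image: str, replicas: int = 2) -> dict:
    """Build a Kubernetes Deployment manifest as plain data."""
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": {"app": name}},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {"containers": [{"name": name, "image": image}]},
            },
        },
    }

manifest = make_deployment("orders-api", "registry.example.com/orders:1.4.2")
serialized = json.dumps(manifest, sort_keys=True)

# Whether the target is a developer workstation cluster, bare metal, or a
# managed cloud cluster, the input to `kubectl apply` is identical; only
# the kubectl context selecting the cluster changes.
for context in ["dev-workstation", "staging-vm", "prod-cloud"]:
    assert serialized == json.dumps(
        make_deployment("orders-api", "registry.example.com/orders:1.4.2"),
        sort_keys=True,
    )
```

The same idea carries over to whatever templating layer a team actually uses; the point is that the definition, not the environment, is the unit that evolves from workstation to production.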


High data quality key to reducing supply chain disruption

With so many obstacles to overcome, the supply chain needs a saviour – and many experts are pointing to big data to fill the role. Prince believes data will become more important in this new era. He says that after Brexit, “there is uniquely new importance placed on master data, given the customs and other regulatory impacts of moving goods between the two markets”. Also, the greater risks posed in global trade and the need to be resilient mean that the predictive capabilities of data could be crucial. ... The potential of big data is clear – but to get the best results, the data involved needs to be accurate. “Data quality takes on many forms, including accuracy, completeness, timeliness, precision, and granularity,” says Laney. He points out that most organisations don’t have n-tier visibility in their supply chain, which means they don’t understand what is happening beyond the first tier of suppliers in the chain. They may also have incomplete data on where items are in the supply chain or when disruptions will happen.


Privacy assembly in Istanbul calls for adaptation to new necessities

Explaining that new challenges and needs emerged with the development of artificial intelligence (AI) and the metaverse, Koç underlined that protecting personal data should be a requirement. "Unfortunately, we pay the price for the comfort and efficiency provided by technology in the age of data, with privacy," he said. KVKK Chair Faruk Bilir, for his part, said that since becoming a member of the assembly, Türkiye has given significant importance to international efforts in the field. KVKK leads initiatives for the protection and awareness of personal data and privacy in line with the laws and regulations adopted since 2016, he added. "The protection of individuals' privacy emerges as an unchanging fact of the changing world," Bilir said. The protection of privacy is an indicator of civilization, Bilir said, underlining the importance of a human-oriented approach. Law and ethics are complementary elements of the human-oriented approach, he added. "Technology is indispensable for us, our privacy is our priority," Bilir said.


CISA Releases Performance Goals for Critical Infrastructure

Among the newly recommended measures are implementation of multifactor authentication, making sure to revoke the login credentials of former employees, disabling Microsoft Office macros and prohibiting the connection of unauthorized devices, perhaps by disabling AutoRun. The document also recommends that the operational technology side have a single leader responsible for cybersecurity and that OT and IT staff work to improve their relationship. Organizations should "sponsor at least one 'pizza party' or equivalent social gathering per year" to be attended by the two cybersecurity teams. DHS says it will actively solicit feedback about the goals in the coming months and has set up a GitHub discussions page. The department's next plan is to roll out cybersecurity goals tailored to each sector of critical infrastructure in conjunction with the agencies closest to each sector, such as the Environmental Protection Agency for water systems.


Will Twitter Sink or Swim Under Elon Musk's Direction?

Musk's accompanying "let that sink in" tweet could be, in terms of bang for the buck, the most groan-inducing dad joke of all time. But it shouldn't hide the very real business and security challenges facing Musk, who's already CEO of Tesla and SpaceX, as he takes the helm of a social network sporting 230 million customers. "The bird is freed," Musk tweeted late Thursday, before the $44 billion deal closed Friday, and Twitter filed for delisting from the New York Stock Exchange as it goes private. Like so much with Musk, commentators have been attempting to deduce his intentions on numerous fronts. Musk styles himself as a showman, having once tweeted - apparently about nothing in particular - that "the most entertaining outcome is the most likely." ... What state Twitter might be in once Musk is done with it remains unclear. Then again, when you're the richest person in the world, what's a few billion here or there, especially if it keeps people talking about you and guessing at your next move?


How to turbocharge collaboration in innovation ecosystems

Whether you call it socialization or use any other term, the human dimension of innovation is often overlooked or obscured. In part, this is because technology and the covid-19–induced migration to online platforms have garnered a great deal of attention. It’s important to remind managers that innovating as a special form of problem-solving is best tackled by empowering the workforce. Collaboration can be jump-started from many directions, but it can be only as vibrant as the company’s underlying culture of curiosity, learning, and continuous adaptation. In the Veezoo–AXA story, the formal process failed to reach a breakthrough. It was the involvement of specific individuals who were keen to see the collaboration through—often on their own terms—that led to success in building an innovation ecosystem. In fact, it is often through the behaviors and work of certain people that effective structure and discipline emerge across an ecosystem. 


Data Quality as the Centerpiece of Data Mesh

After all, data quality is always context-dependent and the domain teams will best know the business context of the data. From a data quality perspective, data mesh makes good sense as it allows data quality to be defined in a context-specific way–for example, the same data point can be considered “good” for one team but “bad” for another, depending on the context. As an example, let’s take a subscription price column with a fair amount of anomalies in it. Team A is working on cost optimization while Team B is working on churn prediction. As such, price anomalies will be a more important data quality issue for Team B than for Team A. To facilitate ownership between data products (which can be database tables, views, streams, CSV files, visualization dashboards, etc.), the data mesh framework suggests each product should have a Service Level Objective. This acts as a data contract that establishes and enforces the quality of the data the product provides: timeliness, error rates, data types, etc.
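An SLO-style data contract of the kind described above can be made concrete with a small check. This is a hypothetical sketch: the thresholds, column names, and the idea that Team B's churn model demands a stricter error rate than Team A's cost work are all illustrative assumptions, not part of any data mesh standard.

```python
# Sketch of an SLO-style "data contract" check for a data product.
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class DataProductSLO:
    max_staleness: timedelta   # timeliness guarantee
    max_error_rate: float      # tolerated fraction of bad records
    required_columns: set      # expected schema

def check_slo(slo, last_updated, total_rows, bad_rows, columns):
    """Return a list of SLO violations; an empty list means the contract holds."""
    violations = []
    if datetime.now(timezone.utc) - last_updated > slo.max_staleness:
        violations.append("stale data")
    if total_rows and bad_rows / total_rows > slo.max_error_rate:
        violations.append("error rate too high")
    missing = slo.required_columns - set(columns)
    if missing:
        violations.append(f"missing columns: {sorted(missing)}")
    return violations

# Team B (churn prediction) tolerates at most 1% anomalous prices.
team_b_slo = DataProductSLO(timedelta(hours=24), 0.01, {"customer_id", "price"})
violations = check_slo(
    team_b_slo,
    last_updated=datetime.now(timezone.utc) - timedelta(hours=2),
    total_rows=10_000,
    bad_rows=250,   # 2.5% anomalous prices: fine for Team A, not for Team B
    columns=["customer_id", "price"],
)
```

Here the same subscription-price table passes a looser contract and fails a stricter one, which is exactly the context-dependence the data mesh argument rests on.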



Quote for the day:

"Humility is a great quality of leadership which derives respect and not just fear or hatred." -- Yousef Munayyer

Daily Tech Digest - October 28, 2022

Why Phishing-Resistant MFA Is on US Government Fast Track

Many government agencies employ some type of MFA. But the Biden administration's guidelines call for all agencies to implement stronger security. While legacy MFA is more secure than using a username and password, it assumes that using a second device and adding a second factor improves security. It's not that simple. Most legacy MFA uses a combination of a password and a "something you have" factor. That "something you have" comes into play when implementing the second factor - a one-time code presented by either a physical token, a text message, or an email sent to the user. But adding a secondary device or channel is, at best, much harder to secure and, at worst, impossible to secure. Phishing campaigns can often phish the additional codes or conduct a man-in-the-middle attack on the authentication sequences, as made clear by recent breaches at Uber and Cisco. The biggest issue, however, is that most MFA solutions rely on shared secrets, like passwords, and provide no security context that ties back to the end user and their device.


AI's true goal may no longer be intelligence

To be sure, the question of genuine intelligence does still matter to a handful of thinkers. In the past month, ZDNET has interviewed two prominent scholars who are very much concerned with that question. Yann LeCun, chief AI scientist at Facebook owner Meta Platforms, spoke at length with ZDNET about a paper he put out this summer as a kind of think piece on where AI needs to go. LeCun expressed concern that the dominant work of deep learning today, if it simply pursues its present course, will not achieve what he refers to as "true" intelligence, which includes things such as an ability for a computer system to plan a course of action using common sense. LeCun also expressed an engineer's concern that without true intelligence, such programs will ultimately prove brittle, meaning, they could break before they ever do what we want them to do. ... The field of AI is undergoing a shift in attitude. It used to be the case that every achievement of an AI program, no matter how good, would be received with the skeptical remark, "Well, but that doesn't mean it's intelligent."


Building the Future of TensorFlow

We see the growth of TensorFlow not just as an achievement to celebrate, but as an opportunity to go further and deliver more value for the machine learning community. Our goal is to provide the best machine learning platform on the planet. Software that will become a new superpower in the toolbox of every developer. Software that will turn machine learning from a niche craft into an industry as mature as web development. To achieve this, we listen to the needs of our users, anticipate new industry trends, iterate on our APIs, and work to make it increasingly easy for you to innovate at scale. In the same way that TensorFlow originally helped the rise of deep learning, we want to continue to facilitate the evolution of machine learning by giving you the platform that lets you push the boundaries of what's possible. Machine learning is evolving rapidly, and so is TensorFlow. Today, we're excited to announce we've started working on the next iteration of TensorFlow that will enable the next decade of machine learning development. We are building on TensorFlow's class-leading capabilities, and focusing on four pillars.


Europe Prepares to Rewrite the Rules of the Internet

Next week, a law takes effect that will change the internet forever—and make it much more difficult to be a tech giant. On November 1, the European Union’s Digital Markets Act comes into force, starting the clock on a process expected to force Amazon, Google, and Meta to make their platforms more open and interoperable in 2023. That could bring major changes to what people can do with their devices and apps, in a new reminder that Europe has regulated tech companies much more actively than the US. “We expect the consequences to be significant,” says Gerard de Graaf, a veteran EU official who helped pass the DMA early this year. Last month, he became director of a new EU office in San Francisco, established in part to explain the law’s consequences to big tech companies. De Graaf says they will be forced to break open their walled gardens. “If you have an iPhone, you should be able to download apps not just from the App Store [but] from other app stores or from the internet,” de Graaf says, in a conference room with emerald green accents at the Irish consulate in San Francisco where the EU’s office is initially located. 


Data analytics pipeline best practices: Data governance

It's not surprising that all-in-one pipeline automation has become a holy grail for some platform providers. Many enterprises share the same cloud providers, the same department-level SaaSes, and the same types of de facto-standard databases. The clear logic behind an all-in-one platform like Gathr, for example, is that companies will often need the same connectors or "operators," much of the same drag-and-drop machine learning process assembly, and the same sorts of choices among ETL, ELT, and ingestion capabilities. Unifying all this functionality could mean less work for data and analytics teams. But enterprises should remember that the compulsion to subscribe to yet another SaaS extends to these platforms. Engineers in one business unit might gravitate to a Gathr, while others might favor an Alteryx to map together sources a BI platform might need, or a super SaaS like OneSaaS that allows simplified mixing and matching within the OneSaaS environment.


Study Shows Cybersecurity Hype Complicates the Security Stack, Expands the Attack Surface

According to the cybersecurity hype report, vendors' marketing strategies confused most security leaders: 91% of decision-makers found it difficult to select cybersecurity vendors due to unclear marketing about their specific offerings. Additionally, 49% of security leaders said their organization suffers from vendor sprawl, resulting in an increased attack surface. Consequently, 92% of organizations implement a defense-in-depth strategy and have to manage between 10 and 30 different security products. Defense-in-depth aims to create more technological layers to detect, prevent, contain, remediate, and recover from attacks. In a noisy marketplace filled with unsubstantiated claims, users cannot accurately predict the effectiveness of the hyped solutions, nor do they have the time to do so. ...  “Buyers are faced with a crowded and complex market, needing to continually layer new security products into their environment to achieve defense-in-depth, assess new and emerging AI technologies, and continually re-invest in SA&T.”


The Power of Independent Thinking in Leading

The first step in thinking for oneself is self-awareness. When you understand your values, motives, and aspirations, thinking becomes automatic. Knowing your strengths and weaknesses, you can selectively apply the knowledge you gained by reading or the wisdom of others. Thinking for oneself doesn’t mean you ignore all the knowledge you have gained on the subject. Instead, you question what your current knowledge tells you. Cultivate your thinking using mental models, which explain how things work. James Clear, the author of the best-seller Atomic Habits, describes many mental models in his blog post “Mental Models: Learn How to Think Better and Gain a Mental Edge.” One of these mental models is inversion. An example of applying inversion is to assume your most crucial project has failed six months from now and ask yourself how it could have failed. Such an exercise surfaces everything you need to look out for, so you can plan mitigations that keep the project on track. Thinking and doing go hand in hand. Put your thinking into action. Take the learning and refine your knowledge.


Keeping the cloud secure with sovereignty in mind

Being able to secure your cloud service supply not only requires data controls, but also access to legal controls. As such, hyperscalers have started adapting how they deploy cloud services to give nation states assurance — essentially meaning that cloud services are deployed in partnership with a local organisation. This has given rise to sovereign partnerships that license the hyperscaler technology, and are delivered by suppliers under the local legal framework. This pragmatic approach has slowly become more common in recent months, and helps overcome many of the risks associated with using cloud, particularly its assurance of service supply. Despite this, one of the biggest barriers to cloud is the current regulatory landscape surrounding how certain sectors need to control data sovereignty and how that data is securely processed. This often requires a long list of requirements that must be fulfilled to shift services onto the cloud, which is unique for each industry.


The Arguments for Open Source in Mainframes

The arguments for OSS on the mainframe are in many cases the same as for OSS on any other platform -- more accessible, often more secure, easier to develop. “These arguments are from the same development teams who push for OSS elsewhere in the environment,” says Mike Parkin, senior technical engineer at Vulcan Cyber. “The major differences are when the implementation is specific to the mainframe environment.” ... Parkin adds there has been a trend to use mainframe platforms for virtualization, essentially replacing a rack of commodity class servers with a single Big Iron machine that can do the job more efficiently and effectively. “Those are ideal use cases for open-source software at multiple levels, from the guest operating systems to the application layers,” he says. Boris Cipot, senior security engineer at Synopsys Software Integrity Group, a provider of integrated software solutions, agrees that open source can bring fresher and better integrations into today’s working processes and tools, and enable companies to focus on their work and not re-create existing software functionality.


Why We Need A Cyber Intelligence Revolution

Unfortunately, the challenges many organizations face include narrowing down which intelligence sources they’re pulling from, how many can be leveraged at a time, and how they’re integrated into firewalls and other security solutions. No one source of threat intelligence or existing security control can successfully cover the entirety of the threat landscape. It is critical for organizations to deploy threat intelligence from multiple sources, even those that traditionally would compete with one another. These can include commercial providers, open source intelligence data, government agencies and industry sources—all working together to provide organizations with visibility into the traffic affecting their networks. The data is in and the results are clear: What we don't know in the cybersecurity world can hurt us. Thankfully, there are steps your organization—regardless of size—can take to help ensure your network, users and data are protected.



Quote for the day:

"You may be good. You may even be better than everyone else. But without a coach you will never be as good as you could be." -- Andy Stanley

Daily Tech Digest - October 27, 2022

Network observability: What it means to vendors and to you

Network observability represents an evolution of network monitoring. Network observability solutions should dive deeper into networks, collecting a more diverse and voluminous set of data to give network teams total end-to-end visibility into operations. Those solutions should broaden their scope, looking not just at network performance, but end-user experience, business impacts, and security. Finally, network observability should focus less on tinkering with how it presents data, which ultimately forces network engineers to glean insights themselves, doing too much of the heavy lifting in their heads. Instead, network observability should emphasize actionable insights derived in a variety of ways, including AI and machine learning and low-code scripted automation. The former relies on algorithms to make tools more intelligent. Many vendors are driving toward actionable insights with AIOps, and our research shows that NetOps pros see tremendous potential with these algorithms.


Resume makeover: Transforming a career post-mortem into a C-suite future

One trap IT leaders often fall into when seeking a new job is viewing their resume as a historical document of their career. The reality is that your resume should paint a clear picture of your career’s future, detailing your past work experience as a roadmap that leads inevitably to your next leadership gig. But striking that balance between detailing the past and mapping toward the future can be challenging, especially while keeping your resume to the point. ... As a general rule, a professional resume should be a concise 1-2 pages when applying for corporate roles. Recruiters read through thousands of resumes, so they’re more likely to lose focus or abandon your resume altogether if they can’t get a sense of your qualifications within the first few minutes. ... Including executive summaries and a sidebar with your education, skills, and credentials is a great way to remove redundancies from your work experience, allowing you to focus on specific accomplishments at each role, while consolidating your evergreen skills, expertise, and knowledge into short and simple lists.


How to attract more women into software development

Regardless of gender, it boils down to competence and confidence, said Archana. “Building your competence is extremely important, and with that competence comes confidence. Keep learning, build your competence, be confident about yourself, and don’t be worried about too many setbacks,” she added. “When you are a subject matter expert, gender is almost invisible at the table because people are listening to you for your expert opinions, for your knowledge in the area. And you earn respect from that.” While more could be done to encourage gender diversity, Manjunatha called for women to upskill often. “Keep yourself updated,” added Manjunatha. “Technology is constantly evolving. What got you here is not going to get you there tomorrow, so always keep yourself updated. The growth mindset and that ability to want to keep learning, that’s very, very important if you’re in this space.” While upskilling, e-learning or retraining can be achieved without going through a certification course, Kwong noted that certification is a means to benchmark one’s competency and skillsets.


Australia seeks stiffer penalty for data breaches

Following the update, companies found to have committed the breaches will be fined AU$50 million, or three times the value of any benefit obtained through the misuse of information, or 30% of the company's adjusted turnover in the relevant period, whichever is greater. The Bill also will afford the Australian Information Commissioner "greater power" to resolve privacy breaches as well as strengthen the Notifiable Data Breaches scheme, which will provide the Commissioner with full knowledge of the information that was compromised in a breach so it can assess the risks of harm to affected individuals. In addition, the Commissioner and the Australian Communications and Media Authority will be better empowered to share information in the event of a data breach. Dreyfus said: "When Australians are asked to hand over their personal data they have a right to expect it will be protected. Unfortunately, significant privacy breaches in recent weeks have shown existing safeguards are inadequate."


Does Your Database Really Need to Move to the Cloud?

When it comes to global-scale, multicloud and hybrid use cases, it’s important to consider how you ensure data remains consistent across regions while ensuring applications are running as quickly as possible, Powers added. Redis Enterprise offers Active-Active Geo Distribution, to allow local speed read and writes while ensuring consistent data is replicated across regions, with less than a millisecond of latency. So, even if the long-term goal is full application modernization, Powers said, “There are places where you can still use Oracle or MySQL, and patch us alongside, to fix it in the interim, while you’re making these transitions.” In these cases, he argued, “The modernization is around speed, it’s around scale, it’s around total cost of ownership.” So, the question of how to modernize your database becomes far more nuanced than whether you can afford the time and money to embark on a complete refactoring and re-platforming project.


What challenges are hardest to avoid when managing data in the cloud?

Data is moving to the cloud because it is an excellent place to store, manage, and analyze data. The cloud breaks down information silos that exist in on-premises computing, making it much easier to share data internally and with business partners and customers. However, when you put all your data in one place, you also must implement safeguards that govern the use of the data — most importantly data access control. This has proven to be a challenge for technology vendors and for the organizations that are managing their data in the cloud. The underlying problem is caused by SQL. The industry-standard database query language is a core element of the Modern Data Stack, which is the ecosystem of technologies that enable us to manage data in the cloud. But while SQL is great for business analytics, it cannot support the complex, graph-oriented relationships required for data governance. 
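The "graph-oriented relationships" point above can be made concrete with a toy example. This is a hypothetical sketch, not any vendor's governance model: the users, groups, roles, and datasets are invented, and the point is only that an access decision here follows a chain of relationships of arbitrary depth, which a fixed set of SQL joins cannot express in general.

```python
# Sketch: graph-shaped access control (user -> group -> role -> dataset
# hierarchy), resolved by graph traversal rather than a fixed SQL join.
from collections import deque

# Directed edges, each meaning "the left node reaches/grants the right node".
edges = {
    "alice": ["analysts"],           # user belongs to a group
    "analysts": ["sales_reader"],    # group holds a role
    "sales_reader": ["sales_db"],    # role grants access to a dataset
    "sales_db": ["orders_table"],    # dataset contains a table
}

def has_access(subject: str, resource: str) -> bool:
    """Breadth-first search: can the subject reach the resource?"""
    seen, queue = {subject}, deque([subject])
    while queue:
        node = queue.popleft()
        if node == resource:
            return True
        for nxt in edges.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False
```

A SQL query can check one known chain of memberships, but it cannot follow chains whose length is not known in advance; a traversal like this (or a dedicated policy/graph engine) can, which is the gap the passage describes.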


From zero to 10 million lines of Kotlin

Going into this migration, we had two options: we could make it possible to write new code at Meta using Kotlin but leave most of the existing code in Java, or we could attempt to convert almost all our in-house code into Kotlin. The advantage of the first option is clear — it’s much less work. But there are two notable disadvantages to this approach. First, enabling interoperability between Kotlin and Java code introduces the use of platform types in Kotlin. Platform types give rise to runtime null pointer dereferences that result in crashes instead of the static safety offered by pure Kotlin code. In some complicated cases, Kotlin’s null check elision can let nulls through and create surprising null pointer exceptions later. This could happen if, for example, Kotlin code calls a Kotlin interface implemented by a Java interface. Other issues include Java’s inability to tag type parameters as nullable (until recently), and Kotlin’s overloading rules taking nullability into account, while Java’s overloading rules do not.


Using Remote Agile Governance to Create the Culture Organisations Need

Agility is not a best practice, but a mindset for uncovering good and better practices. This requires an emergent, context specific good practice of governance. Remote:AF and Esther Derby have spent the last year working to generate a process for just this approach to governance. More and more, organisations are realising that if they truly want to change their culture, they must change their governance. Agility started with the scrum software team, spread through the IT department in the form of DevOps, and through the rest of the organisation in Business Agility - but governance has, until now, been a holdout from this evolution. As long as it remains so, it has the potential to have an out-sized impact, holding the business back from true agility. ... Firstly, let’s define what we mean by governance in the context of this article. Governance is not (just) forums, meetings, and reports. It is all the ways an organisation makes decisions to enact strategy.


Digital transformation: 4 questions to help drive momentum

Enterprises often get locked in a cycle, fixing various aspects of customer experience (CX), operational efficiency, business model innovation, etc. For example, the most commonly cited business driver for digital transformation is customer experience. But while implementing it – through mobility, front-end workflows, and chatbots, for instance – organizations realize that operational efficiency is equally important, if not more so. So they fix the backend, only to realize that they’re missing the bus on the business model. And the cycle continues, making it difficult for enterprises to scale beyond CX use cases in marketing and customer service. ... Addressing technical debt and legacy technologies is a difficult challenge. A well-defined architectural blueprint early on can enable a holistic digital transformation in the long term. It can help identify the best use cases while balancing quick wins with foundational elements. However, if you are in the middle of your journey, it’s essential to tackle the problem of technical debt to move forward.


Why Passkeys Are Better Than Passwords

With passkeys, passwords are simply no longer a threat vector. Passwords account for north of 80% of all security breaches. Passkeys mitigate this threat down to almost nothing. You can’t reuse your passkeys. You don’t have to remember them. They are generated and stored for you, so you don’t have to worry about creating and storing them yourself. You can’t be lured into giving them up because they are unique to a specific website and thus can’t be shared with a phishing website. Sensitive data associated with each passkey never leaves your device. The information is stored on your phone on a special chip (a Trusted Platform Module) that even the NSA might not be able to crack. If you register with a website using a passwordless solution like Passage, that site gets nothing but a public key, which is useless for cracking open your account. While Apple lets you share your account with others via AirDrop, you couldn’t even share the actual private key with a phishing site if you wanted to.
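The phishing-resistance described above comes from two properties: every website gets its own credential, and the response is cryptographically bound to the site's origin. The sketch below illustrates both properties conceptually; real passkeys (WebAuthn) use asymmetric key pairs held in a secure chip, and the HMAC here is only a stand-in for that signature. All names and values are illustrative.

```python
import hashlib
import hmac
import os

# Conceptual sketch only: real passkeys (WebAuthn) use asymmetric keys
# stored in a TPM/secure enclave; HMAC stands in for the signature here
# purely to illustrate per-site keys and origin binding.

def derive_site_key(master_secret: bytes, origin: str) -> bytes:
    """Each website gets its own key, so credentials cannot be reused."""
    return hashlib.sha256(master_secret + origin.encode()).digest()

def sign_challenge(site_key: bytes, origin: str, challenge: bytes) -> bytes:
    """The response is bound to the origin, so a signature produced for
    one site is rejected by every other site."""
    return hmac.new(site_key, origin.encode() + challenge, hashlib.sha256).digest()

master = os.urandom(32)          # would live in the device's secure chip
real_key = derive_site_key(master, "https://example.com")

challenge = os.urandom(16)
response = sign_challenge(real_key, "https://example.com", challenge)

# The legitimate site verifies successfully...
assert hmac.compare_digest(
    response, sign_challenge(real_key, "https://example.com", challenge))

# ...but the same response is useless on a look-alike phishing origin.
phish_key = derive_site_key(master, "https://examp1e.com")
assert response != sign_challenge(phish_key, "https://examp1e.com", challenge)
```

Because the key derivation includes the origin, there is simply no shared secret a phishing site could capture and replay against the real one.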



Quote for the day:

"One of the sad truths about leadership is that, the higher up the ladder you travel, the less you know." -- Margaret Heffernan

Daily Tech Digest - October 26, 2022

IT leaders aren't getting listened to, and now they're ready to walk away

Vijay Sundaram, chief strategy officer of Zoho Corporation, said even though IT teams have been "indispensable to business innovation and continuity" in recent years, senior management continue to overlook their input in larger business decisions. This is despite the fact that 88% of respondents believe IT is more responsible for business innovation than ever before, while 85% agree IT could drive even greater innovation in the business if they had a stronger leadership position. Sundaram noted that the role of IT within organizations would become increasingly important as hybrid working and decentralized teams became mainstream. Indeed, 99% of survey respondents said their organization had already moved to a hybrid model. "This will require the expertise and involvement of ITDMs to identify appropriate technologies and meet corporate guidelines in areas like compliance, privacy and security," he added.


How Will AI Technology Change Leadership In The Future?

Data-driven and AI-minded CFOs are already using AI technologies as they predict and report on financial performance, growth plans, fiscal compliance and operating expenses. Even Microsoft Excel spreadsheets are tapping into the power of AI for visualization, dynamic arrays and queries. New AI and machine learning modeling techniques for forecasting, budgeting and investing will shape the nature of finance and its structure. Finance and accounting teams may no longer need accounting clerks to scan invoices and do manual data entry due to intelligent document processing and RPA systems that can automate repetitive tasks. CFOs who incorporate AI in their work are in a powerful position to link predictive analytics with customer behavior. This can result in pricing changes and higher profitability as well as fraud prevention. EY reports that U.S. companies have about $100 billion of bad debt (customers who will be late in paying or will not pay at all) and that a variety of AI tools can be used to remedy that. 


How to Keep Distractions From Hampering IT Staff Performance

It’s important to remember that staff members can't always be totally focused. “Assume there will be times when teams will seek distractions,” Stockall says. Stockall suggests holding fireside chats that allow team members to ask questions within an informal setting. “By stepping out of the formalized communication channels, leaders can hear firsthand what their employees want to learn more about as well as hear direct feedback on how the business can do better.” Jabes believes that when team members are given a set amount of time to complete a task, they're less likely to get sidetracked. ... Breaking down tasks into smaller pieces can make it easier to stay focused on a task, Jabes adds. Enterprises are finally beginning to understand that they need to treat their employees as individuals first, Galperin says. “When employees feel empowered to contribute to the creation of their ideal work environment, they are more motivated and inspired to give the company their best,” she notes. “The right environment will foster focus and high performance.”


How to navigate the current 5G and IoT threat landscape

For IoT devices, Arora recommends the use of networking segmentation and slicing to keep devices segregated from potential threats. He also emphasized the criticality of a differentiated implementation plan, IPS/IDS systems designed to protect IoT devices and their respective networks, and a thorough and periodic risk review. I would also urge companies to routinely patch and update IoT devices, utilize strong password measures and avoid authenticating to company systems or transmitting data over public networks. Where possible, implement device tracking and monitoring, and always utilize an employee check-in and check-out process for handing out IoT devices and returning them. Be sure to confirm terminated employees have no such devices remaining in their possession as well. Any given information set is only going to be as valuable as when it was last released, updated or examined. Threat vectors continually evolve and new risk variants are inevitable, so make sure to subscribe to vendor alerts and newsletters and stay up on the latest developments and terms.


India's Revolutionary ONDC Policy On Hold - Thanks To Data Privacy Issues

The policy is still being developed, but it is anticipated to cover a number of important issues, such as data protection, utilizing platforms with foreign ownership, and supporting domestic e-commerce companies. An important step toward promoting open networks for all facets of trading products and services over digital or electronic networks is the Ministry of Commerce’s recent introduction of ONDC. To the detriment of vendors, the Indian government contends that foreign-funded private businesses Flipkart and Amazon currently control the majority of the country’s e-commerce market. ... On several topics, including worries about security and data privacy, various parties, including online retailers, have been looking for clarification. According to reports, the government is also considering whether to introduce legislation to control non-personal data. The action is being taken as the government works to encourage emerging technologies like artificial intelligence and data analytics and accelerate the expansion of the nation’s digital economy.


New Method Exposes How Artificial Intelligence Works

In a surprising discovery, Jones and his collaborators from Los Alamos, Jacob Springer and Garrett Kenyon, as well as Jones’ mentor Juston Moore, applied their new network similarity metric to adversarially trained neural networks. They discovered that as the severity of the attack increases, adversarial training causes neural networks in the computer vision domain to converge to very similar data representations, regardless of network architecture. “We found that when we train neural networks to be robust against adversarial attacks, they begin to do the same things,” Jones said. There has been an extensive effort in industry and in the academic community searching for the “right architecture” for neural networks, but the Los Alamos team’s findings indicate that the introduction of adversarial training narrows this search space substantially. As a result, the AI research community may not need to spend as much time exploring new architectures, knowing that adversarial training causes diverse architectures to converge to similar solutions.
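Adversarial training, as discussed above, means learning on inputs that have been deliberately perturbed to maximize the loss. A minimal, self-contained sketch of the technique on a toy 1-D logistic model is below (FGSM-style perturbations); it is purely illustrative of the general method and is not the Los Alamos team's model, metric, or code.

```python
import math
import random

# Toy sketch of adversarial (FGSM-style) training on a 1-D logistic model.
# Illustrative of the general technique only.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sign(v):
    return (v > 0) - (v < 0)

def fgsm(x, y, w, b, eps):
    """Perturb x in the direction that increases the loss most."""
    p = sigmoid(w * x + b)
    grad_x = (p - y) * w          # dLoss/dx for cross-entropy loss
    return x + eps * sign(grad_x)

def train(data, eps, steps=2000, lr=0.1):
    w, b = 0.0, 0.0
    random.seed(0)
    for _ in range(steps):
        x, y = random.choice(data)
        if eps > 0:               # adversarial training: learn on worst-case inputs
            x = fgsm(x, y, w, b, eps)
        p = sigmoid(w * x + b)
        w -= lr * (p - y) * x     # gradient descent on cross-entropy
        b -= lr * (p - y)
    return w, b

# Two well-separated classes centered around -1 and +1.
data = [(-1.0 + 0.1 * i, 0) for i in range(5)] + [(1.0 - 0.1 * i, 1) for i in range(5)]
w_plain, _ = train(data, eps=0.0)
w_robust, _ = train(data, eps=0.4)
print(round(w_plain, 3), round(w_robust, 3))
```

Both runs learn the same decision boundary, but the adversarially trained weights are shaped by the worst-case inputs rather than the clean ones, which is the mechanism the researchers found drives very different architectures toward similar representations.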


Postgres is eating relational

Of course, for many enterprise workloads, the people doing the architectures actually aren’t employed by the enterprise but get engaged as consultants. Within the largest global system integrators, there’s that built-in relational experience and, from my conversations with folks in the industry, this tends to be their primary reason for pushing PostgreSQL. During and after the pandemic, there has been huge demand to modernize enterprise infrastructure to make enterprises more agile and responsive to rapidly evolving customer requirements. Those global system integrators take the modernization projects and often apply the technologies that are easiest for them to deploy, netting them the best margins on their services. We can argue about whether this is actually the best thing for customers wanting to modernize, but it’s not hard to understand the underlying logic. Now, if you’re me, working for a document database company, it’s fair to think this apparent overreliance on relational is more due to inertia than a concerted attempt to embrace modern data infrastructure.


Bernd Greifeneder – unifying data for maximised visibility and intelligence

Interestingly, the biggest challenge is not technology, despite building technology that no one has done before. The hardest challenge is always figuring out how to get the right talent in the right areas. This goes back to this need to change the organisation with every doubling in size, which has been key to retaining our entrepreneurial notion. I realised that as we grow towards around 300 people, this doubling in size was relatively easy, because it was the founding team and some of the first employees, and we all have this entrepreneurial attitude and desire to get things done and better than the competition. But then, as you keep hiring and hiring, and the new hires onboard newer hires, suddenly, with 300 people and beyond that becomes really hard. At 500 people, something hit me – the new guys had no clue anymore of who we are, why they come to the office every day in the morning, what motivates them. Not even the mentorship programs we had in place worked. So, I need to make sure that we are explicit. And then it took me a while to figure out how can I make it explicit? 


5 Ways Banks Can Use Blockchain To Improve ESG Efforts

For ESG monitoring purposes, one of blockchain’s primary uses is bringing a bird’s-eye view to supply chain management. A more sustainable, energy-efficient supply chain could deliver profound savings in transportation costs and the concurrent curtailing of carbon emissions that a better managed, more efficient system would bring. With distributed ledger technology, transactions at every step of the supply chain can be recorded and distributed. This brings an unheard-of level of transparency and traceability to the movement of goods around the globe. With automated IoT interfaces, data collection is seamless and less contingent on overworked individuals. This transparency also makes monitoring ethical sourcing in industries that have long presented a challenge to regulators, such as seafood harvesting, more attainable. Products, whether raw or processed, can be tracked early in the production cycle, and the information is available to end users long before items are even delivered—with the journey tracked in real time via blockchain.
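The traceability property described above rests on the hash-chaining at the heart of distributed ledgers: each recorded event includes a hash of the previous one, so editing any earlier step breaks every later link. A minimal single-node sketch is below; a real deployment would of course be distributed with consensus among parties, and the event strings are invented for illustration.

```python
import hashlib
import json

# Minimal sketch of an append-only, hash-chained ledger of supply-chain
# events. Single-node and illustrative only: it shows why tampering is
# detectable, not how a distributed ledger reaches consensus.

def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class Ledger:
    def __init__(self):
        self.chain = [{"event": "genesis", "prev": "0" * 64, "ts": 0}]

    def record(self, event, ts):
        self.chain.append({
            "event": event,                      # e.g. "caught", "processed"
            "prev": block_hash(self.chain[-1]),  # link to the previous block
            "ts": ts,
        })

    def verify(self):
        """Recompute every link; any edit to an earlier block breaks the chain."""
        return all(
            self.chain[i]["prev"] == block_hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )

ledger = Ledger()
ledger.record("tuna caught, vessel #12", ts=1)
ledger.record("processed at plant A", ts=2)
ledger.record("shipped to retailer", ts=3)
assert ledger.verify()

ledger.chain[1]["event"] = "tuna caught, vessel #99"   # tampering attempt
assert not ledger.verify()
```

It is this link structure, combined with distribution across many parties, that gives regulators and end users confidence that a product's recorded journey has not been quietly rewritten.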


Edge and cloud: 4 reasons to adopt both

Edge and cloud computing options each have their unique advantages, and the ideal solution for your team will depend on factors that are pertinent to your industry and organization. It’s essential to weigh the pros and cons carefully and to be mindful of the implications for issues like data security and regulatory compliance, which can vary considerably by industry and the operations you support. CIOs who haven’t been involved in a hybrid computing strategy implementation before may want to consider working with a consultant or managed services provider who has experience with a project of this type. Someone who has handled similar implementations can provide insight and advice to help you realize the full benefits and avoid pitfalls. Remember that a hybrid strategy can allow your organization to achieve performance levels that drive innovation and attract and retain customers while bringing products to market more quickly, conserving resources, and adapting to changing workforce needs. 



Quote for the day:

"It is better to look ahead and prepare than to look back and regret." -- Jackie Joyner-Kersee

Daily Tech Digest - October 25, 2022

Digital-first businesses more willing to accept some fraud

“For new companies, it’s about growth – fraud in that regard can be something like you have a promotion on and people are signing up for multiple accounts. “Digital transformers are also trying to compete, so accessibility, speed, low friction and completion rates rank above compliance. Ten years ago, compliance was higher, but for digital-first companies, user experience concerns are at the top of the agenda.” However, Li stressed that this was not to say that businesses are neglecting their legal compliance obligations, but more that they simply would not survive if they provided an archaic experience that caused friction for the potential customer, such as demanding they send notarised documents through the mail. Indeed, nearly half – 46% – of respondents did feel that their customer onboarding process was still too complex, rising to 55% in the UK. Frequent complaints were that it took too long to review and verify customers when onboarding them, leading to user drop-off, increased costs, and lost revenues.


Enhance Data Analytics with oneDAL

Intel® oneAPI Data Analytics Library (oneDAL) is a library with all the building blocks required to create distributed-data pipelines to transform, process, and model data, complete with all the architectural flexibility of oneAPI. This can be achieved using Intel® Distribution for Python*, C++, or Java APIs that can connect to familiar data sources such as Spark* and Hadoop*. ... oneDAL has tools for transferring out-of-memory data sources, such as databases and text files, into memory for use in analysis, training, or prediction stages. And if the data source cannot fit into memory, the algorithms in oneDAL also support streaming data into memory. Data scientists often spend large amounts of time preparing the data for analysis or machine learning (ML). This includes converting data to numeric representation, adding or removing data, normalizing it, or computing statistics. oneDAL offers algorithms that accelerate these preparation tasks, speeding the turnaround of steps that are often performed interactively.
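The streaming mode mentioned above computes results incrementally as chunks arrive, so the full dataset never needs to fit in memory. A generic sketch of that idea (running mean and variance via Welford's online algorithm) is below; it illustrates the pattern only and does not use the oneDAL APIs themselves.

```python
# Generic sketch of the streaming idea: compute summary statistics over a
# data source too large for memory by folding in one chunk at a time.
# Illustrative only; this does not use oneDAL's own APIs.

class StreamingStats:
    """Running count/mean/variance via Welford's online algorithm."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, chunk):
        for x in chunk:
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)

    @property
    def variance(self):
        return self.m2 / self.n if self.n else 0.0

stats = StreamingStats()
for chunk in ([1.0, 2.0], [3.0, 4.0], [5.0]):   # chunks arriving from disk/DB
    stats.update(chunk)

print(stats.mean, stats.variance)   # matches computing over all data at once
```

The same fold-in-a-chunk pattern applies to the heavier preparation steps the paragraph mentions, such as normalization, which is why streaming support matters for interactive work on large data.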


Google Unveils Its Latest Voice Innovations

Since releasing its first speech patent in 2001, Google has led the way in voice innovation. From interacting with Google Assistant to live captioning in Google Meet, it now boasts an extensive voice suite of tools. Within this are two core innovations: its Speech-to-Text and Text-to-Speech APIs. The Speech-to-Text API supports short and long form speech in over 75 languages and 120+ locales – out-of-the-box – without the need for training and customization. Of course, for some use cases, businesses may demand customization. As such, the API is flexible, allowing users to harness it across various audio channels. It also detects multiple speakers in the same channel, with the solution recognizing their unique voices. ... Moreover, companies can create captions and subtitles for media content or build a virtual agent. Yet, it is also possible to use the technology for speech analysis, summarization, and extraction – each of which has significant potential for contact centers. In tandem, many businesses harness Google’s Text-to-Speech API to communicate with their users. It allows them to take text and synthesize it into audio in a single step.


Why Sensors Are Key to IoT Cybersecurity

Sensors enabled by the Internet of Things are network-connected smart devices that collect and transmit real-time data about their environment. The data they provide lets people make better-informed decisions. The use of IoT sensors has grown explosively in recent years because their increasing functionality, small size, and low power consumption allow designers to deploy them in new applications to increase productivity and lower costs. The sensors are being used in new ways to maximize the capabilities of network-connected systems and infrastructure. The sensors are poised for mass-scale adoption in fields including automotive, health care, industrial automation, energy, and smart cities. But the lack of standardization in IoT sensors, coupled with interoperability challenges, has made them vulnerable to cyberattacks—which creates barriers for their ubiquitous use. Hackers are targeting IoT sensors in greater numbers, in more industries, and with increased sophistication.


Transforming Observability

Digital transformation, product and technology leaders see value in observability because of its potential to measure digital experiences and measure the performance of business and digital services. To do this requires observability to meet three significant challenges. First, observability must effectively cross the complex boundaries of microservices, containers, cloud and traditional applications, multiple cloud providers, database sources, SaaS services, infrastructure and internal and external APIs. Today’s challenge is far beyond the central aggregation of large volumes of log data and suppressing non-essential alerts. Most enterprise architectures look eerily similar to a breadboard wiring project with applications, systems and data sources crisscrossing each other, representing the various pathways and interfaces across systems. Virtually any of these elements could contribute to the degradation of a digital experience, and observability must operate across these elements whether they live in our tightly controlled data centers or are distributed in microservices, cloud services or third-party interfaces.


Web 3.0 and the Crowdpoint Constellation

Web 3.0 is about the individual. The underlying technologies that will enable it are personal identification technologies (biometrics), the blockchain and distributed data technology. Let’s not worry about how, right now, let’s just paint the picture. Web 2.0 was all about exploiting data — a great deal of which was your data. The big web businesses mined it to their great enrichment, with the best AI tools known to man. However, it is equally possible for people to band together and mine their collective personal data to their own benefit. This has not yet happened, but the technologies mentioned above make it possible. Now if it were up to the individual to do this on their own initiative, of course, probably nothing would happen. ... If you’ve been tracking the evolution of the blockchain world you will realize that it has evolved a long way beyond the creation and marketing of cryptocurrencies. It is no longer all about speculation. It has stepped boldly into the financial sector, with the creation of services that are commonly described as OpenFi (Open Finance) or DeFi (Decentralized Finance).


Improving finance and accounting software with AI

Starting with audit analytics, auditors tend to spend too much time buried in compliance checklists and creating reports that few people read, with little time to seek anomalies in every transaction. Rather than manually sampling data points, Forrester says machine learning is being used for risk assessment of transactions. The member-based industry association American Institute of Certified Public Accountants (AICPA) is developing guidance for ML in the audit function. Mature audit support providers such as Thomson Reuters and Wolters Kluwer, as well as emerging companies like Caseworks Cloud and MindBridge, are embedding AI into their audit platforms.
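The shift described above is from sampling a handful of transactions to risk-scoring all of them. A deliberately simple sketch of that kind of full-population screening is below (a z-score outlier test); the transaction amounts and the threshold are invented for illustration, and real audit platforms use far richer models.

```python
import statistics

# Toy sketch of full-population transaction risk scoring: flag entries
# that deviate strongly from the norm instead of manually sampling.
# A simple z-score model, not any vendor's actual audit platform.

def risk_scores(amounts):
    mu = statistics.mean(amounts)
    sigma = statistics.pstdev(amounts)
    return [abs(a - mu) / sigma if sigma else 0.0 for a in amounts]

def flag_anomalies(amounts, threshold=2.0):
    """Return the amounts whose deviation exceeds the threshold."""
    return [a for a, s in zip(amounts, risk_scores(amounts)) if s > threshold]

transactions = [102.0, 98.0, 105.0, 99.0, 101.0, 5000.0]   # one outlier
print(flag_anomalies(transactions))   # the 5000.0 entry stands out
```

Even this crude screen looks at every transaction rather than a sample, which is the basic advantage ML-based risk assessment brings to the audit function.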


Atlassian Vulnerabilities Highlight Criticality of Cloud Services

The combination of the two flaws could allow a significant attack, says Jake Shafer, a security consultant with Bishop Fox, who found the flaws. "Using the authorization finding would allow a low-privileged user to elevate their role to super admin which, in terms of information disclosure, would allow the attacker to gain access to everything the client of the SaaS had in their Jira deployment," he says. "From there, the attacker could then leverage the SSRF finding to go after the infrastructure of Atlassian themselves." Both vulnerabilities have been patched — the first within a week and the second within a month, according to the disclosure timeline published by Bishop Fox. However, companies should note that the increasing reliance on cloud applications has made attacks on cloud services and workloads much more common, so much so that the top class of vulnerability, according to the Open Web Application Security Project (OWASP), is broken authentication and access-control issues.


When CISOs are doomed to fail, and how to improve your chances of success

Sometimes, CISO candidates can spot a bad employer during the interview process. "You are not only trying to convince them that you are the person they should hire, but you are interviewing them," Callas says. The recruiting process is just like zero-knowledge proof, because neither side wants to be upfront about what is going on. One of Callas's priorities is to learn how much the company cares about security, and he does that by asking direct questions. One time, an executive he talked to admitted that management did not want better protection. A typical question potential CISOs are asked is what they might do in a difficult situation such as a breach. When Callas hears this, he smiles and says: "Has this actually happened?" "Sometimes they'll say, 'Oh, no, no, no,' in a way that you know means yes," he adds, "and every so often, you get the person who looks around and says: 'Let me tell you what's really going on.'" Another priority should be understanding to whom the CISO reports: the CEO, the CFO, the CTO, or even the legal department. "[This] tells you a little bit about what they expect you to do," says Chip Gibbons, CISO at Thrive.


Why Functional Programming Should Be The Future Of Software Development

Pure functional programming solves many of our industry’s biggest problems by removing dangerous features from the language, making it harder for developers to shoot themselves in the foot. At first, these limitations may seem drastic, as I’m sure the 1960s developers felt regarding the removal of GOTO. But the fact of the matter is that it’s both liberating and empowering to work in these languages—so much so that nearly all of today’s most popular languages have incorporated functional features, although they remain fundamentally imperative languages. The biggest problem with this hybrid approach is that it still allows developers to ignore the functional aspects of the language. Had we left GOTO as an option 50 years ago, we might still be struggling with spaghetti code today. To reap the full benefits of pure functional programming languages, you can’t compromise. You need to use languages that were designed with these principles from the start. Only by adopting them will you get the many benefits that I’ve outlined here. But functional programming isn’t a bed of roses. It comes at a cost.
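The "dangerous features" the article refers to are chiefly mutation and shared state. The contrast is easy to show even in a hybrid language; the sketch below uses Python's functional subset purely for illustration (Python remains fundamentally imperative, exactly the hybrid case the article critiques).

```python
from functools import reduce

# Sketch of the functional style the article advocates: pure functions and
# immutable data, shown in Python's functional subset for illustration.

def add_item_impure(cart, item):
    cart.append(item)          # mutates shared state: callers can be surprised
    return cart

def add_item_pure(cart, item):
    return cart + (item,)      # returns a new tuple; the input is untouched

cart = ("book",)
new_cart = add_item_pure(cart, "pen")
assert cart == ("book",)               # original unchanged: easy to reason about
assert new_cart == ("book", "pen")

# Pure functions compose freely, e.g. a total computed without any mutation:
prices = {"book": 12.0, "pen": 2.5}
total = reduce(lambda acc, item: acc + prices[item], new_cart, 0.0)
assert total == 14.5
```

Nothing stops a Python programmer from calling the impure version instead, which is precisely the article's point about hybrid languages: the guarantees only hold when the language itself forbids the escape hatches.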



Quote for the day:

"Make heroes out of the employees who personify what you want to see in the organization." -- Anita Roddick