
Daily Tech Digest - March 18, 2024

Generative AI will turn cybercriminals into better con artists. AI will help attackers to craft well-written, convincing phishing emails and websites in different languages, enabling them to widen the nets of their campaigns across locales. We expect to see the quality of social engineering attacks improve, making lures more difficult for targets and security teams to spot. As a result, we may see an increase in the risks and harms associated with social engineering – from fraud to network intrusions. ... AI is driving the democratisation of technology by helping less skilled users to carry out more complex tasks more efficiently. But while AI improves organisations’ defensive capabilities, it also has the potential for helping malicious actors carry out attacks against lower system layers, namely firmware and hardware, where attack efforts have been on the rise in recent years. Historically, such attacks required extensive technical expertise, but AI is beginning to show promise to lower these barriers. This could lead to more efforts to exploit systems at the lower level, giving attackers a foothold below the operating system and the industry’s best software security defences.


Get the Value Out of Your Data

A robust data strategy should have clearly defined outcomes and measurements in place to trace the value it delivers. However, it is important to acknowledge the need for flexibility during the strategic and operational phases. Consequently, defining deliverables becomes crucial to ensure transparency in the delivery process. To achieve this, adopting a data product approach focused on iteratively delivering value to your organization is recommended. The evolution of DevOps, supported by cloud platform technology, has significantly improved the software engineering delivery process by automating development and operational routines. Now, we are witnessing a similar agile evolution in the data management area with the emergence of DataOps. DataOps aims to enhance the speed and quality of data delivery, foster collaboration between IT and business teams, and reduce the associated time and costs. By providing a unified view of data across the organization, DataOps enables faster and more confident data-driven decision-making, ensuring that data remains accurate, current, and secure. It automates and brings transparency to the measurements required for agile delivery through data product management.


Exposure to new workplace technologies linked to lower quality of life

Part of the problem is that IT workers need to stay updated with the newest tech trends and figure out how to use them at work, said Ryan Smith, founder of the tech firm QFunction, also unconnected with the study. The hard part is that new tech keeps coming in, and workers have to learn it, set it up, and help others use it quickly, he said. “With the rise of AI and machine learning and the uncertainty around it, being asked to come up to speed with it and how to best utilize it so quickly, all while having to support your other numerous IT tasks, is exhausting,” he added. “On top of this, the constant fear of layoffs in the job market forces IT workers to keep up with the latest technology trends in order to stay employable, which can negatively affect their quality of life.” ... “As IT has become the backbone of many businesses, that backbone is key to the business's operations and, in most cases, revenue,” he added. “That means it’s key to the business’s survival. IT teams now must be accessible 24 hours a day. In the face of a problem, they are expected to work 24 hours a day to resolve it. ...”


6 best operating systems for Raspberry Pi 5

Even though it has been nearly seven years since Microsoft debuted Windows on Arm, there has been a noticeable lack of ARM-powered laptops. The situation is even worse for SBCs like the Raspberry Pi, which aren’t even on Microsoft’s radar. Luckily, the talented team behind the WoR project managed to find a way to install Windows 11 on Raspberry Pi boards. ... Finally, we have the Raspberry Pi OS, which has been developed specifically for the RPi boards. Since its debut in 2012, the Raspberry Pi OS (formerly Raspbian) has become the operating system of choice for many RPi board users. Since it was hand-crafted for the Raspberry Pi SBCs, it’s faster than Ubuntu and light years ahead of Windows 11 in terms of performance. Moreover, most projects tend to favor Raspberry Pi OS over the alternatives. So, it’s possible to run into compatibility and stability issues if you attempt to use any other operating system when replicating projects created by the lively Raspberry Pi community. You won’t be disappointed with the Raspberry Pi OS if you prefer a more minimalist UI. That said, despite including pretty much everything you need to make the most of your RPi SBC, the Raspberry Pi OS isn't as user-friendly as Ubuntu.


Speaking without vocal cords, thanks to a new AI-assisted wearable device

The breakthrough is the latest in Chen's efforts to help those with disabilities. His team previously developed a wearable glove capable of translating American Sign Language into English speech in real time to help users of ASL communicate with those who don't know how to sign. The tiny new patch-like device is made up of two components. One, a self-powered sensing component, detects and converts signals generated by muscle movements into high-fidelity, analyzable electrical signals; these electrical signals are then translated into speech signals using a machine-learning algorithm. The other, an actuation component, turns those speech signals into the desired voice expression. The two components each contain two layers: a layer of biocompatible silicone compound polydimethylsiloxane, or PDMS, with elastic properties, and a magnetic induction layer made of copper induction coils. Sandwiched between the two components is a fifth layer containing PDMS mixed with micromagnets, which generates a magnetic field. Utilizing a soft magnetoelastic sensing mechanism developed by Chen's team in 2021, the device is capable of detecting changes in the magnetic field caused by mechanical forces—in this case, the movement of laryngeal muscles.


We can’t close the digital divide alone, says Cisco HR head as she discusses growth initiatives

At Cisco, we follow a strengths-based approach to learning and development, wherein our quarterly development discussions extend beyond performance evaluations to uplifting ourselves and our teams. We understand that a one-size-fits-all approach is inadequate. To best play to our employees' strengths, we have to be flexible, adaptable, and open to what works best for each individual and team. This helps us understand individual employees' unique learning needs, so we can tailor personalised programs that encompass diverse learning options such as online courses, workshops, mentoring, and gamified experiences, catering to different learning styles. As a result, our employees are energized to pursue their passions, contributing their best selves to the workplace. Measuring the quality of work, internal movements, employee retention, patents, and innovation, along with engagement pulse assessments, allows us to gauge the effectiveness of our programs. When it comes to addressing the challenge of retaining talent, it's essential for HR leaders to consider a holistic approach.


Vector databases: Shiny object syndrome and the case of a missing unicorn

What’s up with vector databases, anyway? They’re all about information retrieval, but let’s be real, that’s nothing new, even though it may feel like it with all the hype around it. We’ve got SQL databases, NoSQL databases, full-text search apps and vector libraries already tackling that job. Sure, vector databases offer semantic retrieval, which is great, but SQL databases like SingleStore and Postgres (with the pgvector extension) can handle semantic retrieval too, all while providing standard DB features like ACID. Full-text search applications like Apache Solr, Elasticsearch and OpenSearch also rock the vector search scene, along with search products like Coveo, and bring some serious text-processing capabilities for hybrid searching. But here’s the thing about vector databases: They’re kind of stuck in the middle. ... It wasn’t that early either — Weaviate, Vespa and Milvus were already around with their vector DB offerings, and Elasticsearch, OpenSearch and Solr were ready around the same time. When technology isn’t your differentiator, opt for hype. Pinecone’s $100 million Series B funding was led by Andreessen Horowitz, which in many ways is living by the playbook it created for the boom times in tech.
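
To make the pgvector point concrete, here is a minimal sketch of semantic retrieval from plain Java over JDBC. It assumes a Postgres instance with the pgvector extension enabled and a hypothetical documents table with an embedding column; the connection details, table, and query vector are made up for illustration.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class PgVectorSearch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/demo", "demo", "demo")) {
            // Nearest-neighbour search: <-> is pgvector's Euclidean distance
            // operator, so smaller distance means more semantically similar.
            PreparedStatement ps = conn.prepareStatement(
                "SELECT id, content FROM documents " +
                "ORDER BY embedding <-> ?::vector LIMIT 5");
            ps.setString(1, "[0.12, 0.87, 0.33]"); // query embedding as a vector literal
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getLong("id") + ": " + rs.getString("content"));
                }
            }
        }
    }
}
```

Because this runs inside an ordinary transaction, the same connection can mix vector similarity with joins, filters, and ACID guarantees, which is exactly the "stuck in the middle" pressure the article describes.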


The Role of Quantum Computing in Data Science

Despite its potential, the transition to quantum computing presents several significant challenges to overcome. Quantum computers are highly sensitive to their environment, with qubit states easily disturbed by external influences – a problem known as quantum decoherence. This sensitivity requires that quantum computers be kept in highly controlled conditions, which can be expensive and technologically demanding. Moreover, concerns about the future cost implications of quantum computing on software and services are emerging. Ultimately, the prices will be sky-high, and we might be forced to search for AWS alternatives, especially if they raise their prices due to the introduction of quantum features, as is the case with Microsoft banking everything on AI. This raises the question of how quantum computing will alter the prices and features of both consumer and enterprise software and services, further highlighting the need for a careful balance between innovation and accessibility. There’s also a steep learning curve for data scientists to adapt to quantum computing.


AI-Driven API and Microservice Architecture Design for Cloud

Implementing AI-based continuous optimization for APIs and microservices in Azure involves using artificial intelligence to dynamically improve performance, efficiency, and user experience over time. Here's how you can achieve continuous optimization with AI in Azure:

Performance monitoring: Implement AI-powered monitoring tools to continuously track key performance metrics such as response times, error rates, and resource utilization for APIs and microservices in real time.

Automated tuning: Utilize machine learning algorithms to analyze performance data and automatically adjust configuration settings, such as resource allocation, caching strategies, or database queries, to optimize performance.

Dynamic scaling: Leverage AI-driven scaling mechanisms to adjust the number of instances hosting APIs and microservices based on real-time demand and predicted workload trends, ensuring efficient resource allocation and responsiveness (a minimal sketch of such a scaling loop follows below).

Cost optimization: Use AI algorithms to analyze cost patterns and resource utilization data to identify opportunities for cost savings, such as optimizing resource allocation, implementing serverless architectures, or leveraging reserved instances.
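
As a concrete illustration of the dynamic-scaling item above, here is a minimal feedback-loop sketch in Java. The MetricsClient and ScaleController interfaces are hypothetical stand-ins for whatever monitoring and scaling APIs your platform exposes (in Azure that role would be played by Azure Monitor and autoscale settings); the thresholds and method names are invented, and a real system would feed the metrics to a learned model rather than fixed rules.

```java
public class ContinuousOptimizer {
    // Hypothetical interfaces standing in for real monitoring/scaling APIs.
    interface MetricsClient {
        double avgCpuPercent(String service);
        double p95LatencyMs(String service);
    }
    interface ScaleController {
        int instanceCount(String service);
        void setInstanceCount(String service, int n);
    }

    private final MetricsClient metrics;
    private final ScaleController scaler;

    ContinuousOptimizer(MetricsClient metrics, ScaleController scaler) {
        this.metrics = metrics;
        this.scaler = scaler;
    }

    // One iteration of the continuous-optimization loop for one service.
    void reconcile(String service) {
        double cpu = metrics.avgCpuPercent(service);
        double latency = metrics.p95LatencyMs(service);
        int current = scaler.instanceCount(service);
        // Naive policy: scale out under pressure, scale in when idle.
        if (cpu > 75 || latency > 500) {
            scaler.setInstanceCount(service, current + 1);
        } else if (cpu < 20 && current > 1) {
            scaler.setInstanceCount(service, current - 1);
        }
    }
}
```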


4 ways AI is contributing to bias in the workplace

Generative AI tools are often used to screen and rank candidates, create resumes and cover letters, and summarize several files simultaneously. But AIs are only as good as the data they're trained on. GPT-3.5 was trained on massive amounts of widely available information online, including books, articles, and social media. This online data inevitably reflects societal inequities and historical biases, which the AI bot inherits and replicates to some degree. No one using AI should assume these tools are inherently objective just because they're trained on large amounts of data from different sources. While generative AI bots can be useful, we should not underestimate the risk of bias in an automated hiring process -- a reality that is crucial for recruiters, HR professionals, and managers to grasp. Another study found racial bias is present in facial-recognition technologies that show lower accuracy rates for dark-skinned individuals. Something as simple as data for demographic distributions in ZIP codes being used to train AI models, for example, can result in decisions that disproportionately affect people from certain racial backgrounds.



Quote for the day:

"The most common way people give up their power is by thinking they don't have any." -- Alice Walker

Daily Tech Digest - June 27, 2022

Collaboration Is a Key Skill. So Why Aren’t We Teaching It?

Beyond an organization’s bottom line, positive workplace relationships matter to individuals’ well-being. Whether respondents’ relationships with their most liked, least liked, or most influential collaborators were being rated, the quality of their collaborative relationships positively predicted job satisfaction, good mental health, and positive attitudes about workplace collaboration. Having even one low-quality collaborative relationship may drive undesirable outcomes, including poor mental health that contributes to burnout, and job dissatisfaction that contributes to turnover. Given that collaborative relationship quality is important both to individuals and to bottom lines, why don’t organizations provide more opportunities for people to develop collaborative skills? It could be that companies do, in fact, make development opportunities available but that individuals fail to see those opportunities as either available or related to collaboration. Or it could be that such offerings are precluded by underlying assumptions that people pick up relationship skills via osmosis rather than direct training, that they are just naturally “good” or “not good” at relationships, or that these skills cannot be learned.


The Best Raspberry Pi 4 Alternatives

The Tinkerboard’s processor is more powerful than the one you’ll find in the Pi 4 B, so you may be able to get even more ambitious with your builds. However, when they’re available, you can get Pi 4s with up to 8 GB of RAM, which is more than the 2 GB that the Tinkerboard offers. Then there is the price. You can pick up a Tinkerboard S R2.0 on Amazon for $149.99 — which is more than some of the inflated Pi 4s are currently selling for. In short, this is a good option if you need more processing power or you can’t find a Pi 4, even at a premium. ... The Linux-powered ODROID XU4Q benefits from “Samsung Exynos5422 Cortex-A15 2Ghz and Cortex-A7 Octa core CPUs” along with 2GB of DDR3 RAM. On paper, this potentially makes the XU4Q the most powerful micro-computer on this list. It also comes with a very large heatsink attached, presumably to soak up some of the heat from its relatively powerful processor. With regards to ports, ODROID has managed to cram two USB 3.0, one USB 2.0, a Gigabit Ethernet, and an HDMI port onto the tiny board.


Google’s AI spotlights a human cognitive glitch: mistaking fluent speech for fluent thought

The human brain is hardwired to infer intentions behind words. Every time you engage in conversation, your mind automatically constructs a mental model of your conversation partner. You then use the words they say to fill in the model with that person’s goals, feelings and beliefs. The process of jumping from words to the mental model is seamless, getting triggered every time you receive a fully fledged sentence. This cognitive process saves you a lot of time and effort in everyday life, greatly facilitating your social interactions. However, in the case of AI systems, it misfires – building a mental model out of thin air. A little more probing can reveal the severity of this misfire. Consider the following prompt: “Peanut butter and feathers taste great together because___”. GPT-3 continued: “Peanut butter and feathers taste great together because they both have a nutty flavor. Peanut butter is also smooth and creamy, which helps to offset the feather’s texture.” The text in this case is as fluent as our example with pineapples, but this time the model is saying something decidedly less sensible.


Cloudflare's outage was human error. There's a way to make tech divinely forgive

What's the lesson? It's not news that people make mistakes, and the more baroque things become, the harder they are to guard against. It's just that what gets advertised on BGP isn't just routes but things crapping out, and when you're Cloudflare that's what the C in CDN becomes. It's not the first time it's happened, nor the last, and one trusts the company will hire a choreographer to prevent further op-on-op stompfests. Yet if it happens, and keeps happening, why aren't systems more resilient to this sort of problem? You can argue that highly dynamic and structurally fluid routing mechanisms can't be algorithmically or procedurally safeguarded, and we're always going to live in the zone where the benefits of pushing just a bit too hard for performance is worth the occasional chaotic hour. That's defeatist talk, soldier. There's another way to protect against the unexpected misfire, other than predicting or excluding. You'll be using it already in different guises, some of which have been around since the dawn of computer time: state snapshotting. 
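
A minimal sketch of the snapshotting idea in Java, assuming configuration can be expressed as a key-value map and that a health check can detect a misfire; in a real network control plane both would be far richer, but the shape of the safeguard is the same: capture the last-known-good state before every change, and restore it automatically when the change misbehaves.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

public class SnapshotGuard {
    public interface HealthCheck { boolean healthy(); }

    private final Deque<Map<String, String>> snapshots = new ArrayDeque<>();
    private Map<String, String> liveConfig = new HashMap<>();

    public void apply(Map<String, String> newConfig, HealthCheck check) {
        snapshots.push(new HashMap<>(liveConfig)); // snapshot before changing anything
        liveConfig = newConfig;
        if (!check.healthy()) {
            liveConfig = snapshots.pop(); // automatic rollback on misfire
        }
    }
}
```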


Stronger Security for Smart Devices To Efficiently Protect Against Powerful Hacker Attacks

Researchers are racing against hackers to develop stronger protections that keep data safe from malicious agents who would steal information by eavesdropping on smart devices. Much of the effort into preventing these “side-channel attacks” has focused on the vulnerability of digital processors. Hackers, for example, can measure the electric current drawn by a smartwatch’s CPU and use it to reconstruct secret data being processed, such as a password. MIT researchers recently published a paper in the IEEE Journal of Solid-State Circuits, which demonstrated that analog-to-digital converters in smart devices, which encode real-world signals from sensors into digital values that can be processed computationally, are vulnerable to power side-channel attacks. A hacker could measure the power supply current of the analog-to-digital converter and use machine learning algorithms to accurately reconstruct output data. Now, in two new research papers, engineers show that analog-to-digital converters are also susceptible to a stealthier form of side-channel attack, and describe techniques that effectively block both attacks.


What is AI governance?

It can be helpful to break apart the governance of AI algorithms into layers. At the lowest level, closest to the process, are the rules governing which humans have control over the training, retraining and deployment. The issues of accessibility and accountability are largely practical and implemented to prevent unknowns from changing the algorithm or its training set, perhaps maliciously. At the next level, there are questions about the enterprise that is running the AI algorithm. The corporate hierarchy that controls all actions of the corporation is naturally part of the AI governance because the curators of the AI fall into the normal reporting structure. Some companies are setting up special committees to consider the ethical, legal and political aspects of governing the AI. Each entity also exists as part of a larger society. Many societal rule-making bodies are turning their attention to AI algorithms. Some are simply industry-wide coalitions or committees. Some are local or national governments and others are nongovernmental organizations. All of these groups are often talking about passing laws or creating rules for how AI can be leashed.


Continuous Operations is the Unsung Hero of DevOps

For continuous operations to be successful, you must have infrastructure automation in place. In fact, continuous operations cannot exist without infrastructure automation. The true value that arises from the combination of infrastructure automation and continuous operations is that it gives back IT operations teams their time so they can focus on more complex reasoning or problem-solving tasks while the system simply continuously scans and fixes errors. ... The very essence of DevOps is constant change. Continuous operations may ultimately return your infrastructure to its desired state, but philosophically, it’s about being able to quickly and securely identify anomalies, apply fixes and modify your infrastructure as quickly as possible. It’s not as simple as flipping a switch or pushing a line of code. As the demand for security and compliance swells, continuous operations will have to build in these elements to be de facto checkboxes in the loop. At Puppet, we’ve baked continuous compliance and security into our infrastructure automation products to ensure continuous operations are indeed continuous. 
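
A minimal sketch of that scan-and-fix loop, assuming desired and actual state can both be expressed as key-value maps; real infrastructure automation tools such as Puppet operate on much richer resource models, but the reconciliation principle is the same.

```java
import java.util.Map;

public class DriftReconciler {
    // Compare actual state to desired state and repair any drift found.
    public void reconcile(Map<String, String> desired, Map<String, String> actual) {
        for (Map.Entry<String, String> want : desired.entrySet()) {
            String have = actual.get(want.getKey());
            if (!want.getValue().equals(have)) {
                // Drift detected: restore the resource to its desired state
                // without a human in the loop.
                actual.put(want.getKey(), want.getValue());
                System.out.println("Repaired drift on " + want.getKey());
            }
        }
    }
}
```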


OT security: Helping under-resourced critical infrastructure organizations

The biggest problem in OT security is the cultural divide between IT and OT. IT security is a mature field, with standards, frameworks, and an abundance of mature and emerging technologies. The OT security field is much less mature, lacking people with OT security experience, established best practices and frameworks, and with a much smaller selection of security technologies. Historically, IT and OT have worked independently on security, with OT engineers overseeing security in the OT environment where it was not as critical due to lack of or limited connectivity to the internet and to the enterprise. Today, however, most OT environments are connected to the enterprise IT environment and to the internet. The benefits of Industry 4.0 and digital transformation in OT have accelerated connectivity in OT, including to cloud environments. The prevalence of converged IT/OT environments makes it imperative that IT and OT teams work together to secure them. The problem is that cultural divide. The good news is that it can be conquered, by bringing the two teams together to create an OT security strategy that is owned jointly by both teams.


Why to Create a More Data-Conscious Company Culture

A data culture creates standards for employee data literacy and provides open and transparent access to what assets exist, as well as standards for curation, quality, and certification so employees have a shared understanding of the data within an organization. “This will not resolve the silos, but it will create a transparent view of the entire enterprise data fabric,” Wills explains. He adds that some of the approaches Alation has seen work well include providing an enterprise-wide data literacy training and certification program, so that everyone shares the same perspective, vocabulary, and basic analytic skills. Each functional business unit and area should include data training as part of its employee onboarding, as it provides a review of an organization’s authoritative data and data-related assets and the processes used to maintain them, and sets expectations for how employees should participate. “Also, recognition: Nothing motivates more and sends a stronger message than employees seeing each other be recognized and rewarded for their contributions,” Wills says.


Valuing commercial real estate in the metaverse

One of the most obvious ways in which digital real estate diverges from its physical counterpart is in the limited utility that it provides. This, of course, is because digital products do not require storage, nor do the digital people who populate the metaverse need to be kept comfortable or warm in indoor venues. However, the sense of discovery in the search for goods and services remains genuine within the metaverse, and it is in this way that virtual utility provides the most value. As businesses are free to design their purchased real estate however they want to, they can dedicate their efforts to creating the most eye-catching and exciting facades that will entice users to discover more about their property – and ultimately the goods and services they have on offer. Therefore, it is not so much the utility of a piece of real estate that determines its valuation as its network power. For example, how easy is it to discover this real estate? How well connected is it? What is the purchase power of the people coming to the piece of real estate? In this sense, valuing real estate in the metaverse, I’d argue, is a lot more like valuing a website, i.e., how many clicks does it get?



Quote for the day:

"Leadership - mobilization toward a common goal." -- Gary Wills

Daily Tech Digest - June 16, 2022

High-Bandwidth Memory (HBM) delivers impressive performance gains

In addition to widening the bus in order to boost bandwidth, HBM technology shrinks down the size of the memory chips and stacks them in an elegant new design form. HBM chips are tiny when compared to graphics double data rate (GDDR) memory, which it was originally designed to replace. 1GB of GDDR memory chips take up 672 square millimeters versus just 35 square millimeters for 1GB of HBM. Rather than spreading out the transistors, HBM is stacked up to 12 layers high and connected with an interconnect technology called ‘through silicon via’ (TSV). The TSV runs through the layers of HBM chips like an elevator runs through a building, greatly reducing the amount of time data bits need to travel. With the HBM sitting on the substrate right next to the CPU or GPU, less power is required to move data between CPU/GPU and memory. The CPU and HBM talk directly to each other, eliminating the need for DIMM sticks. “The whole idea that [we] had was instead of going very narrow and very fast, go very wide and very slow,” Macri said.


3 forces shaping the evolution of ERP

If there was any hesitation about moving to cloud-based ERP, it was quashed as the COVID crisis erupted, and corporate workplaces became scattered across countless home-based offices. On-premises ERP is seen as “not as scalable as people thought,” says Sharon Bhalaru, partner at accounting and technology consulting firm Armanino LLP. “We’re seeing a move to cloud-based systems,” to support remote employees who need to perform HR, financial and accounting tasks remotely. ... Next-generation ERP platforms “give companies real-time transparency with respect to sales, inventory, production, and financials,” the Boston Consulting Group analysts wrote. “Powerful data-driven analytics enables more agile decisions, such as adjustments to the supply chain to improve resilience. Robust e-commerce capabilities help companies better engage with online customers before and after a sale. And a lean ERP core and cloud-first approach increase deployment speed.” ... Unprecedented and ongoing supply chain disruptions underscore the need for greater visibility, more predictable lead times, alternative supply sources, and faster response to disruptions.


Interpol arrests thousands in global cyber fraud crackdown

The operation’s targets included telephone scammers, long-distance romance scammers, email fraudsters and other connected financial criminals, identified through a prior intelligence operation using Interpol’s secure global comms network, sharing data on suspects, suspicious bank accounts, unlawful transactions, and communications means such as phone numbers, email addresses, fake websites and IP addresses. “Telecom and BEC fraud are sources of serious concern for many countries and have a hugely damaging effect on economies, businesses and communities,” said Rory Corcoran. “The international nature of these crimes can only be addressed successfully by law enforcement working together beyond borders, which is why Interpol is critical to providing police the world over with a coordinated tactical response.” Duan Daqi added: “The transnational and digital nature of different types of telecom and social engineering fraud continues to present grave challenges for local police authorities, because perpetrators operate from a different country or even continent than their victims and keep updating their fraud schemes.”


Is Cyber Essentials Enough to Secure Your Organisation?

If you are to have confidence in your security controls, you must implement defence in depth. This requires a holistic approach to cyber security that addresses people, processes and technology. Key aspects of this aren’t addressed in Cyber Essentials, such as staff awareness training, vulnerability scanning and incident response. Employees are at the heart of any cyber security system, because they are the ones responsible for handling sensitive information. If they don’t understand their data protection requirements, it could result in disaster. Meanwhile, vulnerability scanning ensures that organisations can spot weaknesses in their systems before a cyber criminal can exploit them. It’s a more advanced form of protection than is offered with secure configuration and system updates, enabling organisations to proactively secure their systems. Conversely, incident response measures give organisations the tools they need to respond after a security incident has occurred. Most of the damage caused by a data breach occurs after the initial intrusion, so a prompt and organised response can be the difference between a minor disruption and a catastrophe.


Imagining a world without open standards

The open standard makes portability easier for software developers, provides integrators with choice in the building blocks for solutions, and enables customers to focus on solving business problems rather than integration issues. Open standards eliminate the need for organizations to expend energy wrangling with competitors on defining how systems should work, giving them the space and time to focus on building and improving how those systems actually do work. The real benefits, though, are downstream of vendors: open standards mean that businesses can effectively communicate and collaborate both internally and with peers. They mean that the expertise built up by a professional in one market or business can be taken with them wherever they want to work. They mean that a lack of knowledge resources is not the barrier that prevents businesses from making the move towards better, more efficient ways of working. In imagining a world without open standards, then, the image is one of businesses constantly having to navigate between the walled gardens of different technology vendors, reskilling and rehiring as they do so, before they can even begin the serious work of delivering value from that technology.


Good Habits That Every Programmer Should Have

We can become good at a specific technology by working with it for a long time. But how can we become an expert in it? Learning internals is a great habit that helps us become an expert in any technology. For example, after working some time with Git, you can learn Git internals via the lesser-known plumbing commands. You can make accurate technical decisions when you understand the internals of your technology stack. When you learn internals, you will indeed become more familiar with the limitations and workarounds of a specific technology. Learning internals also helps us to understand what we are doing with programming every day. Encourage everyone to learn more about their tools’ internals! ... Sometimes, we derive programming solutions from example code snippets that we can find on internet forums. It’s a good habit to give credit to other programmers’ hard work when we use their code snippets, libraries, and tools, even though their licensing documents say that attribution is not required.


Reducing Cybersecurity Risk From and to Third Parties

There are a number of ways in which organizations may be able to obtain attack information from third parties, if they agree. Ideally, such requirements should be included in service agreements and partnership contracts for vendors, outsourcers, and partners, as listed in the article, “Using Contracts to Reduce Cybersecurity Risks.” Employment contracts, nondisclosure agreements and license agreements may also include requirements that protect organizations against third-party risk. While it is helpful to request vendors, outsourcers and partners to commit to risk reduction in the contractual terms and conditions, it is even more beneficial for an organization to have direct access to partners’ and suppliers’ security monitoring systems. ... More modern forms of protection monitor messages for origin and content and respond with information about unauthorized sources—as with IDSs—or preventive action—as with IPSs. Advancements in these systems include observation of unusual behavior and the use of artificial intelligence (AI) to determine threats.


How Upskilling Could Resolve The Cybersecurity Skills Gap

With a shortage of new candidates, upskilling provides the answer to the cybersecurity skills gap. And it brings multiple benefits for both employees and businesses. One of the first is that, ultimately, cybersecurity is everyone’s business. From the CEO to the new employee at home, everyone has a role to play in ensuring systems are robust in the face of a growing wave of attacks. While this does not mean that everyone in a company needs to be a cybersecurity professional, it does mean that everyone should be aware of the risks, how to spot potential vulnerabilities and attacks and the practical measures they must take to prevent them. However, it can also produce a supply of cybersecurity professionals. Waiting for qualified entrants to the jobs market will take too long and, in practice, it’s likely they will not be qualified for long! The cybersecurity environment changes so rapidly, the knowledge many graduates gain at the start of their course may not be relevant by the end. Instead, identifying existing staff with the soft skills, or power skills, to develop, adapt, and learn may be the quickest and easiest path to take.


12 tips for achieving IT agility in the digital era

“If your tech stack is streamlined, easy to access, and easy to use, your workforce can quickly respond to business or customer needs seamlessly,” says Fleetcor’s duFour. Key to this is getting a handle on application sprawl by rationalizing the IT portfolio. Voya Financial’s simplification journey began with such an effort, a process that reduced its application footprint by 17% and its slate of technology tools by one quarter. The work continues as part of its cloud migration work. “This practice is instilling standards and discipline that will only help to ensure our environment remains uncluttered and contemporary for the long term,” Keshavan says. As a result, the IT group is faster and more flexible, recently deploying five new cloud services for data science and analytics developers to use within four hours — something that would have taken a cross-functional IT team several weeks to deploy in the past. Reining in application sprawl has also been valuable at Snow Software. “Oftentimes, companies and teams will invest in applications with similar purposes,” says Snow Software CIO Alastair Pooley.


True Component-Testing of the GUI With Karate Mock Server

There’s an important reason why old-style end-to-end tests are often more expensive than needed: you tend to test paths that are not relevant to the frontend logic. Each of these adds to the total test suite run time. Consider a web application for your tax return. The user journey in this non-trivial app consists of submitting a series of questionnaires, their content customized depending on what you answered in previous steps. There is likely some logic on the frontend to manage the turns in that user journey, but the number-crunching over your sources of income and deductibles surely happens on the backend. You don’t need a GUI test to validate the correctness of those calculations. With a mock backend that would be entirely pointless. You set it up to tell the frontend that the final amount to pay is 12600 Euros. You can test that this amount is properly displayed, but there’s no testing its correctness. All the decisions are made (and hopefully tested) elsewhere, so we can treat it as a hardcoded test fixture.



Quote for the day:

"Leaders begin with a different question than others. Replacing who can I blame with how am I responsible?" -- Orrin Woodward

Daily Tech Digest - May 23, 2022

Clearview AI ordered to delete facial recognition data belonging to UK residents

The ICO said Clearview violated several tenets of UK data protection law, including failing to use data in a way that is “fair and transparent” (given that residents’ images were scraped without their knowledge or consent), “failing to have a lawful reason for collecting people’s information,” and “failing to have a process in place to stop the data being retained indefinitely.” However, although ICO has issued a fine against Clearview and ordered the company to delete UK data, it’s unclear how this might be enforced if Clearview has no business or customers in the country to sanction. In response to a similar deletion order and fine issued in Italy under EU law earlier this year, Clearview’s CEO Hoan Ton-That responded that the US-based company was simply not subject to EU legislation. ... In response to the same query, Lee Wolosky of Jenner and Block, Clearview’s legal representatives, told The Verge: “While we appreciate the ICO’s desire to reduce their monetary penalty on Clearview AI, we nevertheless stand by our position that the decision to impose any fine is incorrect as a matter of law ... ”


AI for Software Developers: a Future or a New Reality?

On the one hand, AI authors don’t copy anything into the algorithm. On the other hand, the neural network is incapable of independent thinking. All the code it produces is a combination of fragments it has seen during the learning phase. It may even create pieces of code that look like exact copies from the training dataset. The point is that even pieces that look independent are no more independent than the copies. The problem is pretty new, and we haven’t seen any court decisions yet. This uncertainty slows down the progress of product developers: people don’t want to make significant investments into something that might become illegal tomorrow. We faced the same issue when creating our code completion system. In addition to the potential legal limitations, there were technical difficulties as well. The code we can find in an open-source repository is in some sense “complete”. It usually compiles, passes simple tests, has clear formatting, doesn’t contain duplicate blocks or temporary debug sections. However, the code we have to work with in the editor is not “complete” most of the time.


What is JPA? Introduction to the Jakarta Persistence API

From a programming perspective, the ORM layer is an adapter layer: it adapts the language of object graphs to the language of SQL and relational tables. The ORM layer allows object-oriented developers to build software that persists data without ever leaving the object-oriented paradigm. When you use JPA, you create a map from the datastore to your application's data model objects. Instead of defining how objects are saved and retrieved, you define the mapping between objects and your database, then invoke JPA to persist them. If you're using a relational database, much of the actual connection between your application code and the database will then be handled by JDBC. As a specification, JPA provides metadata annotations, which you use to define the mapping between objects and the database. Each JPA implementation provides its own engine for JPA annotations. The JPA spec also provides the PersistenceManager or EntityManager, which are the key points of contact with the JPA system.
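
A minimal sketch of that mapping in Java, using the standard jakarta.persistence annotations; the entity class and the persistence unit name are made up for illustration, and a real project would also declare the unit in persistence.xml.

```java
import jakarta.persistence.Entity;
import jakarta.persistence.EntityManager;
import jakarta.persistence.EntityManagerFactory;
import jakarta.persistence.GeneratedValue;
import jakarta.persistence.Id;
import jakarta.persistence.Persistence;

@Entity
public class Musician {
    @Id
    @GeneratedValue
    private Long id;      // maps to the primary-key column
    private String name;  // maps to a "name" column by convention

    public void setName(String name) { this.name = name; }
}

// Persisting an instance without writing any SQL: the JPA provider
// generates the INSERT from the annotations above.
class Demo {
    public static void main(String[] args) {
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("demo-unit");
        EntityManager em = emf.createEntityManager();
        em.getTransaction().begin();
        Musician m = new Musician();
        m.setName("Nina Simone");
        em.persist(m);
        em.getTransaction().commit();
        em.close();
    }
}
```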


What’s so great about Google’s ‘translation glasses’?

Unlike Google Glass, the translation-glasses prototype is augmented reality (AR), too. Let me explain what I mean. Augmented reality happens when a device captures data from the world and, based on its recognition of what that data means, adds information to it that’s available to the user. Google Glass was not augmented reality — it was a heads-up display. The only contextual or environmental awareness it could deal with was location. Based on location, it could give turn-by-turn directions or location-based reminders. But it couldn’t normally harvest visual or audio data, then return to the user information about what they were seeing or hearing. Google’s translation glasses are, in fact, AR by essentially taking audio data from the environment and returning to the user a transcript of what’s being said in the language of choice. Audience members and the tech press reported on the translation function as the exclusive application for these glasses without any analytical or critical exploration, as far as I could tell. The most glaring fact that should have been mentioned in every report is that translation is just an arbitrary choice for processing audio data in the cloud.


Augmented reality, superhuman abilities and the future of medicine

With AR headsets and new techniques for registering 3D medical images to a patient’s real body, the superpower of x-ray vision is now a reality. In an impressive study from Teikyo University School of Medicine in Japan, an experimental emergency room was tested with the ability to capture whole-body CT scans of trauma patients and immediately allow the medical team, all wearing AR headsets, to peer into the patient on the exam table and see the trauma in the exact location where it resides. This allowed the team to discuss the injuries and plan treatment without needing to refer back and forth to flat screens, saving time, reducing distraction, and eliminating the need for mental transformations. In other words, AR technology takes medical images off the screen and places them in 3D space at the exact location where it’s most useful to doctors – perfectly aligned with the patient’s body. Such a capability is so natural and intuitive, that I predict it will be rapidly adopted across medical applications. In fact, I expect that in the early 2030s doctors will look back at the old way of doing things, glancing back and forth at flat screens, as awkward and primitive.


My Instagram account was hacked and two-factor authentication didn't help

It turns out the combination of the URL on the image and my reply gave them enough information to take over my account. Now, even when I saw trouble brewing -- an Instagram e-mail came asking me if I wanted to change my phone number to one in Nigeria -- I wasn't too worried. I'd protected my account with two-factor authentication (2FA). While 2FA isn't perfect, it's better than anything else out there for basic security. But, here's where things went awry. Instagram should have sent me an e-mail with a link asking me to "revert this change." Instagram didn't send such a message. Instead, I received e-mails from security@mail.instagram.com that provided a link about how to "secure your account." This dropped me into Instagram's pages for a hacked account, which wasn't any help. ... Argh! I followed up with Instagram's suggestions on how to bring my account back. I asked for a login link from my Android Instagram app. I got one, which didn't work. Next, I requested a security code. I got one. That didn't work either, no doubt because -- by that time -- the account was now responding to its "new" e-mail address and phone number.


What Gen Z and millennials want from employers

“The recurring theme with Gen Z — besides the compensation piece — is the focus on workplace flexibility and mental health. Those are two places we see a huge divergence from other generations,” Remley said. “If we’d talk to Boomers or Gen Xers concerning mental health benefits, they would say that’s my business and not my employer’s business. Whereas, Gen Z is wanting assistance with mental health from their employers.” Benefits ranked high in both surveys as reasons workers are drawn to and want to remain with an organization. At the top of the list: good mental healthcare and healthcare benefits in general. And employers do seem to be making progress when it comes to prioritizing mental health and well-being in the workplace, Deloitte reported. "More than half agree that workplace well-being and mental health has become more of a focus for their employers since the start of the pandemic. However, there are mixed reviews on whether the increased focus is actually having a positive impact," Deloitte's report stated.


Q&A: What CDW UK has planned for 2022

Innovating for sustainability will continue to be a key focus for us and our customers, and we are committed to finding new ways to help them on their journeys to net zero in any way we can. Not only does focusing on sustainability ensure business continuity by conserving resources but customers and employees want to buy from and work for companies that share their values. We believe sustainability is a shared responsibility and we want to set a strong example. Through our beGreen program, we provide coworkers with the platform to share ideas and take collective action to improve our environment. Areas of focus include coworker education, community awareness, recycling and resource conservation. The program is managed by a cross-functional team of coworkers from multiple CDW locations. This team collaborates internally and with members of the communities where we operate. Sustainability can no longer be a secondary consideration, which is why we’re also in the process of developing a global plan to make realistic, attainable and strong commitments to being a more sustainable organisation ourselves, while working with our partners and customers to do the same.


IDaaS explained: How it compares to IAM

IDaaS isn’t all sunshine and rainbows though, and organizations must account for some major considerations when evaluating it. If identity is truly the new perimeter, adopting IDaaS gives some level of control of your perimeter to an IDaaS service provider. This is similar to the shared responsibility model concept in cloud computing but extended further up the stack, from not just infrastructure to critical things such as identities, permissions, and access control. Some of the benefits cited in the above table can now potentially be a vice or point of contention depending on your organizational requirements and security sensitivity. Since you are consuming the application and system associated with IAM, you are now limited to the permissions the provider's offering includes and likely have limited ability to alter the way the offering functions. This is due to the reality that the IDaaS provider offers their interface/application to many customers and can only have so much customization without losing the ability to have a standardized offering.


Building a learning culture with AI

The first element is an AI model, which uses both internal and external data to assess competencies against a core skill set we are seeking to assess and develop; e.g., a full-stack engineer. It compares employees’ skill sets to someone in a similar role or title in the external marketplace on a scale of 1 to 5. We also pull in internal data sources, such as Jira and Workday, which contain information from their resumes, for example. That helps strengthen the accuracy and correlation of the model. The second element used to assess skill sets is an employee self-assessment. Employees receive the results of the AI model, and they validate whether they believe their skills are in line with the AI assessment. The final prong is the manager assessment, in which the manager rates the skills of that individual employee. This approach to assessing skill sets has been valuable for several reasons. First, it ensures the use of objective information in the evaluation process, reducing the influence of subjective views that managers may have, based on limited interactions with employees.



Quote for the day:

"One of the sad truths about leadership is that, the higher up the ladder you travel, the less you know." -- Margaret Heffernan

Daily Tech Digest - December 06, 2021

Why Qualcomm believes its new always-on camera for phones isn’t a security risk

Judd Heap, VP of Product Management at Qualcomm’s Camera, Computer Vision and Video departments, told TechRadar, “The always-on aspect is frankly going to scare some people so we wanted to do this responsibly. “The low power aspect where the camera is always looking for a face happens without ever leaving the Sensing Hub. All of the AI and the image processing is done in that block, and that data is not even exportable to DRAM. “We took great care to make sure that no-one can grab that data and so someone can’t watch you through your phone.” This means the data from the always-on camera won’t be usable by other apps on your phone or sent to the cloud. It should stick in this one area of the phone’s chipset - that’s what Heap is referring to as the Sensing Hub - for detecting your face. Heap continues, “We added this specific hardware to the Sensing Hub as we believe it’s the next step in the always-on body of functions that need to be on the chip. We’re already listening, so we thought the camera would be the next logical step.”


The HaloDoc Chaos Engineering Journey

The platform is composed of several microservices hosted across hybrid infrastructure elements, mainly on a managed Kubernetes cloud, with an intricately designed communication framework. We also leverage AWS cloud services such as RDS, Lambda and S3, and consume a significant suite of open source tooling, especially from the Cloud Native Computing Foundation landscape, to support the core services. As the architect and manager of site reliability engineering (SRE) at HaloDoc, ensuring smooth functioning of these services is my core responsibility. In this post, I’d like to provide a quick snapshot of why and how we use chaos engineering as one of the means to maintain resilience. While operating a platform of such scale and churn (newer services are onboarded quite frequently), one is bound to encounter some jittery situations. We had a few incidents with newly added services going down that, despite being immediately mitigated, caused concern for our team. In a system with the kind of dependencies we had, it was necessary to test and measure service availability across a host of failure scenarios.


Zero trust, cloud security pushing CISA to rethink its approach to cyber services

“When agencies hear the IG say something about how things are going with FISMA, they really pay attention. If we’re in a position to help influence that in a positive way, it’s absolutely critical that we do so,” he said. “We’ve got to pare down what we’re spending on IT and really focus on those things that matter. We have to adjust to a risk management approach in terms of how we apply architecture and capabilities across the enterprise to support the varying degrees of risk that we can absorb or manage within a given agency network. That’s like a huge part of what we need to continue to advocate for. But, to me, that is a significant element of the culture shift that needs to happen.” One way CISA is going to drive some of the culture and technology changes to help agencies achieve a zero trust environment is through the continuous diagnostics and mitigation program. CISA released a request for information for endpoint detection and response capabilities in October that vendors under the CDM program will implement for agencies.


DeFi’s Decentralization Is an Illusion: BIS Quarterly Review

“The decentralised nature of DeFi raises the question of how to implement any policy provisions,” the report said. “We argue that full decentralisation in DeFi is an illusion.” One element that could break this illusion is DeFi’s governance tokens, which are cryptocurrencies that represent voting power in decentralized systems, according to the report. Governance-token holders can influence a DeFi project by voting on proposals or changes to the governance system. These governing bodies are called decentralised autonomous organisations (DAOs), and each one can oversee multiple DeFi projects. “This element of centralisation can serve as the basis for recognising DeFi platforms as legal entities similar to corporations,” the report said. It gave an example of how DAOs can register as limited liability companies in the state of Wyoming. “These groups, and the governance protocols on which their interactions are based, are the natural entry points for policymakers,” the report said. During Monday’s briefing, Shin explained that there are three areas regulators could address through these centralized organizational bodies.


This New Ultra-Compact Camera Is The Size of a Grain of Salt And Takes Stunning Photos

Using a technology known as a metasurface, which is covered with 1.6 million cylindrical posts, the camera is able to capture full-color photos that are as good as images snapped by conventional lenses some half a million times bigger than this particular camera. And the super-small contraption has the potential to be helpful in a whole range of scenarios, from helping miniature soft robots explore the world, to giving experts a better idea of what's going on deep inside the human body. "It's been a challenge to design and configure these little microstructures to do what you want," says computer scientist Ethan Tseng from Princeton University in New Jersey. ... One of the camera's special tricks is the way it combines hardware with computational processing to improve the captured image: Signal processing algorithms use machine learning techniques to reduce blur and other distortions that otherwise occur with cameras this size. The camera effectively uses software to improve its vision.


Top Internet of Things (IoT) Trends for 2022: The Future of IoT

Hyperconnectivity and ultra-low latency are necessary to power successful IoT solutions. 5G is the connectivity that will make more widespread IoT access possible. Currently, cellular companies and other enterprises are working to make 5G technology available in their areas to support further IoT development. Bjorn Andersson, senior director of global IoT marketing at Hitachi Vantara, an IT service management and top-performing IoT company, explained why the next wave of wider 5G access will make all the difference for new IoT use cases and efficiencies. “With commercial 5G networks already live worldwide, the next wave of 5G expansion will allow organizations to digitalize with more mobility, flexibility, reliability, and security,” Andersson said. “Manufacturing plants today must often hardwire all their machines, as Wi-Fi lacks the necessary reliability, bandwidth, or security. “5G delivers the best of two worlds: the flexibility of wireless with the reliability, performance, and security of wires. 5G is creating a tipping point. 


Zero Trust: Time to Get Rid of Your VPN

OAuth and OpenID Connect (OIDC) are standards that enable a token-based architecture, a pattern that fits exceptionally well with a ZTA. In fact, you could argue that zero trust architecture is a token-based architecture. So, how does a token-based architecture work? First, it determines who the user is or what system or service is requesting access. Then, it issues an access token. The token itself will contain different claims, depending on the resource that is being requested as well as contextual information. The claims given in the token can, for example, be determined by a policy engine such as Open Policy Agent (OPA). A policy describes the allowed access and which claims are needed to access certain resources. In the context of the access request, the token service can issue a token with appropriate claims based on that defined policy. Resources that are being accessed need to verify the identity. In modern architectures, this is typically some type of API. When the request to the API is received, the API validates the access token sent with the request. 
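
A minimal sketch of the resource side in Java: the API extracts claims from the access token before serving the request. For brevity this only decodes the JWT payload with JDK classes and does a naive string check; a real service must first verify the token's signature against the issuer's keys (for example via JWKS) and check its expiry before trusting any claim, and should use a proper JSON parser.

```java
import java.util.Base64;

public class ClaimsCheck {
    // Returns true if the access token's payload appears to carry the
    // required scope claim. Signature and expiry checks are deliberately
    // omitted here but are mandatory in production.
    public static boolean allows(String jwt, String requiredScope) {
        String[] parts = jwt.split("\\.");
        if (parts.length != 3) return false; // expect header.payload.signature
        String payload = new String(Base64.getUrlDecoder().decode(parts[1]));
        return payload.contains("\"scope\"") && payload.contains(requiredScope);
    }
}
```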


Breaking Up a Monolithic Database with Kong

The RESTful API software style provides an easy manner for client applications to gain access to the resources (data) they need to meet business needs. In fact, it did not take long for Javascript-based frameworks like Angular, React, and Vue to rely on RESTful APIs and lead the market for web-based applications. This pattern of RESTful service APIs and frontend Javascript frameworks sparked a desire for many organizations to fund projects migrating away from monolithic or outdated applications. The RESTful API pattern also provided a much-needed boost in the technology economy which was still recovering from the impact of the Great Recession. ... My recommended approach is to isolate a given microservice with a dedicated database. This allows the count and size of the related components to match user demand while avoiding additional costs for elements that do not have the same levels of demand. Database administrators are quick to defend the single-database design by noting the benefits that constraints and relationships can provide when all of the elements of the application reside in a single database.


Securing identities for the digital supply chain

As the world becomes more connected, governing and securing digital certificates is a business essential. As certificate lifespans continue to shrink, enterprises need to deploy ever more of them into their digital infrastructure. With greater numbers of certificates entering an organisation's cyber space, there is more room for dangerous expirations to go unnoticed. From business-ending outages to crippling cyber attacks, the potential downside of managing this vital utility badly is huge. Unfortunately, digital certificates are still woefully mismanaged by businesses and governments worldwide. The volume of certificates being used to secure digital identities is growing exponentially, and businesses face new management challenges that can't be solved with legacy certificate automation models or outdated on-premises solutions. ... Today's digital-first enterprise requires a modern approach to managing this exponential growth of certificates, regardless of the issuing certificate authority (CA), and one built to work within today's complex zero trust IT infrastructure.
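
As a rough sketch of the kind of expiry monitoring a certificate-management platform automates at scale, the snippet below checks how many days remain on a host's TLS certificate; the host names and 30-day threshold are placeholders:

```python
# Rough sketch of certificate-expiry monitoring, the kind of check a
# management platform runs continuously across thousands of endpoints.
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(host: str, port: int = 443) -> int:
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), tz=timezone.utc
    )
    return (expires - datetime.now(timezone.utc)).days

# Placeholder inventory; a real platform discovers these automatically.
for host in ["example.com", "example.org"]:
    remaining = days_until_expiry(host)
    if remaining < 30:
        print(f"WARNING: {host} certificate expires in {remaining} days")
```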


Lightweight External Business Rules

Traditional rule engines that enable domain experts to author rule sets and behaviours outside the codebase are highly useful in a large, complex business landscape. But for smaller and less complex systems, they often turn out to be overkill and remain underutilised, given the recurring cost of the on-premises or cloud infrastructure they run on, licensing fees, and so on. For a small team, adding any component that requires an additional skill set is a waste of its bandwidth, and some of the commercial rule engines have steep learning curves. In this article, we illustrate how we succeeded in maintaining rules outside the source code of a medium-scale system built on a Java tech stack (Spring Boot), making it easier for other users to customise those rules. This approach suits a team that cannot afford a dedicated rule engine with its infrastructure, maintenance, and recurring costs, and whose domain experts have a software foundation or whose members wear multiple hats.
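
The system described runs on Spring Boot, but the pattern itself is language-agnostic. The sketch below, in Python for brevity, shows the core idea: rules live in a data file that domain experts can edit without redeploying code. The file name, fields, and discount logic are invented for illustration:

```python
# Minimal, language-agnostic sketch of externalized business rules:
# conditions live in a data file that domain experts can edit without
# touching application code. File name and fields are hypothetical.
import json
import operator

OPS = {">": operator.gt, ">=": operator.ge, "==": operator.eq,
       "<": operator.lt, "<=": operator.le}

def load_rules(path: str = "discount_rules.json") -> list:
    # e.g. [{"field": "order_total", "op": ">=", "value": 100,
    #        "discount": 0.1}]
    with open(path) as fh:
        return json.load(fh)

def applicable_discount(order: dict, rules: list) -> float:
    """Return the best discount whose condition the order satisfies."""
    best = 0.0
    for rule in rules:
        if OPS[rule["op"]](order[rule["field"]], rule["value"]):
            best = max(best, rule["discount"])
    return best
```

Changing a threshold then means editing the JSON file, not rebuilding and redeploying the application.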



Quote for the day:

"Coaching is unlocking a person's potential to maximize their own performance. It is helping them to learn rather than teaching them." -- John Whitmore

Daily Tech Digest - February 14, 2021

Outsmarting ML Biases: A Checklist

Machine learning algorithms relentlessly search for a solution. In the case of GANs, the generator and discriminator networks find ways to fool each other, and the result is a deepfake. Deepfakes are not harmless, but ML is also used in more critical industries such as healthcare, where a model trained on an underrepresented dataset raises the chances of misdiagnosis. “Each ML algorithm has a strategy to answer optimally to your question,” warned Luca. ... The differing definitions of fairness make things even more cumbersome for the data scientist. Citing the work on the impossibility of fairness, Luca also explained why some notions of fairness are mutually incompatible and cannot be satisfied simultaneously. “There is no single universal metric for quantifying fairness that can be applied to all ML problems,” he added. No matter how foolproof the data curation process is, loopholes might creep in. So, what are these loopholes? ... When it comes to ML fairness toolkits, Google's TensorFlow team has been at the forefront, developing multiple tools for niche areas within the fairness debate. The whole debate around ML fairness is forcing companies like Google to establish an ecosystem of fairer ML practice through their tools.
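
To see why such incompatibilities arise, it helps to pin down one concrete metric. The sketch below computes the demographic parity difference, the gap in positive-prediction rates between two groups, on toy data; it is one possible definition among many, not a universal test:

```python
# Minimal sketch of one fairness metric, demographic parity difference:
# the gap in positive-prediction rates between two groups. Data is toy.
import numpy as np

y_pred = np.array([1, 0, 1, 1, 0, 1, 0, 0])   # model decisions
group = np.array(["a", "a", "a", "a", "b", "b", "b", "b"])

rate_a = y_pred[group == "a"].mean()
rate_b = y_pred[group == "b"].mean()
print(f"demographic parity difference: {abs(rate_a - rate_b):.2f}")

# Other definitions (e.g. equalized odds, which also conditions on the
# true label) can disagree with this one, which is the incompatibility
# the impossibility-of-fairness results describe.
```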


Visual Studio Code comes to Raspberry Pi

There are already some great editors, but nothing of the calibre of VS Code. I can take my $35 computer, plug in a keyboard and mouse, connect a monitor or a TV, and code in a wide range of languages from the same place. I see kids learning Python at school using one tool, then learning web development in an after-school coding club with a different tool. They can now do both in the same application, reducing the cognitive load – they only have to learn one tool, one debugger, one setup. Combine this with the new Raspberry Pi 400 and you have an all-in-one solution for learning to code, reminiscent of my ZX Spectrum of decades ago, but so much more powerful. The second reason is to me the most important — it allows kids to share the same development environment as their grown-ups. Imagine the joy of a 10-year-old coding Python using VS Code on their Raspberry Pi plugged into the family TV, then seeing their Mum working from home coding Python in exactly the same tool on her work laptop as part of her job as an AI engineer or data scientist. It also makes it easier when Mum inevitably has to help unblock the issues that always come up for learners.


This new open source tool could improve data quality within the enterprise

While Soda SQL is more geared toward data engineers, Soda also offers a hosted service geared toward the business user and, specifically, the chief data officer (CDO). Interest in data testing and monitoring might start with the CDO when they recognize the need to ensure quality data feeding executive dashboards, machine learning models, and more. At the same time, data engineers, responsible for building data pipelines (transforming, extracting, and preparing data for usage), just need to do some minimal checks to ensure they're not shipping faulty data. Or, you might have a data platform engineer who just wants hands-off monitoring after connecting to the data platform warehouse. In this universe, data testing and data monitoring are two distinct things. In both cases, Baeyens said, "The large majority of people with which we speak have an uncomfortable feeling that they should be doing more with data validation, data testing, and monitoring, but they don't know where to start, or it's just kind of blurry for them." Soda is trying to democratize data monitoring, in particular, by making it easy for non-technical, business-oriented people to build the data monitors.
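
Soda SQL's own scan format isn't reproduced here, but the flavour of the checks it automates (row counts, null rates, and similar) can be sketched with plain SQL; the table, column, and 5% threshold below are invented for illustration:

```python
# Rough sketch of the kind of checks a data-quality tool automates;
# this is plain SQL via sqlite3, not Soda's API. Names are made up.
import sqlite3

def scan_table(conn: sqlite3.Connection, table: str, column: str) -> list:
    failures = []
    rows = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if rows == 0:
        failures.append(f"{table}: no rows shipped")
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {column} IS NULL"
    ).fetchone()[0]
    if rows and nulls / rows > 0.05:  # arbitrary 5% threshold
        failures.append(f"{table}.{column}: {nulls}/{rows} nulls")
    return failures

conn = sqlite3.connect("warehouse.db")  # hypothetical warehouse extract
for problem in scan_table(conn, "orders", "customer_id"):
    print("FAILED:", problem)
```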


Cybersecurity is still the #1 risk for manufacturers

We see lots of incidents, but there's no obligation for the owners and operators to disclose them. The incidents that reach the media are often just a small percentage of the incidents that actually occur; we know of many serious ones that you'll never read about in the headlines, and for good reason. So, what I would say is that cybersecurity is still a priority for many organizations. It's their number one risk, and it's something that they're dealing with every day. ... Ask the question, “What is the problem that I'd like to solve, as a result of implementing digital, that any other solution couldn't?” If you're already on that journey, I would be looking back, reviewing, and asking, “Does my digital solution so far answer the question? Is it solving the problem that I want it to solve?” In a recent study, we found that fewer than 20% of organizations have more than a third of their employees trained in digital and in the organization's digital strategy. Yet more than 60% of our customers have a digital strategy, so there's a mismatch: customers are heading out on the digital journey without really taking their employees with them.


Keeping control of data in the digital supply chain

While organisations will never have as much control over a supplier's security as they do their own, they can take steps to minimise risks. Security standards must be set out within service level agreements (SLAs), for instance, insisting that the third party holds ISO 27001 accreditation as a minimum and ensuring that the supplier has a framework of policies and procedures governing information risk management. Unfortunately, this approach is rare: the UK Government's Cyber Security Breaches Survey 2019 indicates that less than one in five businesses (18%) demanded that their suppliers have any form of cybersecurity standard or good-practice guidelines in place. The issue becomes more complicated still when the sheer scale and intricacy of the average supply chain network comes into play. A firm may have its data stolen from a company three or four connections deep in the supply chain. If the breached third party lacks the ability to detect an attack itself, the firm's data could be in the hands of criminals for months before anyone is finally alerted to the breach. And even if a security breach originates with a third party, it will carry just as much financial and reputational cost as a direct attack on the organisation's own network.


Metaethics, Meta-Intelligence And The Rise Of AI

The notion of ethics has evolved. Decisions about right and wrong have always depended on human cognition, guided by popular sentiment and socially acceptable norms. Now, with the rise of AI, machines are slowly taking over human cognitive functions, a phenomenon that author Ray Kurzweil predicts will increase over time and culminate in the advent of the singularity, where machines irrevocably overtake humans, possibly at some distant point in the future. This trend is causing technologists, researchers, policymakers and society at large to rethink how we interpret and implement ethics in the age of AI. ... To face the challenges of the future, we also need to develop a new discipline of meta-intelligence, taking inspiration from the concepts of metadata and metaethics. Doing so will help us improve the traceability and trustworthiness of AI-driven insights. The concept of meta-intelligence has been doing the rounds in thought leadership for the last few years, led especially by people thinking about and working on the singularity. The pace of technological evolution has quickened, AI has become essential to human progress, and businesses around the world are feeling the transformative power of these technologies.


Qualcomm's new X65 5G modem downloads data at lightning-fast 10Gbps speeds

With the X65, unveiled Tuesday, users will get a bump in speed but also see better battery life. Coverage will improve, latency will decrease, and applications will be even more responsive than they are with Qualcomm's earlier X60 modem technology. And capacity will be "massive," letting more people on a network make reliable, crisp video calls with their doctors and face off against rivals in streaming games. With the previous-generation X60 modem, just now arriving in smartphones like Samsung's Galaxy S21, you can download data over 5G networks at up to 7.5Gbps and upload at up to 3Gbps, only slightly faster than the generation before. But the X60 can also aggregate the slower but more reliable sub-6GHz networks with the faster but finicky millimeter-wave spectrum, boosting overall performance and helping users see faster average speeds; the X65 has the same benefit. While it's unlikely that you'll regularly -- or maybe even ever -- see 10Gbps download speeds, you'll consistently see speeds many times faster than your current 4G smartphone delivers.


Using NGINX to Serve .NET Core, Nodejs, or Static Contents

NGINX is a high-performance HTTP server as well as a reverse proxy. Unlike traditional servers, NGINX follows an event-driven, asynchronous architecture, so its memory footprint is low and its performance high. If you're running a Node.js-based web app or a .NET Core web application, you should seriously consider putting NGINX in front of it as a reverse proxy. NGINX can be very efficient at serving static assets; for all other requests, it talks to your Node.js back end or .NET Core application and sends the response to the client. ... Although the focus of this article is NGINX, we will also touch on a little bash, Node.js, and .NET Core. I have written about all of these topics on DZone, so check my other articles for background if needed. ... A reverse proxy server is a web server that accepts requests and sends them to another web server, which actually creates the responses; the responses travel back through the proxy, which forwards them to the clients that issued the corresponding requests. NGINX can act as a reverse proxy for ASP.NET Core applications and is also very good at serving static content.
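
As a minimal sketch of the setup described, the configuration below serves static files directly and proxies everything else to an app on localhost; the domain, paths, and port are placeholders rather than a recommended production config:

```nginx
# Minimal sketch of an NGINX reverse proxy: static assets served from
# disk, everything else forwarded to a Node.js or .NET Core app on
# localhost:5000. Domain, paths, and port are placeholders.
server {
    listen 80;
    server_name app.example.com;

    # Serve static files directly; NGINX is very efficient at this.
    location /static/ {
        root /var/www/myapp;
        expires 7d;
    }

    # Forward all other requests to the application back end.
    location / {
        proxy_pass         http://127.0.0.1:5000;
        proxy_http_version 1.1;
        proxy_set_header   Host $host;
        proxy_set_header   X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header   X-Forwarded-Proto $scheme;
    }
}
```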


To succeed in an AI world, students must learn the human traits of writing

AI cannot yet plan and does not have a purpose. Students need to hone skills in purposeful writing that achieves their communication goals. Unfortunately, the NAPLAN regime has hampered the teaching of writing as a process that involves planning and editing, because it favours time-limited, exam-style writing for no audience. Students need to practise writing in which they are invested, that they care about, and that they hope will effect change in the world and in their genuine, known readers. This is what machines cannot do. AI is not yet as complex as the human brain. Humans detect humour and satire; they know words can have multiple and subtle meanings. Humans are capable of perception and insight; they can make advanced evaluative judgements about good and bad writing. There are calls for humans to become experts in sophisticated forms of writing and in editing writing created by robots, as vital future skills. Nor does AI have a moral compass. It does not care. OpenAI's managers originally refused to release GPT-3, ostensibly because they were concerned about the generator being used to create fake material, such as product reviews or election-related commentary.


Living, and Breathing Data Governance, Security, and Regulations

A top-down approach to building data and analytics platforms, based on data governance best practices and policies, is often the default choice. This approach can provide a cohesive, robust solution that complies with privacy regulations and whose components interact well while adhering to strict security policies. Unfortunately, it can become cumbersome for users and slow time-to-value, with data consumers forced to adapt their data usage and consumption to the strict compliance- and security-driven protocols underpinning the platform. On the flip side, a bottom-up approach to data analytics is engineering- and design-focused, with the goal of introducing incremental deliverables that add value to the platform in response to users' needs. ... Whether top-down or bottom-up, it's critical for organizations to start by documenting privacy, security, data risks, controls, and technology needs around data access, addressing topics like a culture of federated data ownership, the adoption of self-service or collaboration across teams around critical data sets, and enterprise-wide technology standards for certain key areas.



Quote for the day:

“Believe in your infinite potential. Your only limitations are those you set upon yourself.” -- Roy T. Bennett