
Daily Tech Digest - July 01, 2024

The dangers of voice fraud: We can’t detect what we can’t see

The inherent imperfections in audio offer a veil of anonymity to voice manipulations. A slightly robotic tone or a static-laden voice message can easily be dismissed as a technical glitch rather than an attempt at fraud. This makes voice fraud not only effective but also remarkably insidious. Imagine receiving a phone call from a loved one’s number telling you they are in trouble and asking for help. The voice might sound a bit off, but you attribute this to the wind or a bad line. The emotional urgency of the call might compel you to act before you think to verify its authenticity. Herein lies the danger: Voice fraud preys on our readiness to ignore minor audio discrepancies, which are commonplace in everyday phone use. Video, on the other hand, provides visual cues. Small details like hairlines or facial expressions offer clear giveaways that even the most sophisticated fraudsters have not been able to slip past the human eye. On a voice call, those warnings are not available. That’s one reason most mobile operators, including T-Mobile, Verizon and others, make free services available to block — or at least identify and warn of — suspected scam calls.


Provider or partner? IT leaders rethink vendor relationships for value

Vendors achieve partner status in McDaniel’s eyes by consistently demonstrating accountability and integrity; getting ahead of potential issues to ensure there are no interruptions or problems with the provided products or services; and understanding his operations and objectives. ... McDaniel, other CIOs, and CIO consultants agree that IT leaders don’t need to cultivate partnerships with every vendor; many, if not most, can remain as straight-out suppliers, where the relationship is strictly transactional, fixed-fee, or fee-for-service based. That’s not to suggest those relationships can’t be chummy, but a good personal rapport between the IT team and the supplier’s team is not what partnership is about. A provider-turned-partner is one that gets to know the CIO’s vision and brings to the table ways to get there together, Bouryng says. ... As such, a true partner is also willing to say no to proposed work that could take the pair down an unproductive path. It’s a sign, Bouryng says, that the vendor is more interested in reaching a successful outcome than merely scheduling work to do.


In the AI era, data is gold. And these companies are striking it rich

AI vendors have, sometimes controversially, made deals with organizations like news publishers, social media companies, and photo banks to license data for building general-purpose AI models. But businesses can also benefit from using their own data to train and enhance AI to assist employees and customers. Examples of source material can include sales email threads, historical financial reports, geographic data, product images, legal documents, company web forum posts, and recordings of customer service calls. “The amount of knowledge—actionable information and content—that those sources contain, and the applications you can build on top of them, is really just mindboggling,” says Edo Liberty, founder and CEO of Pinecone, which builds vector database software. Vector databases store documents or other files as numeric representations that can be readily mathematically compared to one another. That’s used to quickly surface relevant material in searches, group together similar files, and feed recommendations of content or products based on past interests. 
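The vector-database mechanism Liberty describes can be sketched in a few lines: documents become numeric vectors, and relevance is a cosine comparison between them. A minimal illustration, with a toy bag-of-words counter standing in for a real embedding model (the document names and texts here are invented for the example):

```python
import math
from collections import Counter

def embed(text):
    # Toy stand-in for a real embedding model: bag-of-words counts.
    return Counter(text.lower().split())

def cosine(a, b):
    # Compare two sparse vectors by the angle between them.
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

docs = {
    "refund": "how to request a refund for an order",
    "ship": "tracking and shipping times for an order",
    "login": "reset a forgotten account password",
}
index = {name: embed(text) for name, text in docs.items()}

def search(query, k=2):
    # Surface the k most similar documents to the query.
    q = embed(query)
    return sorted(index, key=lambda name: cosine(q, index[name]), reverse=True)[:k]

print(search("where is my order shipping"))  # → ['ship', 'refund']
```

The same ranking step powers the grouping and recommendation uses the excerpt mentions; a production system simply swaps the toy counter for a learned embedding and an approximate-nearest-neighbor index.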


Machine Vision: The Key To Unleashing Automation's Full Potential

Machine vision is a class of technologies that process information from visual inputs such as images, documents, computer screens, videos and more. Its value in automation lies in its ability to capture and process documents, images and video at quantities and speeds far in excess of human capability. ... Machine vision-based technologies are even becoming central to the creation of automations themselves. For example, instead of relying on human workers to describe the processes being automated, designers can record the process and then use machine vision software, combined with other technologies, to capture it end-to-end and supply the input for automating much of the work needed to program the digital workers (bots). ... Machine vision is integral to maximizing the impact of advanced automation technologies on business operations and paving the way for increased capabilities in the automation space.


Put away your credit cards — soon you might be paying with your face

Biometric purchases using facial recognition are beginning to gain some traction. The restaurant CaliExpress by Flippy, a fully automated fast-food restaurant, is an early adopter. Whole Foods stores offer pay-by-palm, an alternative biometric to facial recognition. Given that they are already using biometrics, facial recognition is likely to be available in their stores at some point in the future. ... Just as credit and debit cards have overtaken cash as the dominant means to make purchases, biometrics like facial recognition could eventually become the dominant way to make purchases. There will, however, be real costs during such a transition, which will largely be absorbed by consumers through higher prices. The technology software and hardware required to implement such systems will be costly, pushing it out of reach for many small- and medium-size businesses. However, as facial recognition systems become more efficient and reliable, and losses from theft are reduced, an equilibrium will be achieved that will make such additional costs more modest and manageable to absorb.


Technologists must be ready to seize new opportunities

For technologists, this new dynamic represents a profound (and daunting) change. They’re being asked to report on application performance in a more business-focussed, strategic way and to engage in conversations around experience at a business level. They’re operating outside their comfort zone, far beyond the technical reporting and discussions they’ve previously encountered. Of course, technologists are used to rising to a challenge and pivoting to meet the changing needs of their organisations and their senior leaders. We saw this during the pandemic, and many will (rightly) be excited about the opportunity to expand their skills and knowledge, and to elevate their standing within their organisations. The challenge that many technologists face, however, is that they currently don’t have the tools and insights they need to operate in a strategic manner. Many don’t have full visibility across their hybrid environments and they’re struggling to manage and optimise application availability, performance and security in an effective and sustainable manner. They can’t easily detect issues, and even when they do, it is incredibly difficult to quickly understand root causes and dependencies in order to fix issues before they impact end user experience.


Vulnerability management empowered by AI

Using AI will take vulnerability management to the next level. AI not only reduces analysis time but also effectively identifies threats. ... AI-driven systems can identify patterns and anomalies that signify potential vulnerabilities or attacks. Converting the logs into data and charts will make analysis simpler and quicker. Incidents should be identified based on security risk, and notifications should be sent for immediate action. Self-learning is another area where AI can be trained with data. This will enable AI to stay up-to-date on the changing environment and capable of addressing new and emerging threats. AI will identify high-risk threats and previously unseen threats. Implementing AI requires iterations to train the model, which may be time-consuming. But over time, it becomes easier to identify threats and flaws. AI-driven platforms constantly gather insights from data, adjusting to shifting landscapes and emerging risks. As they progress, they enhance their precision and efficacy in pinpointing weaknesses and offering practical guidance.
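The anomaly detection described above can be illustrated with the simplest possible statistical baseline: flag observations whose log-event counts sit far from the mean. A z-score threshold here is only a stand-in for the learned models the excerpt has in mind, and the failed-login counts are invented:

```python
import statistics

def flag_anomalies(counts, threshold=2.0):
    # Flag indices more than `threshold` standard deviations from the mean.
    mean = statistics.mean(counts)
    stdev = statistics.pstdev(counts)
    if stdev == 0:
        return []
    return [i for i, c in enumerate(counts) if abs(c - mean) / stdev > threshold]

# Hourly counts of failed-login events from a log pipeline; hour 5 spikes.
hourly_failed_logins = [12, 9, 11, 10, 13, 250, 12, 11]
print(flag_anomalies(hourly_failed_logins))  # → [5]
```

A real system would replace the fixed threshold with a model retrained as new data arrives, which is the self-learning loop the excerpt describes.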


Why every company needs a DDoS response plan

Given the rising number of DDoS attacks each year and the reality that DDoS attacks are frequently used in more sophisticated hacking attempts to apply maximum pressure on victims, a DDoS response plan should be included in every company’s cybersecurity tool kit. After all, it’s not just a temporary lack of access to a website or application that is at risk. A business’s failure to withstand a DDoS attack and rapidly recover can result in loss of revenue, compliance failures, and impacts on brand reputation and public perception. Successful handling of a DDoS attack depends entirely on a company’s preparedness and execution of existing plans. Like any business continuity strategy, a DDoS response plan should be a living document that is tested and refined over the years. It should, at the highest level, consist of five stages: preparation, detection, classification, reaction, and postmortem reflection. Each phase informs the next, and the cycle improves with each iteration.
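The five-stage cycle can be sketched as a trivial state machine, which makes the "each phase informs the next" loop concrete (the stage names come from the excerpt; the code itself is illustrative):

```python
from enum import Enum

class Stage(Enum):
    PREPARATION = 1
    DETECTION = 2
    CLASSIFICATION = 3
    REACTION = 4
    POSTMORTEM = 5

def next_stage(stage):
    # Each phase informs the next; postmortem feeds back into preparation,
    # so the cycle improves with each iteration.
    order = list(Stage)
    return order[(order.index(stage) + 1) % len(order)]

stage = Stage.PREPARATION
for _ in range(5):
    stage = next_stage(stage)
print(stage)  # a full cycle returns to Stage.PREPARATION
```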


Reduce security risk with 3 edge-securing steps

Over the past several years web-based SSL VPNs have been targeted and used to gain remote access. Consider evaluating how your firm allows remote access and how often your VPN solution has been attacked or put at risk. ... “The severity of the vulnerabilities and the repeated exploitation of this type of vulnerability by actors means that NCSC recommends replacing solutions for secure remote access that use SSL/TLS with more secure alternatives,” the authority says. “The NCSC recommends internet protocol security (IPsec) with internet key exchange (IKEv2). Other countries’ authorities have recommended the same.” ... Pay extra attention to how credentials that need to be accessed are protected from unauthorized access. Use best-practice processes to secure passwords, and ensure that each user has appropriate passwords and access. ... When using cloud services, you need to ensure that only those vendors you trust or that you have thoroughly vetted have access to your cloud services.

The real key to machine learning success is something that is mostly missing from genAI: the constant tuning of the model. “In ML and AI engineering,” Shankar writes, “teams often expect too high of accuracy or alignment with their expectations from an AI application right after it’s launched, and often don’t build out the infrastructure to continually inspect data, incorporate new tests, and improve the end-to-end system.” It’s all the work that happens before and after the prompt, in other words, that delivers success. For genAI applications, partly because of how fast it is to get started, much of this discipline is lost. ... As with software development, where the hardest work isn’t coding but rather figuring out which code to write, the hardest thing in AI is figuring out how or if to apply AI. When simple rules need to yield to more complicated rules, Valdarrama suggests switching to a simple model. Note the continued stress on “simple.” As he says, “simplicity always wins” and should dictate decisions until more complicated models are absolutely necessary.



Quote for the day:

“The vision must be followed by the venture. It is not enough to stare up the steps - we must step up the stairs.” -- Vance Havner

Daily Tech Digest - December 21, 2022

The Cybersecurity Industry Doesn't Have a Stress Problem — It Has a Leadership Problem

Many of the cybersecurity issues raised in the CIISec survey point to a need for strong leadership that proactively identifies and resolves issues. But cybersecurity teams need servant leaders, not those who lead by establishing command and control structures. Servant leaders create authority by — you guessed it — serving their employees. Cybersecurity executives of this ilk are concerned about the well-being of the team, regularly checking in with team members on how they are doing, and removing roadblocks that harm operational performance. They'll go to bat with upper management to get an increased budget for new tools and additional staff to smooth out workloads for teams. Servant leaders take turns serving on call to understand work conditions from analysts' perspectives and hold regular team meetings to discuss key trends and issues. They're also likely to look ahead to anticipate market and business developments and reposition their organization to get ready to meet them. As a result, these leaders' teams feel supported. Analysts are not afraid to share problems or new ideas, as they know their leaders will listen, consider them carefully and, most importantly, respond.


Cybersecurity: What is Changing and What Isn’t

A lot of things have changed, but a lot remain the same. Adversaries have gotten smarter, so defense has had to do the same. Every piece of technology has a computer embedded in it nowadays – cars, fridges, thermostats, cameras, speakers, and of course, the ubiquitous mobile phones – resulting in a vastly increased attack surface, and the need for trained professionals to protect this Internet of Things (IoT). The general migration to the cloud has also encouraged the growth of professionals seeking to protect data outside the confines of on-prem systems. However, some core tenets still hold true – restricting user access, limiting system functionality, backing up critical data, planning for disruptions, and of course, security awareness training. Even the best of security controls can be overcome by a user clicking on the wrong link (phishing), visiting the wrong website (drive-by download), connecting to the wrong network (rogue access point), opening the wrong attachment (malicious macro), letting the wrong person into a secured area (tailgating), or simply disclosing the right information to the wrong person (vishing).


Intro to the Observable design pattern

The Observable design pattern is used in many important Java APIs. One well-known example is a JButton that uses the ActionListener API to execute an action. In this example, we have an ActionListener listening to, or observing, the button. When the button is clicked, the ActionListener performs an action. The Observable pattern is also used with reactive programming. The use of observers in reactive applications makes sense because the essence of reactive is reaction: something happens when another process occurs. Observable is a behavioral design pattern. Its function is to perform an action when an event happens. Two common examples are button clicks and notifications, but there are many more uses for this pattern. ... By using the Observable pattern, the notification happens only once for all of your subscribers. It's a huge performance gain as well as being an effective code optimization. This code can easily be extended or changed. The reactive programming paradigm uses the Observable pattern everywhere. If you have ever worked with Angular, you will know that using Observable components is very common.
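The excerpt's example is Java's JButton/ActionListener pair; the same subject-and-listeners structure can be sketched in a few lines of Python (the Button class and event string here are illustrative, not any real API):

```python
class Button:
    """Minimal subject: notifies every registered listener once per click."""

    def __init__(self):
        self._listeners = []

    def add_listener(self, listener):
        # Observers register themselves with the subject.
        self._listeners.append(listener)

    def click(self):
        # One event fans out to all subscribers, each notified exactly once.
        for listener in self._listeners:
            listener("clicked")

events = []
button = Button()
button.add_listener(lambda e: events.append(f"logger saw {e}"))
button.add_listener(lambda e: events.append(f"ui saw {e}"))
button.click()
print(events)  # → ['logger saw clicked', 'ui saw clicked']
```

This is the single fan-out the excerpt credits with the performance gain: the subject publishes once, regardless of how many observers are registered.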


How to Embed Gen Z in Your Organization’s Security Culture

Providing the most cutting-edge instruction will engage Gen Zers and provide them with meaningful security best practices for work and home. The threat landscape is more dangerous than it was when Gen Zers were coming of age. Current threats extend beyond traditional scams. They may be lurking in the unsecured WiFi available at a coffee shop. All the threat actor needs is someone desperate for free internet and tired of clicking checkboxes. With that ever-changing threat landscape in mind, your organization’s security program needs the resilience to adapt. The IBM Security X-Force Cyber Range provides a variety of experiences to prepare organizations for a cyber incident. The team can also cater content to different audiences, such as the C-suite or the board of directors. Gen Z may not be a part of those groups yet, but the X-Force Cyber Range offers a range of experiences for professionals at all levels. The X-Force Cyber Range team tailors immersive experiences to your organization’s industry and context to provide the most realistic scenario. 


Intelligence and Efficiency Will Guide Unstructured Data Management in 2023

Smarter edge data management will avoid overspending on storing extraneous data in cloud data lakes and warehouses by filtering and deleting non-valuable data at the edge first. Edge analytics tools will quickly process the data without the need to send large files back and forth to cloud or on-premises data centers, saving time and money. The right edge analytics and data management program can deliver real-time insights to improve customer experiences or detect issues quickly, such as a manufacturing defect or a ransomware breach. ... Storage and IT managers will need to prepare by getting full visibility into data across silos, understanding data characteristics and metadata to enable rapid classification and search, and then moving it into the optimal storage tier to feed the data lake and analytics platforms preferred by their end users. IT will need to work closely with stakeholders from security, legal, data governance, research, and data science teams, as well as business unit leaders, to fulfill the requirements of new, unstructured data analytics programs.


The FBI is worried about a wave of cyber crime against America’s small businesses

Small and medium-sized businesses face a big threat from cyberattacks and hackers, according to a special agent in the FBI’s cyber division. “The large businesses continue to invest in their cybersecurity and enhance their cybersecurity posture,” FBI Supervisory Special Agent Michael Sohn said at CNBC’s Small Business Playbook virtual event on Wednesday. “So what the cybercriminals are doing is they’re pivoting, they’re evolving and targeting the soft targets, which are the small and medium businesses.” In 2021, the FBI’s Internet Crime Complaint Center (IC3) received 847,376 complaints from the American public regarding cyberattacks and malicious cyber activity, a 7% year-over-year increase. In total, potential losses from those attacks exceeded $6.9 billion, a 64% increase over the previous year. “Unfortunately, the majority of those victims were small businesses,” Sohn told CNBC’s Frank Holland. But even as small businesses are increasingly being targeted by hackers and cyber criminals, CNBC and SurveyMonkey data has shown that most small business owners are not concerned.


Healthcare: Essential Defenses for Combating Ransomware

From a defensive standpoint, Siegel says organizations can employ a long list of tactics. Leading up to ransomware, the biggest weakness he sees is a cultural issue, centered on failing to take the risk seriously and make appropriate investments to prevent such incidents. "These are the times we live in, and it's just the cost of doing business," he says. "You have to make these investments." Ransomware attackers gain remote access to a victim's network and typically linger, studying the network and gaining greater access, before deploying crypto-locking malware. Thus, it's imperative to spot those activities before files start getting encrypted. "Most groups now will also want to steal large amounts of data before they launch the ransomware, and then they'll actually plan out how they're going to deploy the ransomware to all of your servers, all of your machines or whichever ones they choose," says Peter Mackenzie, director of incident response at Sophos. "That's not something that happens instantly. That can take days or weeks of preparation."


Engineering AI-Enabled Computer Vision Systems: Lessons From Manufacturing

While traditional non-AI software acts as a tool to execute preset rules, an AI-enabled system makes decisions based on (past) data and probabilistic outcomes, which constitutes a paradigm shift—especially within traditional manufacturing organizations. Therefore, proven software development approaches need to be extended to build and further evolve systems that contain ML components. One example is DevOps, which needs to be extended into DataOps or MLOps when developing AI solutions to meet specific requirements of handling the ever-changing data. Engineering AI-enabled computer vision systems goes beyond merely building AI algorithms. To build industrial solutions, these AI algorithms need to be embedded into grown-up software products, which also poses novel challenges for software engineers. To provide an overview of challenges and success factors in engineering AI-enabled computer vision systems, we analyzed corresponding manufacturing use cases, shadowed project meetings, and incorporated our own expertise.


IT Industry Outlook 2023: Trends Likely to Impact the Industry and Tech Pros

Employers are no longer restricted to hiring candidates who are within a commutable distance of local offices, giving job hunters an opportunity to apply for roles that may not have been open to them previously. “I believe with the continued prevalence of remote working, hiring decisions will become less based on culture fit and similar criteria, and more focused on skills and performance,” Finnigan says. “This will open the door to a much more globally diverse workforce, provided skills gaps continue to close.” ... Replacing early interview screenings with skills-based assessments that mimic a company's tech stack allows hiring managers to assess candidates’ compatibility quickly and accurately, moving only the best through the pipeline. “With this approach, hiring managers can spend more time with candidates who are truly qualified, which can lead to a more accurate decision and a faster time-to-hire,” Finnigan says. Westfall says that smaller organizations may be able to offer IT pros looking for a change of pace an assortment of unique perks, as well as a close-knit company culture and a greater impact on local communities.


APIs are placing your enterprise at risk

Stolen API keys are the culprit behind some of the largest cyberattacks to date. We see the headlines and we read the news stories, but we often fail to realize the broad consequences – particularly the notable impacts on enterprise mobile security. Consider the news earlier this year of 3,000+ mobile applications leaking Twitter’s API keys, meaning bad actors could compromise thousands of individual accounts and conduct a slew of nefarious activities. Imagine if this were your company with the roles reversed, and hundreds or even thousands of mobile applications were leaking the API keys to your corporate Gmail, Slack or OneDrive accounts. If this or a similar scenario were to happen, employee devices and sensitive company data would be at extreme risk. The recent push to focus on API security comes at a critical time when more enterprises are relying on enterprise mobility, meaning an increasing reliance on mobile app connectivity. A recent survey of US and UK-based security directors and mobile applications developers found that 74% of respondents felt mobile apps were critical to business success.



Quote for the day:

"Make heroes out of the employees who personify what you want to see in the organization." -- Anita Roddick

Daily Tech Digest - August 28, 2022

How to build a winning analytics team

Analytics teams thrive in dynamic environments that reward curiosity, encourage innovation, and set high expectations. Building and reinforcing this type of culture can help put organizations on a path to earning impressive returns from analytics investments. An active analytics culture thrives when CXOs reward curiosity over perfection. Encourage analysts to challenge convention and ask questions as a method to improve quality and reduce risks. This thinking goes hand in hand with a test-and-learn mentality, where pushing boundaries through proactive experimentation helps identify what works, and optimize accordingly. It’s also important to create a culture where failure and success are celebrated equally. Giving airtime to what went wrong allows the team to more effectively learn from their mistakes and see that perfection is an unhealthy pipe dream. This encourages an environment that holds analysts accountable for delivering quality processes and results, further helping to mitigate risk and improve marketing programs.


How SSE Renewables uses Azure Digital Twins for more than machines

This approach will allow SSE to experiment with reducing risks to migrating birds. For example, they can determine an optimum blade speed that will allow flocks to pass safely while still generating power. By understanding the environment around the turbines, it will be possible to control them more effectively and with significantly less environmental impact. Simon Turner, chief technology officer for data and AI at Avanade, described this approach as “an autonomic business.” Here, data and AI work together to deliver a system that is effectively self-operating, one he described as using AI to “look after certain things that you understood that could guide the system to make decisions on your behalf.” Key to this approach is extending the idea of a digital twin with machine learning and large-scale data. ... As Turner notes, this approach can be extended to more than wind farms, using it to model any complex system where adding new elements could have a significant effect, such as understanding how water catchment areas work or how hydroelectric systems can be tuned to let salmon pass unharmed on their way to traditional breeding grounds, while still generating power.


McKinsey report: Two AI trends top 2022 outlook

Roger Roberts, partner at McKinsey and one of the report’s coauthors, said of applied AI, which is defined “quite broadly” in the report, “We see things moving from advanced analytics towards… putting machine learning to work on large-scale datasets in service of solving a persistent problem in a novel way,” he said. That move is reflected in an explosion of publication around AI, not just because AI scientists are publishing more, but because people in a range of domains are using AI in their research and pushing the application of AI forward, he explained. ... According to the McKinsey report, industrializing machine learning (ML) “involves creating an interoperable stack of technical tools for automating ML and scaling up its use so that organizations can realize its full potential.” The report noted that McKinsey expects industrializing ML to spread as more companies seek to use AI for a growing number of applications. “It does encompass MLops, but it extends more fully to include the way to think of the technology stack that supports scaling, which can get down to innovations at the microprocessor level,” said Roberts. 


CISA: Prepare now for quantum computers, not when hackers use them

The main negative implication of quantum computing concerns the cryptography of secrets, a fundamental element of information security. Cryptographic schemes that are today considered secure will be cracked in mere seconds by quantum computers, leaving persons, companies, and entire countries powerless against the computing supremacy of their adversaries. “When quantum computers reach higher levels of computing power and speed, they will be capable of breaking public key cryptography, threatening the security of business transactions, secure communications, digital signatures, and customer information,” explains CISA. This could threaten data in transit relating to top-secret communications, banking operations, military operations, government meetings, critical industrial processes, and more. Yesterday, China's Baidu introduced “Qian Shi,” an industry-level quantum supercomputer capable of achieving stable performance at 10 quantum bits of power.


How Are Business Intelligence And Data Management Related?

Business intelligence (BI) describes the procedures and tools that assist in getting helpful information and intelligence that can be used from data. A company’s data is accessed by business intelligence tools, which then display analytics and insights as reports, dashboards, graphs, summaries, and charts. Business intelligence has advanced significantly from its theoretical inception in the 1950s, and you must realize that it is not just a tool for big businesses. Most BI providers are tailoring their software to users’ needs because they recognize that our current era is considerably more oriented toward small structures like start-ups. SaaS, or software-as-a-service, vendors especially reflect this shift. BI is also a more straightforward tool than it once was. It is still a professional tool; managing data is not simple, even with the most powerful technology. Nevertheless, with the emergence of the cloud and SaaS in the early 21st century, BI has developed into something more accessible than local software, which used to require installation on every computer in the organization and could represent a sizable expenditure.


Oxford scientist says greedy physicists have overhyped quantum computing

It’s unclear why Dr. Gourianov would leave big tech out of the argument entirely. There are dozens upon dozens of papers from Google and IBM alone demonstrating breakthrough after breakthrough in the field. Gourianov’s primary argument against quantum computing appears, inexplicably, to be that they won’t be very useful for cracking quantum-resistant encryption. With respect, that’s like saying we shouldn’t develop surgical scalpels because they’re practically useless against chain mail armor. Per Gourianov’s article: Shor’s algorithm has been a godsend to the quantum industry, leading to untold amounts of funding from government security agencies all over the world. However, the commonly forgotten caveat here is that there are many alternative cryptographic schemes that are not vulnerable to quantum computers. It would be far from impossible to simply replace these vulnerable schemes with so-called “quantum-secure” ones. This appears to suggest that Gourianov believes at least some physicists have pulled a bait-and-switch on governments and investors by convincing everyone that we need quantum computers for security.


Computer vision is primed for business value

In healthcare, computer vision is used extensively in diagnostics, such as in AI-powered image and video interpretation. It is also used to monitor patients for safety, and to improve healthcare operations, says Gartner analyst Tuong Nguyen. “The potential for computer vision is enormous,” he says. “It’s basically helping machines make sense of the world. The applications are infinite — really, anything you need to see. The entire world.” According to the fourth annual Optum survey on AI in healthcare, released at the end of 2021, 98% of healthcare organizations either already have an AI strategy or are planning to implement one, and 99% of healthcare leaders believe AI can be trusted for use in health care. Medical image interpretation was one of the top three areas cited by survey respondents where AI can be used to improve patient outcomes. The other two areas, virtual patient care and medical diagnosis, are also ripe for computer vision. Take, for example, idiopathic pulmonary fibrosis, a deadly lung disease that affects hundreds of thousands of people worldwide.


AI Therapy: Digital Solution to Address Mental Health Issues

AI for health has been a long-discussed topic, specifically on therapy, by bringing digital solutions to mental health issues. Some applications have already been deployed, such as Genie in a Headset, which manages human emotional behavior in work environments. But bringing AI into therapy means building an AI that feels and is keen to improve mental health issues. The fundamental objective of AI therapy is to assist patients in fighting mental illnesses. Ideally, this technology would be able to distinguish each patient's needs and personalize their mental health programs through an efficient data collection process. ... Psychological therapy is a tough job that requires extracting confidential information from patients that they hesitate to share. Like any other medical issue, it is essential to diagnose the problem before curing it. It requires exquisite skills to make someone comfortable. An AI therapist can access your cellphone, laptop, personal data, emails, all-day movement, and routine, making it more efficient in understanding you and your problems. Knowing problems in depth gives an AI therapist an advantage over the usual therapist.


What is the Microsoft Intelligent Data Platform?

The pieces that make up the Microsoft Intelligent Data Platform are services you may already be using, because it includes all of Microsoft’s key data services, such as SQL Server 2022, Azure SQL, Cosmos DB, Azure Synapse, Microsoft Purview and more. But you’re probably not using them together as well as you could; the Intelligent Data Platform is here to make that easier. “These are the best-in-class services across what we consider the three core pillars of a data platform,” Mansour explained. According to Mansour, the Microsoft Intelligent Data Platform offers services for databases and operational data stores, analytics, and data governance, providing authorized users with insight that allows them to properly understand, manage and govern their business’s data. “Historically, customers have been thinking about each of those areas independent from one another, and what the Intelligent Data Platform does is bring all these pieces together,” said Mansour. Integrating databases, analytics and governance isn’t new either, but the point of presenting this as a platform is the emphasis on simplifying the experience of working with it.


Threatening clouds: How can enterprises protect their public cloud data?

Public clouds don’t inherently impose security threats, said Gartner VP analyst Patrick Hevesi — in fact, hyperscale cloud providers usually have more security layers, people and processes in place than most organizations can afford in their own data centers. However, the biggest red flag for organizations when selecting a public cloud provider is a lack of visibility into its security measures, he said. Some of the biggest issues in recent memory: misconfigurations of cloud storage buckets, said Hevesi, which have opened files up to data exfiltration. Some cloud providers have also had outages due to misconfigurations of identity platforms, which prevented their cloud services from starting up properly and in turn affected tenants. Smaller cloud providers, meanwhile, have been taken offline by distributed denial-of-service (DDoS) attacks, in which perpetrators make a machine or network resource unavailable to its intended users by disrupting the services — either short-term or long-term — of a host connected to a network.



Quote for the day:

“Real integrity is doing the right thing, knowing that nobody’s going to know whether you did it or not.” -- Oprah Winfrey

Daily Tech Digest - August 27, 2022

Intel Hopes To Accelerate Data Center & Edge With A Slew Of Chips

McVeigh noted that Intel’s integrated accelerators will be complemented by the upcoming discrete GPUs. He called the Flex Series GPUs “HPC on the edge,” with their low power envelopes, and pointed to Ponte Vecchio – complete with 100 billion transistors in 47 chiplets that leverage both Intel 7 manufacturing processes as well as 5 nanometer and 7 nanometer processes from Taiwan Semiconductor Manufacturing Co – and then Rialto Bridge. Both Ponte Vecchio and Sapphire Rapids will be key components in Argonne National Laboratory’s Aurora exascale supercomputer, which is due to power on later this year and will deliver more than 2 exaflops of peak performance. ... “Another part of the value of the brand here is around the software unification across Xeon, where we leverage the massive amount of capabilities that are already established through decades throughout that ecosystem and bring that forward onto our GPU rapidly with oneAPI, really allowing for both the sharing of workloads across CPU and GPU effectively and to ramp the codes onto the GPU faster than if we were starting from scratch,” he said.


Performance isolation in a multi-tenant database environment

Our multi-tenant Postgres instances operate on bare metal servers in non-containerized environments. Each backend application service is considered a single tenant, and each may use one of multiple Postgres roles. Because each cluster serves multiple tenants, all tenants share and contend for available system resources such as CPU time, memory, and disk IO on each cluster machine, as well as finite database resources such as server-side Postgres connections and table locks. Each tenant has a unique workload that varies in system-level resource consumption, making it impossible to enforce throttling using a global value. This has become problematic in production, affecting neighboring tenants in two ways. Throughput: a tenant may issue a burst of transactions, starving other tenants of shared resources and degrading their performance. Latency: a single tenant may issue very long or expensive queries, often concurrently, such as large table scans for ETL extraction or queries that hold lengthy table locks. Both of these scenarios can result in degraded query execution for neighboring tenants: their transactions may hang or take significantly longer to execute, due either to reduced CPU share time or to slower disk IO caused by many seeks from the misbehaving tenant(s).
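One common way to enforce the per-tenant (rather than global) throttling the excerpt calls for is a token bucket per tenant. The sketch below is a generic illustration of that idea, not the actual mechanism used in the deployment described above; the tenant names and budget values are made up.

```python
import time

class TokenBucket:
    """Per-tenant rate limiter: each request spends a token from
    that tenant's own budget, which refills at a fixed rate."""
    def __init__(self, capacity, refill_per_sec):
        self.capacity = capacity
        self.tokens = capacity
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def try_acquire(self, cost=1.0):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False

buckets = {"tenant_a": TokenBucket(5, 1.0), "tenant_b": TokenBucket(5, 1.0)}

# A burst from tenant_a exhausts only its own budget; tenant_b is unaffected.
granted_a = sum(buckets["tenant_a"].try_acquire() for _ in range(10))
granted_b = buckets["tenant_b"].try_acquire()
print(granted_a, granted_b)
```

Because each tenant draws from an isolated budget, a misbehaving tenant's burst is rejected once its own tokens run out, instead of starving its neighbors.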


Quantum Encryption Is No More A Sci-Fi! Real-World Consequences Await

Quantum computing will enable enterprise customers to perform complex simulations in significantly less time than traditional software. Quantum algorithms are very challenging to develop, implement, and test on current quantum computers. Quantum techniques are also being used to improve the randomness of computer-based random number generators. The world’s leading scientists in the field of quantum information engineering are working to turn what was once in the realm of science fiction into reality. Businesses need to deploy next-generation data security solutions with equally powerful protection based on the laws of quantum physics, literally fighting quantum computers with quantum encryption. Quantum computers today are no longer considered science fiction. The main difference is that quantum encryption uses quantum bits, or qubits, encoded in optical photons, rather than electrical binary digits, or bits. Qubits can also be inextricably linked together through a phenomenon called quantum entanglement.
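To make the qubit-based approach concrete, here is a toy, noiseless simulation of the sifting step of BB84, the canonical quantum key distribution protocol. Real systems must additionally handle channel noise, eavesdropper detection, and privacy amplification; this sketch only shows how random bases let two parties distill a shared key.

```python
import random

random.seed(7)
n = 64
# Alice picks random bits and random measurement bases
# (0 = rectilinear, 1 = diagonal).
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.randint(0, 1) for _ in range(n)]
# Bob measures each photon in his own randomly chosen basis.
bob_bases = [random.randint(0, 1) for _ in range(n)]
# With no eavesdropper, Bob reads Alice's bit exactly when their bases
# match; otherwise his outcome is random.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
# Sifting: both publicly compare bases and keep only matching positions.
key_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
print(key_alice == key_bob, len(key_alice))
```

On average, half the positions survive sifting, yielding a shared secret whose bits were never transmitted directly; any interception disturbs the photons and shows up as mismatches during a later comparison step.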


What Is The Difference Between Computer Vision & Image processing?

We are constantly exposed to, and engage with, the visual world around us. Computer vision, a discipline of AI, uses machine learning techniques to enable machines to see, comprehend, and interpret that visual environment, extracting useful information from digital photos, videos, or other visual inputs by identifying patterns. Computer vision and image processing can look and feel similar, but they differ in a few ways. Computer vision aims to distinguish between, classify, and arrange images according to their distinguishing characteristics, such as size, color, etc., much as people perceive and interpret images. ... Digital image processing uses a digital computer to process digital and optical images. A computer views an image as a two-dimensional signal composed of pixels arranged in rows and columns. A digital image comprises a finite number of elements, each located in a specific place and holding a particular value. These elements are referred to as picture elements, image elements, pels, or pixels.
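The "two-dimensional signal of pixels" view lends itself to a minimal image-processing example: converting an RGB pixel grid to grayscale with the standard ITU-R BT.601 luma weights. This is a simplified sketch that uses plain Python lists in place of a real image library.

```python
# A digital image as a 2-D grid of (R, G, B) pixels: rows x columns.
image = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (255, 255, 255)],
]

def to_grayscale(img):
    """Classic image-processing step: map every pixel through the
    ITU-R BT.601 luma formula, producing a new 2-D array of intensities."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
            for row in img]

gray = to_grayscale(image)
print(gray)
```

Note the distinction the article draws: this transformation produces a new image from an image (processing), whereas computer vision would go further and assign meaning to it, for example by classifying what the pixels depict.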


Lessons in mismanagement

In the decades since the movie’s release, the world has become a different place in some important ways. Women are now everywhere in the world of business, which has changed irrevocably as a result. Unemployment is quite low in the United States and, by Continental standards, in Europe. Recent downturns have been greeted by large-scale stimuli from central banks, which have blunted the impact of stock market slides and even a pandemic. But it would be foolish to think that the horrendous managers and desperate salesmen of Glengarry Glen Ross exist only as historical artifacts. Mismanagement and desperation go hand in hand and are most apparent during hard times, which always come around sooner or later. By immersing us in the commercial and workplace culture of the past, movies such as Glengarry can help us understand our own business culture. But they can also help prepare us for hard times to come—and remind us how not to manage, no matter what the circumstances. ... Everyone, in every organization, has to perform. 


How the energy sector can mitigate rising cyber threats

As energy sector organisations continue expanding their connectivity to improve efficiency, they must ensure that the perimeters of their security processes keep up. Without properly secured infrastructure, no digital transformation will ever be successful, and both internal operations and the data of energy users are bound to become vulnerable. But by following the above recommendations, energy companies can go a long way toward keeping their infrastructure protected in the long run. This endeavour can be strengthened further by partnering with cyber security specialists like Dragos, which provides an all-in-one platform that enables real-time visualisation, protection and response against ever-present threats to the organisation. These capabilities, combined with threat intelligence insights and supporting services across the industrial control system (ICS) journey, are sure to provide peace of mind and added confidence in the organisation’s security strategy. For more information on Dragos’s research around cyber threat activity targeting the European energy sector, download the Dragos European Industrial Infrastructure Cyber Threat Perspective report here.


How to hire (and retain) Gen Z talent

The global pandemic has forever changed the way we work. The remote work model has been successful, and we’ve learned that productivity does not necessarily decrease when managers and their team members are not physically together. This has been a boon for Gen Z – a generation that grew up surrounded by technology. Creating an environment that gives IT employees the flexibility to conduct their work remotely has opened the door to a truly global workforce. Combined with the advances in digital technologies, we’ve seen a rapid and seamless transition in how employment is viewed. Digital transformation has leveled the playing field for many companies by changing requirements around where employees need to work. Innovative new technologies, from videoconferencing to IoT, have shifted the focus from an employee’s location to their ability. Because accessing information and managing vast computer networks can be done remotely, the location of workers has become a minor issue.


'Sliver' Emerges as Cobalt Strike Alternative for Malicious C2

Enterprise security teams, which over the years have honed their ability to detect the use of Cobalt Strike by adversaries, may also want to keep an eye out for "Sliver." It's an open source command-and-control (C2) framework that adversaries have increasingly begun integrating into their attack chains. "What we think is driving the trend is increased knowledge of Sliver within offensive security communities, coupled with the massive focus on Cobalt Strike [by defenders]," says Josh Hopkins, research lead at Team Cymru. "Defenders are now having more and more successes in detecting and mitigating against Cobalt Strike. So, the transition away from Cobalt Strike to frameworks like Sliver is to be expected," he says. Security researchers from Microsoft this week warned about observing nation-state actors, ransomware and extortion groups, and other threat actors using Sliver along with — or often as a replacement for — Cobalt Strike in various campaigns. Among them are DEV-0237, a financially motivated threat actor associated with the Ryuk, Conti, and Hive ransomware families, and several groups engaged in human-operated ransomware attacks, Microsoft said.


Data Management in the Era of Data Intensity

When your data is spread across multiple clouds and systems, it can introduce latency, performance, and quality problems. And bringing together data from different silos and getting those data sets to speak the same language is a time- and budget-intensive endeavor. Your existing data platforms also may prevent you from managing hybrid data processing, which, as Ventana Research explains, “enable[s] analysis of data in an operational data platform without impacting operational application performance or requiring data to be extracted to an external analytic data platform.” The firm adds that: “Hybrid data processing functionality is becoming increasingly attractive to aid the development of intelligent applications infused with personalization and artificial intelligence-driven recommendations.” Such applications are clearly important because they can be key business differentiators and enable you to disrupt a sector. However, if you are grappling with siloed systems and data and legacy technology that is unable to ingest high volumes of complex data fast so that you can act in the moment, you may believe that it is impossible for your business to benefit from the data synergies that you and your customers might otherwise enjoy.


How to Achieve Data Quality in the Cloud

Everybody knows data quality is essential, and most companies spend significant money and resources trying to improve it. Despite these investments, however, companies lose between $9.7 million and $14.2 million annually because of bad data. Traditional data quality programs do not work well for identifying data errors in cloud environments. Most organizations only look at the data risks they know, which is likely only the tip of the iceberg: data quality programs usually focus on completeness, integrity, duplicates and range checks, but these checks represent only 30 to 40 percent of all data risks. Many data quality teams do not check for data drift, anomalies or inconsistencies across sources, which contribute over 50 percent of data risks. Meanwhile, the number of data sources, processes and applications has exploded because of the rapid adoption of cloud technology, big data applications and analytics. These data assets and processes require careful data quality control to prevent errors in downstream processes, and a data engineering team can add hundreds of new data assets to the system in a short period.
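A minimal sketch of the kind of drift check the excerpt says most programs skip: a batch can pass a 0–200 range check while its distribution has clearly shifted away from the baseline. The data and the 3-sigma threshold below are illustrative.

```python
import statistics

def drift_score(baseline, batch):
    """How many baseline standard deviations the new batch mean has moved.
    Catches distributional drift that completeness/range checks miss."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return abs(statistics.mean(batch) - mu) / sigma

baseline = [100, 102, 98, 101, 99, 100, 103, 97]
stable   = [101, 99, 100, 102]
drifted  = [140, 138, 142, 141]   # every value passes a 0-200 range check

print(drift_score(baseline, stable))   # small: same distribution
print(drift_score(baseline, drifted))  # large: drifted, despite valid ranges
```

In practice such a check would run per column on each new batch of a pipeline, alerting when the score exceeds a tuned threshold rather than waiting for downstream consumers to notice bad numbers.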



Quote for the day:

"Problem-solving leaders have one thing in common: a faith that there's always a better way." -- Gerald M. Weinberg

Daily Tech Digest - August 14, 2022

Identity crisis: Artificial intelligence and the flawed logic of ‘mind uploading’

We can think of the copy as a digital clone or twin, but it would not be you. It would be a mental copy of you, including all of your memories up to the moment your brain was scanned. But from that time on, the copy would generate its own memories inside whatever simulated world it was installed in. It might interact with other simulated people, learning new things and having new experiences. Or maybe it would interact with the physical world through robotic interfaces. At the same time, the biological you would be generating new memories and skills and knowledge. In other words, your biological mind and your digital copy would immediately begin to diverge. They would be identical for one instant and then grow apart. Your skills and abilities would diverge. Your knowledge and understanding would diverge. Your personality and objectives would diverge. After a few years, there would be significant differences. And yet, both versions would “feel like the real you.” This is a critical point – the copy would have the same feelings of individuality that you have. 


It’s Time to Normalize Cyberattack Data

The hope is that as an open standard, it will be adopted and used with existing security standards and processes. Then, as developers and users incorporate OCSF into their products and processes, security data normalization will become simpler and less burdensome. This, in turn, will enable security teams to do better at analyzing attack data, identifying threats, and defending their organizations from cyberattacks. Ultimately, John Graham-Cumming, Cloudflare’s CTO, said in a statement, “Every business deserves a simple, straightforward way to analyze and understand the security landscape — and that starts with their data. By participating in the OCSF, we hope to help the entire security industry focus on doing the work that matters instead of wasting countless hours and resources on formatting data.” I hope this is true. I hate wasting time. And time is one thing we never have enough of when we’re dealing with a security problem. If OCSF can succeed in its aims, it will be a major step forward in dealing with large-scale security problems.
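The kind of normalization OCSF aims to standardize can be illustrated with a toy mapper: two vendors report the same login failure under different field names, and a shared schema makes the events directly comparable. The field names below are purely hypothetical and do not reflect the actual OCSF attribute names or event classes.

```python
# Toy normalizer: map vendor-specific field names onto one common shape.
# (Illustrative only; the real OCSF schema defines its own classes/attributes.)

def normalize(vendor, event):
    mappings = {
        "vendor_a": {"user": "uname",   "src_ip": "source",      "time": "ts"},
        "vendor_b": {"user": "account", "src_ip": "client_addr", "time": "event_time"},
    }
    m = mappings[vendor]
    return {common: event[raw] for common, raw in m.items()}

a = normalize("vendor_a",
              {"uname": "alice", "source": "10.0.0.5", "ts": 1660500000})
b = normalize("vendor_b",
              {"account": "alice", "client_addr": "10.0.0.5", "event_time": 1660500000})
print(a == b)
```

Once both events share one shape, a single query or detection rule covers every vendor's telemetry, which is exactly the formatting work the standard is meant to eliminate.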


3 Expert-Backed Strategies for Boosting Your Entrepreneurial Energy

Entrepreneurs are a special breed of overthinkers. We're constantly making decisions, so we have to think fast on our feet. But we also must take the time to weigh our options out properly. And so we think up all possible scenarios: the good, the bad and the ugly. This used to be one of my biggest hurdles when starting. What if this client falls through? What if users aren't satisfied with our product? What if we can't attract enough attention and be sustainable? What will I do? My mind was my biggest enemy. Consequently, after a long night of tossing and turning, I'd wake up unmotivated to start the day. Here's the thing I've learned since: energy thrives on confidence. And confidence only comes when you believe in your abilities. As co-authors Linda Bloom, L.C.S.W., and Charlie Bloom, M.S.W., write in Psychology Today, "Self-trust is not trusting yourself to know all the answers, nor is it believing that you will always do the right things," they explain. "It's having the conviction that you will be kind and respectful to yourself regardless of the outcome of your efforts."


4 Flaws, Other Weaknesses Undermine Cisco ASA Firewalls

"If you have access to the virtual machine, you have full access inside the network, but more importantly, you can sniff all the traffic going through, including decrypted VPN traffic," Baines says. "So, it is a really great place for an attacker to chill out and pivot, but probably just sniff for credentials or monitor the traffic flowing into the network." Baines discovered the issue when he was investigating the Cisco ASDM to get "a level set on how the GUI (graphical user interface) works" and pull apart the protocol, he says. A component installed on administrators' systems, known as the ASDM launcher, could be used by attackers to deliver malicious code in Java class files or through the ASDM Web portal. As a result, attackers could create a malicious ASDM package to compromise the administrator's system through installers, malicious Web pages, and malicious Java components. The ASDM vulnerabilities discovered by Rapid7 include a known vulnerability (CVE-2021-1585) that allows an unauthenticated remote code execution (RCE) attack, which Cisco claimed was patched in a recent update, but Baines discovered it remained.


A Shift in Computer Vision Is Coming

Is computer vision about to reinvent itself, again? Ryad Benosman, professor of ophthalmology at the University of Pittsburgh and an adjunct professor at the CMU Robotics Institute, believes that it is. As one of the founding fathers of event-based vision technologies, Benosman expects that neuromorphic vision — computer vision based on event-based cameras — will be the next direction computer vision will take. “Computer vision has been reinvented many, many times,” Benosman said. “I’ve seen it reinvented twice at least, from scratch, from zero.” Benosman cited the shift in the 1990s from image processing with a bit of photogrammetry to a geometry-based approach and then to today’s rapid advance toward machine learning. Despite those changes, modern computer-vision technologies are still predominantly based on image sensors — cameras that produce an image similar to what the human eye sees. According to Benosman, until the image-sensing paradigm is no longer useful, it holds back innovation in alternative technologies. The development of high-performance processors such as GPUs delays the need to look for alternative solutions and has thus prolonged this effect.
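The contrast between frame-based and event-based sensing can be sketched with a rough frame-difference emulation of an event camera. Real event cameras operate asynchronously per pixel on log-brightness changes rather than on discrete frames, so this is only a conceptual approximation with made-up pixel values.

```python
# A frame camera outputs full images at a fixed rate; an event camera
# instead emits sparse (x, y, polarity) events only where brightness
# changes past a threshold.

def frames_to_events(prev, curr, threshold=10):
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) >= threshold:
                events.append((x, y, +1 if c > p else -1))
    return events

prev = [[50, 50, 50],
        [50, 50, 50]]
curr = [[50, 90, 50],
        [50, 50, 20]]   # one pixel brightened, one darkened

print(frames_to_events(prev, curr))
```

Where a frame pipeline would re-transmit all six pixels, the event representation carries just two tuples, which is why event-based sensors promise far lower bandwidth and latency for mostly static scenes.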


What’s the Go programming language really good for?

Go has been compared to scripting languages like Python in its ability to satisfy many common programming needs. Some of this functionality is built into the language itself, such as “goroutines” for concurrency and threadlike behavior, while additional capabilities are available in Go standard library packages, like Go’s http package. Like Python, Go provides automatic memory management capabilities including garbage collection. Unlike scripting languages such as Python, Go code compiles to a fast-running native binary. And unlike C or C++, Go compiles extremely fast—fast enough to make working with Go feel more like working with a scripting language than a compiled language. Further, the Go build system is less complex than those of other compiled languages. It takes few steps and little bookkeeping to build and run a Go project. ... Go binaries run more slowly than their C counterparts, but the difference in speed is negligible for most applications. Go performance is as good as C for the vast majority of work, and generally much faster than other languages known for speed of development.


Ex-CIA security boss predicts coming crackdown on spyware

Protecting individuals' privacy is something all of us — including elected officials — should be very concerned about, Mestrovich said. "I would expect, going forward, there will be either executive orders or legislation passed to ensure that the civil liberties and the rights that we all expect to data privacy and privacy of our own activities are kept sacrosanct," he added. As a CISO himself, ransomware is top of mind. "Ransomware is a huge threat to just our economic viability," Mestrovich told us, citing a Cybersecurity Ventures forecast that global cybercrime costs will grow by 15 percent per year over the next five years, reaching $10.5 trillion annually by 2025. "Clearly, the cyber criminals have monetized the theft of data or depriving an organization use of its data," Mestrovich said. "Until we can do something to prevent the economic gain that they have from the theft of data or the denial of an organization's access to its data, this is only going to increase."


Urgent security warning issued as hackers shift ransomware attacks to small businesses

The Director of the NCSC, Richard Browne, said that in the past these groups typically focussed on larger organisations, but they have now shifted focus to smaller entities. “We have been dealing with the threat of ransomware for some time; however, we have seen a noticeable change in the tactics of criminal ransomware groups, whereby rather than largely focussing on Governments, critical infrastructure and big business, they are increasingly targeting smaller businesses. “This is a trend that has been observed globally, and Ireland is no exception with several businesses becoming victims of these groups in the past number of weeks,” he said. Browne said the letter sent to IBEC by the NCSC and GNCCB outlines guidance for small companies on how they can deal with an attack. “Whilst we appreciate that many business owners are understandably nervous of the threat ransomware poses, there are some straightforward security measures that can be put in place to ensure that an organisation’s data and systems remain secure,” he added.


Computer Vision and Deep Learning for Agriculture

AI applications can analyze weather and soil conditions, water usage, and the risk of disease to help farmers reduce the risk of crop failure, providing valuable insights such as the right time to sow seeds and the right crop and seed choices. Detecting plant diseases, weeds, and pests beforehand can reduce the use of chemicals like herbicides and pesticides and bring cost savings. Many companies have started using robots that can eliminate 80% of the volume of the substances generally sprayed on crops and bring down expenditure on herbicides by 90%. Further, the use of AI in harvesting, picking, and vacuum apparatus can quickly identify the location of harvestable produce and help determine the proper fruits; the strawberry harvest is a classic example. ... With satellite imagery and weather data, AI applications can analyze market trends, such as which crops are in demand and which are more profitable. This helps farmers increase their revenue by guiding them about future price patterns, demand levels, the type of crop to sow for maximum benefit, pesticide usage, etc.


Rethinking Web Application Firewalls

The vulnerabilities are now so numerous, and cloud native applications have such large attack surfaces, that there is no way to mitigate them using traditional means, Tiperneni explained. “It’s no longer sufficient to throw out a report that tells you about all the vulnerabilities in your system,” Tiperneni said. “Because that report is not actionable. People operating the services are discovering that the amount of time and effort it takes to remediate all these vulnerabilities is incredible, right? So they’re looking for some level of prioritization in terms of where to start.” And the onus is on the user to mitigate the problem, Tiperneni said. Those customers have to think about the blast radius of the vulnerability and its context in the system. The second part is how to manage the attack surface. In this world of cloud native applications, customers are discovering very quickly that trying to protect every single thing, when everything has access to everything else, is an almost impossible task, Tiperneni said.



Quote for the day:

"The Leadership Seduction of storytelling invites self-pity, exaggerates one's importance, and encourages inaction." -- Catherine Robinson-Walker

Daily Tech Digest - April 29, 2022

Scrumfall: When Agile becomes Waterfall by another name

Agile is supposed to be centered on people, not processes — on people collaborating closely to solve problems together in a culture of autonomy and mutual respect, a sustainable culture that values the health, growth, and satisfaction of every individual. There is a faith embedded in the manifesto that this approach to software engineering is both necessary and superior to older models, such as Waterfall. Necessary because of the inherent complexity and indeterminacy of software engineering. Superior because it leverages the full collaborative might of everyone’s intelligence. But this is secondary to Agile’s most fundamental idea: We value people. It’s a rare employer today who doesn’t pay lip service to that idea. “We value our people.” But many businesses instead prioritize controlling their commodity human resources. This now being unacceptable to say out loud — in software engineering circles as in much of modern America — many companies have dressed it up in Scrum’s clothing, claiming Agile ideology while reasserting Waterfall’s hierarchical micromanagement.


Nerd Cells, ‘Super-Calculating’ Network in the Human Brain Discovered

After five years of research into the theory of the continuous attractor network, or CAN, Charlotte Boccara and her group of scientists at the Institute of Basic Medical Sciences at the University of Oslo, now at the Center for Molecular Medicine Norway (NCMM), have made a breakthrough. “We are the first to clearly establish that the human brain actually contains such ‘nerd cells’ or ‘super-calculators’ put forward by the CAN theory. We found nerve cells that code for speed, position and direction all at once,” says Boccara. ... The CAN theory hypothesizes that a hidden layer of nerve cells performs complex math and compiles vast amounts of information about speed, position and direction, just as NASA’s scientists do when they are adjusting a rocket trajectory. “Previously, the existence of the hidden layer was only a theory for which no clear proof existed. Now we have succeeded in finding robust evidence for the actual existence of such a brain ‘nerd center,’” says the researcher, filling in a piece of the puzzle that was missing.


Data Center Sustainability Using Digital Twins And Seagate Data Center Sustainability

Rozmanith said that Dassault’s digital twin data center construction simulation reduced time to market by 15%. He also said that the modular approach reduces design time by 20%. Their overall goal is to shorten data center stand-up time by 50% and reduce the waste commonly generated in data center construction. Even after construction, digital twins for the operation of a data center will be useful for evaluating and planning future upgrades and data center changes. Some data center companies, such as Apple, have designed their data centers to be 100% sustainable for several years. Seagate recently announced that it would power its global footprint with 100% renewable energy by 2030 and achieve carbon neutrality by 2040. These goals were announced in conjunction with the release of the company’s 16th Global Citizenship Annual Report. That report included a look at the company’s annual progress towards meeting emission reduction targets, product stewardship, talent enablement, diversity goals, labor standards, fair trade, supply chain, and more.


Industry 4.0 – why smart manufacturing is moving closer to the edge

With Industry 4.0, new technologies are being built into the factory to drive increased automation. This all leads to potentially smart factories that can, for instance, benefit from predictive maintenance, as well as improved quality assurance and worker safety. At the same time, existing data challenges can be overcome. Companies operating across multiple locations often struggle to remove data silos and bring IT and OT (operational technology) together. An edge based on an open hybrid infrastructure can help them do this, as well as solving other problems. These problems include reducing latency by supporting a horizontal data framework across the organization's entire IT infrastructure, instead of relying on data being funneled through a centralized network that can cause bottlenecks. Edge computing aligned to open hybrid cloud services can also reduce the amount of mismatched and inefficient hardware that has gradually built up, and which is often located in tight remote spaces.


Digital twins: The art of the possible in product development and beyond

Digital twins are increasingly being used to improve future product generations. An electric-vehicle (EV) manufacturer, for example, uses live data from more than 80 sensors to track energy consumption under different driving regimes and in varying weather conditions. Analysis of that data allows it to upgrade its vehicle control software, with some updates introduced into new vehicles and others delivered over the air to existing customers. Developers of autonomous-driving systems, meanwhile, are increasingly developing their technology in virtual environments. The training and validation of algorithms in a simulated environment is safer and cheaper than real-world tests. Moreover, the ability to run numerous simulations in parallel has accelerated the testing process by more than 10,000 times. ... The adoption of digital twins is currently gaining momentum across industries, as companies aim to reap the benefits of various types of digital twins. Given the many different shapes and forms of digital twins, and the different starting points of each organization, a clear strategy is needed to help prioritize where to focus digital-twin development and what steps to take to capture the most value.


What Is Cloud-Native?

Cloud-native, according to most definitions, is an approach to software design, implementation, and deployment that aims to take full advantage of cloud-based services and delivery models. Cloud-native applications also typically operate using a distributed architecture. That means that application functionality is broken into multiple services, which are then spread across a hosting environment instead of being consolidated on a single server. Somewhat confusingly, cloud-native applications don't necessarily run in the cloud. It's possible to build an application according to cloud-native principles and deploy it on-premises using a platform such as Kubernetes, which mimics the distributed, service-based delivery model of cloud environments. Nonetheless, most cloud-native applications run in the cloud. And any application designed according to cloud-native principles is certainly capable of running in the cloud. ... Cloud-native is a high-level concept rather than a specific type of application architecture, design, or delivery process. Thus, there are multiple ways to create cloud-native software and a variety of tools that can help do it.


Predictive Analytics Could Very Well Be The Future Of Cybersecurity

Predictive analytics is gaining momentum in every industry, enabling organizations to streamline the way they do business. This branch of advanced analytics uses data, statistical algorithms, and machine learning to forecast future outcomes. When it comes to data breaches, predictive analytics is making waves. Enterprises with a limited security staff can stay safe from intricate attacks. Predictive analytics shows where threat actors attacked in the past, which helps anticipate where they will strike next. Good security starts with knowing which attacks to expect. The conventional approach to fighting cybercrime is to collect data about malware, data breaches, phishing campaigns, and so on, and to extract signatures from that data. A signature is a unique arrangement of information that identifies a cybercriminal's attempt to exploit a vulnerability in an operating system or application. Signatures can be compared against the files, network traffic, and email that flow in and out of the network to detect abnormalities. Everyone has distinct usage habits that technology can learn.
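The signature-matching approach described above, plus a behavioural check of the kind that learned usage habits enable, can be sketched as follows. The "known-bad" entries and the scoring rule are toy stand-ins for a real threat feed and a trained usage model:

```python
import hashlib

# Toy threat feed: known-bad payloads reduced to SHA-256 "signatures".
KNOWN_BAD = {
    hashlib.sha256(b"malicious-dropper-v1").hexdigest(),
    hashlib.sha256(b"phishing-kit-2024").hexdigest(),
}

def matches_signature(artifact: bytes) -> bool:
    """Compare an inbound file/message against the signature set."""
    return hashlib.sha256(artifact).hexdigest() in KNOWN_BAD

def anomaly_score(login_hour, usual_hours):
    """Crude behavioural stand-in: how far a login deviates from a
    user's historical pattern (0.0 = typical, ~1.0 = opposite time of day)."""
    mean = sum(usual_hours) / len(usual_hours)
    return abs(login_hour - mean) / 12.0
```

Signatures catch attacks that have been seen before; behavioural scores are what let a system flag the novel ones — which is where the predictive part comes in.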


A Shift in Computer Vision is Coming

Neuromorphic technologies are those inspired by biological systems, including the ultimate computer: the brain and its compute elements, the neurons. The problem is that no one fully understands exactly how neurons work. While we know that neurons act on incoming electrical signals called spikes, until relatively recently researchers characterized neurons as rather sloppy, thinking only the number of spikes mattered. That hypothesis persisted for decades. More recent work has proven that the timing of these spikes is absolutely critical, and that the architecture of the brain creates delays in these spikes to encode information. Today's spiking neural networks, which emulate the spike signals seen in the brain, are simplified versions of the real thing — often binary representations of spikes. "I receive a 1, I wake up, I compute, I sleep," Benosman explained. The reality is much more complex. When a spike arrives, the neuron starts integrating the value of the spike over time; there is also leakage from the neuron, meaning the result is dynamic. There are also around 50 different types of neurons with 50 different integration profiles.
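The integrate-and-leak behaviour described here can be captured with a classic leaky integrate-and-fire model. This is a minimal sketch (the weight, time constant, and threshold are illustrative values, not from the article): two spikes arriving close together push the membrane potential over threshold, while the same two spikes spread far apart leak away first — so timing, not just count, decides whether the neuron fires:

```python
def lif_fires(spike_times_ms, weight=0.6, tau_ms=10.0, threshold=1.0,
              dt_ms=1.0, horizon_ms=100):
    """Leaky integrate-and-fire neuron: return True if the neuron emits
    an output spike within the simulated horizon."""
    v = 0.0                               # membrane potential
    spikes = set(spike_times_ms)
    t = 0.0
    while t <= horizon_ms:
        v *= (1.0 - dt_ms / tau_ms)       # leak toward rest each step
        if t in spikes:
            v += weight                   # integrate the incoming spike
        if v >= threshold:
            return True                   # threshold crossed: output spike
        t += dt_ms
    return False
```

With these parameters, two input spikes at 10 ms and 12 ms fire the neuron (the first spike has barely leaked when the second arrives), while spikes at 10 ms and 60 ms do not — the same spike count, a different outcome.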


Implementing a Secure Service Mesh

One of our main goals in using a service mesh was to secure traffic between internal pod services with mutual Transport Layer Security (mTLS). But a service mesh provides many other benefits: it allows workloads to talk across multiple Kubernetes clusters or to run as 100% bare-metal apps connected to Kubernetes, it offers tracing and logging for connections between pods, and it can export connection-endpoint health metrics to Prometheus. This diagram shows what a workload might look like before implementing a service mesh. In the example on the left, teams spend time building pipes instead of building products or services, common functionality is duplicated across services, security and observability practices are inconsistent, and implementations are black boxes with no visibility. On the right, after implementing a service mesh, the same team can focus on building products and services. They can build efficient distributed architectures that are ready to scale, observability is consistent across multiple platforms, and it is easier to enforce security and compliance best practices.
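What "mutual" means in mTLS can be hinted at with Python's ssl module: a server-side TLS context that requires the *client* to authenticate too, not just the server. The certificate paths below are placeholders (hence commented out); in a real mesh, a sidecar proxy provisions and rotates these certificates automatically so application code never touches them:

```python
import ssl

def make_server_context(ca="mesh-ca.pem", cert="pod.pem", key="pod-key.pem"):
    """Sketch of the policy a mesh sidecar enforces on a pod's behalf:
    refuse any peer that cannot present a certificate signed by the
    mesh's internal CA. File paths are illustrative placeholders."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.verify_mode = ssl.CERT_REQUIRED    # the "mutual" part: client must prove identity
    # ctx.load_cert_chain(cert, key)       # this pod's own identity
    # ctx.load_verify_locations(ca)        # trust only the mesh CA
    return ctx
```

In plain one-way TLS a server would leave `verify_mode` at `CERT_NONE`; flipping it to `CERT_REQUIRED` on both sides of every pod-to-pod connection is the property the mesh gives you for free.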


5 Must-Have Features of Backup as a Service For Hybrid Environments

New backup as a service offerings have redefined backup and recovery with the simplicity and flexibility of the cloud experience. Cloud-native services can eliminate the complexity of protecting your data and free you from the day-to-day hassles of managing backup infrastructure. This approach lets you meet SLAs in hybrid cloud environments and simplifies your infrastructure, driving significant value for your organization. Resilient data protection is key to always-on availability for data and applications in today's changing hybrid cloud environments. While every organization has its own set of requirements, I would advise you to focus on cost efficiency, simplicity, performance, scalability, and future-readiness when architecting your strategy and evaluating new technologies. The simplest choice is a backup as a service solution that integrates all of these features in a pay-as-you-go consumption model. Modern solutions are architected to support today's challenging IT environments.



Quote for the day:

"Leadership is like beauty; it's hard to define, but you know it when you see it." -- Warren Bennis