
Daily Tech Digest - January 06, 2025

Should States Ban Mandatory Human Microchip Implants?

“U.S. states are increasingly enacting legislation to pre-emptively ban employers from forcing workers to be ‘microchipped,’ which entails having a subdermal chip surgically inserted between one’s thumb and index finger,” wrote the authors of the report. “Internationally, more than 50,000 people have elected to receive microchip implants to serve as their swipe keys, credit cards, and means to instantaneously share social media information. This technology is especially popular in Sweden, where chip implants are more widely accepted for gym access, e-tickets on transit systems, and storing emergency contact information.” ... “California-based startup Science Corporation thinks that an implant using living neurons to connect to the brain could better balance safety and precision,” Singularity Hub wrote. “In recent non-peer-reviewed research posted on bioRxiv, the group showed a prototype device could connect with the brains of mice and even let them detect simple light signals.” That same piece quotes Alan Mardinly, director of biology at Science Corporation, as saying that the advantage of a biohybrid implant is that it “can dramatically change the scaling laws of how many neurons you can interface with versus how much damage you do to the brain.”


AI revolution drives demand for specialized chips, reshaping global markets

There’s now a shift toward smaller AI models that use only internal corporate data, allowing for more secure and customizable genAI applications and AI agents. At the same time, edge AI is taking hold because it allows AI processing to happen on devices (including PCs, smartphones, vehicles and IoT devices), reducing reliance on cloud infrastructure and spurring demand for efficient, low-power chips. “The challenge is if you’re going to bring AI to the masses, you’re going to have to change the way you architect your solution; I think this is where Nvidia will be challenged because you can’t use a big, complex GPU to address endpoints,” said Mario Morales, a group vice president at research firm IDC. “So, there’s going to be an opportunity for new companies to come in — companies like Qualcomm, ST Micro, Renesas, Ambarella and all these companies that have a lot of the technology, but now it’ll be about how to use it.” ... Enterprises and other organizations are also shifting their focus from single AI models to multimodal AI, or LLMs capable of processing and integrating multiple types of data or “modalities,” such as text, images, audio, video, and sensory input. Input from diverse sources creates a more comprehensive understanding of that data and enhances performance across tasks.


How to Address an Overlooked Aspect of Identity Security: Non-human Identities

Compromised identities and credentials are the No. 1 tactic cyber threat actors and ransomware campaigns use to break into organizational networks and move laterally. Identity is the most vulnerable element in an organization’s attack surface because there is a significant misperception about what identity infrastructure (IdP, Okta, and other IT solutions) and identity security providers (PAM, MFA, etc.) can protect. Each solution only protects the silo it is set up to secure, not an organization’s complete identity landscape, including human and non-human identities (NHIs), privileged and non-privileged users, on-prem and cloud environments, IT and OT infrastructure, and many other areas that go unmanaged and unprotected. ... Most organizations use a combination of on-prem management tools, one or more cloud identity providers (IdPs), and a handful of identity solutions (PAM, IGA) to secure identities. But each tool operates in a silo, leaving gaps and blind spots that invite attacks. Eight out of 10 organizations cannot prevent the misuse of service accounts in real time because visibility and security coverage are sporadic or missing. NHIs fly under the radar as security and identity teams sometimes don’t even know they exist.


Version Control in Agile: Best Practices for Teams

With multiple developers working on different features, fixes, or updates simultaneously, it’s easy for code to overlap or conflict without clear guidelines. Having a structured branching approach prevents confusion and minimizes the risk of one developer’s work interfering with another’s. ... One of the cornerstones of good version control is making small, frequent commits. In Agile development, progress happens in iterations, and version control should follow that same mindset. Large, infrequent commits can cause headaches when it’s time to merge, increasing the chances of conflicts and making it harder to pinpoint the source of issues. Small, regular commits, on the other hand, make it easier to track changes, test new functionality, and resolve conflicts early before they grow into bigger problems. ... An organized repository is crucial to maintaining productivity. Over time, it’s easy for the repository to become cluttered with outdated branches, unnecessary files, or poorly named commits. This clutter slows down development, making it harder for team members to navigate and find what they need. Teams should regularly review their repositories and remove unused branches or files that are no longer relevant. 


Abusing MLOps platforms to compromise ML models and enterprise data lakes

Machine learning operations (MLOps) is the practice of deploying and maintaining ML models in a secure, efficient and reliable way. The goal of MLOps is to provide a consistent, automated process for rapidly getting an ML model into production for use by ML technologies. ... There are several well-known attacks that can be performed against the MLOps lifecycle to affect the confidentiality, integrity and availability of ML models and associated data. However, performing these attacks against an MLOps platform using stolen credentials has not been covered in public security research. ... Data poisoning: In this attack, an adversary with access to the raw data used in the “Design” phase of the MLOps lifecycle injects attacker-provided data or directly modifies a training dataset. The goal of a data poisoning attack is to influence the data an ML model is trained on before it is eventually deployed to production. ... Model extraction attacks involve an attacker stealing a trained ML model that is deployed in production. An attacker could use a stolen model to extract sensitive training data or the trained weights, or to use the model’s predictive capabilities for their own financial gain.
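To make the poisoning step concrete, here is a minimal sketch — not tied to any particular MLOps platform, with entirely made-up data — showing how flipping a fraction of training labels shifts the decision boundary of a trivial one-feature classifier:

```python
import random

def train_threshold_classifier(samples):
    """Fit a trivial one-feature classifier: the decision threshold is the
    midpoint between the two class means. Stands in for a real training job."""
    mean_pos = sum(x for x, y in samples if y == 1) / sum(1 for _, y in samples if y == 1)
    mean_neg = sum(x for x, y in samples if y == 0) / sum(1 for _, y in samples if y == 0)
    return (mean_pos + mean_neg) / 2

def poison_labels(samples, fraction, seed=0):
    """Flip the labels on a fraction of rows -- the step an attacker with
    write access to the training dataset could perform."""
    rng = random.Random(seed)
    poisoned = list(samples)
    for i in rng.sample(range(len(poisoned)), int(len(poisoned) * fraction)):
        x, y = poisoned[i]
        poisoned[i] = (x, 1 - y)
    return poisoned

# Clean data: class 0 clusters near 1.0, class 1 clusters near 3.0.
clean = [(1.0 + 0.1 * i, 0) for i in range(10)] + [(3.0 + 0.1 * i, 1) for i in range(10)]

print("threshold trained on clean data:   ", round(train_threshold_classifier(clean), 3))
print("threshold trained on poisoned data:", round(train_threshold_classifier(poison_labels(clean, 0.3)), 3))
```

The same idea scales up: the attacker never touches the deployed model, only the data the pipeline trusts.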


Get Going With GitOps

GitOps implementations have a significant impact on infrastructure automation by providing a standardized, repeatable process for managing infrastructure as code, Rose says. The approach allows faster, more reliable deployments and simplifies the maintenance of infrastructure consistency across diverse environments, from development to production. "By treating infrastructure configurations as versioned artifacts in Git, GitOps brings the same level of control and automation to infrastructure that developers have enjoyed with application code." ... GitOps' primary benefit is its ability to enable peer review for configuration changes, Peele says. "It fosters collaboration and improves the quality of application deployment." He adds that it also empowers developers -- even those without prior operations experience -- to control application deployment, making the process more efficient and streamlined. Another benefit is GitOps' ability to allow teams to push minimum viable changes more easily, thanks to faster and more frequent deployments, says Siri Varma Vegiraju, a Microsoft software engineer. "Using this strategy allows teams to deploy multiple times a day and quickly revert changes if issues arise," he explains via email. 


Balancing proprietary and open-source tools in cyber threat research

First, it is important to assess the requirements of an organization by identifying the capabilities needed, such as threat intelligence platforms or malware analysis tools. Next, evaluate open-source tools, which can be cost-effective and customizable but may depend on community support and frequent updates. In contrast, proprietary tools could offer advanced features, dedicated support, and better integration with other products. Finally, think about scalability and flexibility, as future growth may necessitate scalable solutions. ... The technology is not magic, but it is a powerful tool to speed up processes and bolster security procedures while also reducing the gap between advanced and junior analysts. However, as of today, the technology still requires verification and validation. Globally, security experts with a dual skill set in security and AI will be in high demand. As the adoption of generative AI systems increases, we need people who understand these technologies, because threat actors are learning them too. ... If a CISO needs to evaluate the effectiveness of these tools, they first need to understand their needs and pain points and then seek guidance from experts. Adopting generative AI security solutions just because it is the latest trend is not the right approach.


Get your IT infrastructure AI-ready

Artificial intelligence adoption is a challenge many CIOs grapple with as they look to the future. Before jumping in, their teams must possess practical knowledge, skills, and resources to implement AI effectively. ... AI implementation is costly and the training of AI models requires a substantial investment. "To realize the potential, you have to pay attention to what it's going to take to get it done, how much it's going to cost, and make sure you're getting a benefit," Ramaswami said. "And then you have to go get it done." GenAI has rapidly transformed from an experimental technology to an essential business tool, with adoption rates more than doubling in 2024, according to a recent study by AI at Wharton ... According to Donahue, IT teams are exploring three key elements: choosing language models, leveraging AI from cloud services, and building a hybrid multicloud operating model to get the best of on-premise and public cloud services. "We're finding that very, very, very few people will build their own language model," he said. "That's because building a language model in-house is like building a car in the garage out of spare parts." Companies look to cloud-based language models, but must scrutinize security and governance capabilities while controlling cost over time. 


What is an EPMO? Your organization’s strategy navigator

The key is to ensure the entire strategy lifecycle is set up for success rather than endlessly iterating to perfect strategy execution. Without properly defining, governing, and prioritizing initiatives upfront, even the best delivery teams will struggle to achieve business goals in a way that drives the right return for the organization’s investment. For most organizations, there’s more than one gap preventing desired results. ... The EPMO’s job is to strip away unnecessary complexity and create frameworks that empower teams to deliver faster, more effectively, and with greater focus. PMO leaders should ask how this process helps to hit business goals faster. By eliminating redundant meetings and scaling governance to match project size and risk, for example, delivery timelines can shorten. This kind of targeted adjustment keeps momentum high without sacrificing quality or control. ... For an EPMO to be effective, it ideally needs to report directly to the C-suite. This matters because proximity equals influence. When the EPMO has visibility at the top, it can drive alignment across departments, break down silos, drive accountability, and ensure initiatives stay connected to overall business objectives, serving as the strategy navigator for the C-suite.


Data Center Hardware in 2025: What’s Changing and Why It Matters

DPUs can handle tasks like network traffic management, which would otherwise fall to CPUs. In this way, DPUs reduce the load placed on CPUs, ultimately making greater computing capacity available to applications. DPUs have been around for several years, but they’ve become particularly important as a way of boosting the performance of resource-hungry workloads, like AI training, by complementing AI accelerators. This is why I think DPUs are about to have their moment. ... Recent events have underscored the risk of security threats linked to physical hardware devices. And while I doubt anyone is currently plotting to blow up data centers by placing secret bombs inside servers, I do suspect there are threat actors out there vying to do things like plant malicious firmware on servers as a way of creating backdoors that they can use to hack into data centers. For this reason, I think we’ll see an increased focus in 2025 on validating the origins of data center hardware and ensuring that no unauthorized parties had access to equipment during the manufacturing and shipping processes. Traditional security controls will remain important, too, but I’m betting on hardware security becoming a more intense area of concern in the year ahead.



Quote for the day:

"Nothing in the world is more common than unsuccessful people with talent." -- Anonymous

Daily Tech Digest - September 26, 2020

Steering Wealth Management Industry Through Digital Transformation In The Post Pandemic World

Implement ready-to-use digital solutions and change internal processes instead of building solutions from scratch to cater to existing processes. Don’t shy away from exploring global solutions; you will most likely get a great product that may not be expensive. Insist on a “pay as you use” or “pay as you grow” model instead of incurring significant implementation charges and license fees. Explore working with startups that are hungry for business and will go out of their way to build great solutions. Build a robust database for sending relevant, targeted and personalized communications. Make a beginning and take baby steps: focus on 90% of your requirements, since a lot of time and energy is spent addressing the remaining 10%, which can be handled manually or with a workaround. We are at the cusp of a brave new world that demands self-sufficiency, and it is becoming rapidly clear that greater digital freedom will play a pivotal role in making the industry more effective, scalable and enduring on this uncharted road ahead. Firms that deploy these tools fast will attract clients and survive. The industry has always been one to shy away from digital transformation.


Layered security becomes critical as malware attacks rise

The scam script Trojan.Gnaeus made its debut at the top of WatchGuard’s top 10 malware list for Q2, making up nearly one in five malware detections. Gnaeus malware allows threat actors to hijack control of the victim’s browser with obfuscated code, and forcefully redirect victims away from their intended web destinations to domains under the attacker’s control. Another popup-style JavaScript attack, JS.PopUnder, was one of the most widespread malware variants last quarter. In this case, an obfuscated script scans a victim’s system properties and blocks debugging attempts as an anti-detection tactic. To combat these threats, organizations should prevent users from loading a browser extension from an unknown source, keep browsers up to date with the latest patches, use reputable adblockers and maintain an updated anti-malware engine. XML-Trojan.Abracadabra is a new addition to the top 10 malware detections list, showing a rapid growth in popularity since the technique emerged in April. Abracadabra is a malware variant delivered as an encrypted Excel file with the password “VelvetSweatshop”, the default password for Excel documents.


Want diversity? Move beyond your closed network

In truth, the difficulty of recruiting diverse candidates reflects the fact that the networks the banking industry typically relies upon to attract and recruit talent do not reach diverse pools of talented candidates. This network gap is insidious too, leading to a lack of diversity in other aspects of business, like vendor procurement and investment. Once, Mitt Romney spoke of “binders full of women” when running for president. While his wording was inartful, he seemed to recognize that he needed to make a deliberate effort to build his network of talented women in order to appoint qualified women in meaningful numbers. So, what deliberate steps can banks take to close the network gap and find talented people of color? Here are a few things any bank can do to turn intention into impact, and close the network gap. Begin with reflection: Why are you not tied to diverse networks? Do you know where to find black and brown civil society? Learning why your company may not be a cultural fit for certain demographics is nothing new for banks. Gender is probably the most recent example. Understanding that women bring different and needed experience to leadership creates an impetus for more diversity.


Why No One Understands Enterprise Architecture & Why Technology Abstractions Always Fail

The first step is demystification. All of the abstract terms – even the word “architecture” – should be modified or replaced with words and phrases that everyone – especially non-technology executives – can understand. Enterprise planning or Enterprise Business-Technology Strategy might be better, or even just Business-Technology Strategy (BTS). Why? Because “Enterprise Architecture” is nothing more than an alignment exercise, alignment between what the business wants to do and how the technologists will enable it now and several years out. It’s continuous because business requirements constantly change. At the end of the day, EA is both a converter and a bridge: a converter of strategy and a bridge to technology. The middle ground is the Business-Technology Strategy. EA – or should I say “Business Technology Strategy” – isn’t strategy’s first cousin, it’s the offspring. EA only makes sense when it’s derived from a coherent business strategy. For technology companies – that is, companies that sell technology-based products and services – the role of EA is easier to define. Who doesn’t want to help technology (AKA “engineering”) – the ones who build the products and services – build the right applications with the right data on the right infrastructure?


Types of Apps that can be built with Angular Framework

Undoubtedly, Angular development is almost everywhere after its release in 2009. In the past few years, Angular development services have been booming. Angular is considered the best framework for developing web, single-page, and mobile applications. The Angular framework has impressive features that developers and enterprise website owners like; many developers have even shifted their technology to Angular. Before getting into why to choose Angular for mobile app development and what sorts of applications can be built with the Angular framework, let’s first look at what exactly the Angular framework is. Angular is a JavaScript-based framework from Google, developed by Google’s developers to create dynamic web applications. Angular is a full-fledged framework used for the frontend development of an application. Angular has a lot to give to your web and mobile application: it will not only create an impressive UI for your application but also provide features like high performance and user-friendliness. As a feature-rich framework, Angular provides a vast number of features for web application developers.


WebAssembly Could Be the Key for Cloud Native Extensibility

Google had been championing the idea of making WebAssembly a common runtime for Envoy, as a way to help its own Istio service mesh, of which Envoy is a major component. WASM is faster than JavaScript and, because it runs in a sandbox (a virtual machine), it is secure and portable. Perhaps best of all, because it is very difficult to write assembly-like WASM code, many parties created translators for other languages — allowing developers to use their favored languages such as C and C++, Python, Go, Rust, Java, and PHP. Google and the Envoy community also rallied around building a WebAssembly System Interface (WASI), which serves as the translation layer between the WASM and the Envoy filter chain. Still, the experience of building Envoy modules wasn’t packaged for developers, Levine thought at the time. There was still a lot of plumbing to add, settings for Istio and the like. “Google is really good at making infrastructure tooling. But I’d argue they’re not the best at making their user experience,” Levine said. And much like Docker customized the Linux LXC — pioneered in large part by Google — to open container technology to more developers, so too could the same be done with WASM/WASI for Envoy, Levine argues.


Amazon's robot drone flying inside our homes seems like a bad idea

Amazon says you can specify a flight path, map your house, locate points of interest, and generally instruct the eye of Skynet where to fly. Cyberdyne, uh, Amazon also says the device has built-in obstacle avoidance. Let's think about that for a minute. Will the device be able to avoid hanging lamps or plants? What about objects high up on shelves? Will it be able to stand back when a sleep-addled adult gets up in the middle of the night to do middle of the night business? Why would it be out and about at that time anyway? And what about the downdraft? How close can it fly to bookshelves and knickknacks without air-blasting them to the ground? How much will it freak out your pets? My spouse? Your spouse? Just how creepy would it be for it to hover over the kids’ beds because you're too lazy to get off the couch to see if they're asleep? Every rational fiber of my being tells me this is wrong on every level. ... The Always Home Cam is primarily meant as a remote security cam. If you're out and you get an alert from a Ring doorbell or other security device (I wonder if this will work with other trigger devices), you can virtually fly around your house and see what's happening.


Project InnerEye open-source deep learning toolkit: Democratizing medical imaging AI

Project InnerEye has been working closely with the University of Cambridge and Cambridge University Hospitals NHS Foundation Trust to make progress on this problem through a deep research collaboration. Dr. Raj Jena, Group Leader in machine learning and radiomics in radiotherapy at the University of Cambridge, explains, “The strongest testament to the success of the technology comes in the level of engagement with InnerEye from my busy clinical colleagues. For over 15 years, the promise of automated segmentation of images for radiotherapy planning has remained unfulfilled. With the InnerEye ML model we have trained on our data, we now observe consistent segmentation performance to a standard that matches our stringent clinical requirements for accuracy.” The goal of Project InnerEye is to democratize AI for medical image analysis and empower developers at research institutes, hospitals, life science organizations, and healthcare providers to build their own medical imaging AI models using Microsoft Azure. So to make our research as accessible as possible, we are releasing the InnerEye Deep Learning Toolkit as open-source software.


How to Strengthen the Pillars of Data Analytics for Better Results

Data analysts and business analysts rely heavily on a fit-for-purpose data environment that enables them to do their jobs well. These environments allow them to answer questions from management and different parts of the business. These same professionals have expertise in working and communicating with data but often do not have deep technical knowledge of databases and the underlying infrastructure. For instance, they may be familiar with SQL and bringing together data sources in a simple data model that allows them to dig deeper in their analysis, but when the database performance degrades during more complex analysis, the depth of infrastructure reliance becomes clear. The dreaded spinner wheel or delays in analysis make it difficult to meet business needs and demands. This can impact critical decision making and reveal underlying weaknesses that get in the way of other data applications, such as artificial intelligence (AI). These indicators of poor performance also show the need for scaling the data environment to accommodate the growth of data and data sources.


The Role of Data Management in Advancing Biology

I think FAIR has really codified a way of thinking about data that's incredibly aspirational and resonates with people. One of the biggest challenges we're facing in this field right now is findability of the data—search is a hard problem. Then let's say you manage to find some data that you're very interested in; a lot of the time it's not clear whether or not those data are accessible to you or to the public. There's been a large push over the last decade to make everything reproducible, to make the data accessible, to have a data management plan. A lot of that effort isn't necessarily resourced, so just because you have a data management plan doesn't mean that you have a clear place where you can actually put data. We're lucky that the Sequence Read Archives exist and that the NIH continues to fund it, because that's become one of these major focal points for collecting the data. But even more than that, when you're in the middle of collecting data for a very specific question, you're not necessarily thinking about what other information to collect to make these data useful to other groups or other labs. That's not a part of the thought experiment that you're going through in that moment.



Quote for the day:

"A company is like a ship. Everyone ought to be prepared to take the helm." -- Morris Wilks

Daily Tech Digest - September 12, 2020

Women in Fintech: How Open Banking Can Help Address Data Bias

A disturbing recent example is the story of Jamie Heinemeier Hansson, who was granted a credit limit on her Apple Card 20 times lower than her husband David’s. This was despite her having a better credit score, as well as the couple filing a joint tax return and having an equal share in their property. The Apple Card incident highlighted that computers are not impartial. Artificial intelligence may well be able to digest vast amounts of information and identify patterns far beyond the capability of humans, but the historical data from which such systems “learn” in order to draw conclusions can be biased, even if unintentionally. So a system can make a discriminatory decision about a woman’s credit rating due to inherent bias in its training – for example, because women were less likely to have been granted credit, the algorithm continues that pattern – despite never having asked her gender. However, many believe that while technology can perpetuate these biases, it could also be used to address them, particularly in the open banking era. “I genuinely believe technology can level the playing field fundamentally,” says Sam Seaton, CEO of Moneyhub.


Simplify agile, devops, and ITSM with Jira automations

Jira automations work like other IFTTT algorithms, except they have access to all the underlying data and workflows within Jira Software. A Jira automation trigger can be one of several types, including Jira issue types, sprints, and versions. You can design automations for when team members add or modify Jira issues, when scrum masters start or complete sprints, or when team leads create, update, or release versions. These triggers are highly useful for scrum masters, product owners, and technical leads who want to simplify the work needed to keep Jira updated with high-quality data. Jira automation also supports triggers tied to devops events such as pull requests, builds, branches, commits, and deployments. These events connect with Bitbucket, GitLab, and GitHub and update Jira issue or version status based on developer activities performed in version control. More advanced triggers can run on a defined schedule or respond to webhooks. Teams using these two triggers can get very creative with integrating Jira workflows with other tools or automating administrative tasks on a schedule. Once you configure the trigger, you have the option to add more filtering conditions or to branch the flow and support different sets of actions.
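Jira automation rules are configured in the product’s UI rather than in code, but the trigger–condition–action pattern the article describes is easy to picture. The Python sketch below is purely illustrative — the IssueEvent fields and rule callbacks are hypothetical stand-ins, not Jira’s actual API:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class IssueEvent:
    """Hypothetical stand-in for an issue event delivered by a webhook."""
    event_type: str            # e.g. "issue_updated", "sprint_started"
    issue_key: str
    status: str
    assignee: Optional[str]

@dataclass
class Rule:
    """An IFTTT-style automation rule: a trigger, a condition, and an action."""
    trigger: str
    condition: Callable[[IssueEvent], bool]
    action: Callable[[IssueEvent], None]

    def handle(self, event: IssueEvent) -> None:
        if event.event_type == self.trigger and self.condition(event):
            self.action(event)

def notify_lead(event: IssueEvent) -> None:
    # Stand-in action: a real rule might transition the issue or post to chat.
    print(f"Notify team lead: {event.issue_key} is {event.status} with no assignee")

rules = [
    Rule(
        trigger="issue_updated",
        condition=lambda e: e.status == "In Progress" and e.assignee is None,
        action=notify_lead,
    ),
]

for rule in rules:
    rule.handle(IssueEvent("issue_updated", "PROJ-42", "In Progress", None))
```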


How trusted data is driving resilience and transformation beyond Covid-19

Over the next three to five years, most business workflows will be disrupted by the application of data and artificial intelligence (AI). Efficiency will be prioritised because it underpins business survival. If we take power and utilities as an example, we can expect disruption of the billing workflow, call centres, customer onboarding, customer service, and distribution. Document intelligence will also be used to glean insights from large volumes of information. Ultimately, data and AI will reinvent the entire end-to-end value chains of industries. Companies that recognise the strategic value of data will be the leaders in digital transformation, giving them a competitive position in the market. ... The pandemic has highlighted the value of data since having and sharing information on individuals will be key to defeating the virus. So, in the evolving normal, we can expect more data-sharing platforms – platforms that allow the public sector to share information with the private sector and platforms that allow different companies within the private sector to share information with each other. Boundaries between sectors will blur over time and regulation will adapt to accommodate data sharing.


Bluetooth Bug Opens Devices to Man-in-the-Middle Attacks

The Bluetooth SIG is recommending that potentially vulnerable Bluetooth implementations introduce the restrictions on CTKD that have been mandated in Bluetooth Core Specification versions 5.1 and later. These restrictions prevent the overwrite of an authenticated key or a key of a given length with an unauthenticated key or a key of reduced length. “The Bluetooth SIG is also broadly communicating details on this vulnerability and its remedies to our member companies and is encouraging them to rapidly integrate any necessary patches,” according to Bluetooth. “As always, Bluetooth users should ensure they have installed the latest recommended updates from device and operating system manufacturers.” Several Bluetooth-based attacks have cropped up over the past year. In May, academic researchers uncovered security vulnerabilities in Bluetooth Classic that could have allowed attackers to spoof paired devices and capture sensitive data. In February, meanwhile, a critical vulnerability in the Bluetooth implementation on Android devices was discovered that could allow attackers to launch remote code-execution (RCE) attacks – without any user interaction.


Australia’s very small step to make the Internet of Things safer

Security flaws in IoT devices are common. Hackers can exploit those vulnerabilities to take control of devices, steal or change data, and spy on us. In recognition of these risks, the Australian government has introduced a new code of practice to encourage manufacturers to make IoT devices more secure. The code provides guidance on secure passwords, the need for security patches, the protection and deletion of consumers’ personal data and the reporting of vulnerabilities, among other things. The problem is the code is voluntary. Experiences elsewhere, such as the United Kingdom, suggest a voluntary code will be insufficient to deliver the protections consumers need. ... A better option would have been a “co-regulatory” approach. Co-regulation mixes aspects of industry self-regulation with both government regulation and strong community input. It includes laws that create incentives for compliance (and disincentives against non-compliance) and regulatory oversight by an independent (and well-resourced) watchdog. The Australian government has, at least, described its new code of practice as “a first step” to improving the security of IoT devices.


Four ways network traffic analysis benefits security teams

The SecOps team will often need the network data and behavior insights for security analytics or compliance audits. This will usually require network metadata and packet data from physical, virtual and cloud-native elements of the network deployed across the data center, branch offices and multi-cloud environments. The easier it is to access, index and make sense out of this data (preferably in a “single pane of glass” solution), the more value it will provide. Obtaining this insight is entirely feasible but will require a mix of physical and virtual network probes and packet brokers to gather and consolidate data from the various corners of the network to process and deliver it to the security tool stack. NDR solutions can also offer the SecOps team the ability to capture and retain network data associated with indicators of compromise (IOCs) for fast forensics search and analysis in case of an incident. This ability to capture, save, sort and correlate metadata and packets allows SecOps to investigate breaches and incidents after the fact and determine what went wrong, and how the attack can be better recognized and prevented in the future.


A Beginner’s Introduction To DevOps Principles

To put it simply, DevOps is all about integrating these two teams together (hence the portmanteau of a name). It isn’t going to make your developers into sysadmins, or vice versa, but it should help them work together. Each aspect and phase is complemented with tools that make this whole process easier. DevOps is more than just tools and automation, and implementing a set of “DevOps tools” won’t automatically make your team work twice as fast, but these tools are a major part of the process, and it’d be hard to be as efficient without some of them. ... Rather than testing and building only once when everything is finished, in a DevOps environment, each developer will ideally submit changes to source control multiple times a day, whenever issues are complete or a minor milestone is reached. This allows the build and testing phases to start early, and makes sure no developer gets too far away from the HEAD of the master branch. This stage is mostly about proper source control management, so an effective Git service like GitHub, GitLab, or Bitbucket is crucial to keeping continuous integration running smoothly. You don’t have to deploy every commit to production right away, but quick automated deployments are a major part of being able to push rapid releases.


It's the biggest job in tech. So why can't they find anyone to do it?

The failure to appoint a senior leader to coordinate the mammoth task of digitizing public services is at odds with the government's rhetoric. Three years ago, the UK re-iterated the need to create a "government as a platform" in a brand-new digital strategy, with the objective of harnessing the potential of digital to improve the efficiency of public services. The goal? To enable "digital by default" across government, and use technology and data to better serve citizens with digitally enabled public services that would be easier, simpler and cheaper. Since then, many reports have emerged stressing the difficulty of achieving this digital transformation journey without proper management from the very top. Last year, for instance, a report from the House of Commons' Science and Technology Committee found that the government's digital momentum was slowing, and that the shift was partly due to a lack of senior leadership. These failures have been especially palpable in the past few months. As the global COVID-19 pandemic threw the world upside down, the need for a government that effectively delivers digital services in a time of crisis became ever-more important.


Visa Warns of Fresh Skimmer Targeting E-Commerce Sites

The Visa alert does not indicate how Baka is initially delivered to a network. But the report notes that the malicious code is hosted on several suspicious domains, including: jquery-cycle[.]com, b-metric[.]com, apienclave[.]com, quicdn[.]com, apisquere[.]com, ordercheck[.]online and pridecdn[.]com. Once the initial infection takes hold, the skimmer is uploaded through the command-and-control server, but the code loads in memory. This means the malware is never present on the targeted e-commerce firm's server or saved to another device, helping it to avoid detection, according to the alert. "The skimming payload decrypts to JavaScript written to resemble code that would be used to render pages dynamically," according to Visa. Once embedded in an e-commerce site's checkout page, the skimmer begins to collect payment and other customer data from various fields and sends the information to the fraudsters' command-and-control server, Visa notes. Once data exfiltration is complete, Baka performs a "clean-up" function that removes the skimming code from the checkout page, according to the alert. This also helps ensure that JavaScript is not spotted by anti-malware tools.


Elon Musk is one step closer to connecting a computer to your brain

While the development of this futuristic-sounding tech is still in its early stages, the presentation was expected to demonstrate the second version of a small, robotic device that inserts tiny electrode threads through the skull and into the brain. Musk said ahead of the event he would “show neurons firing in real-time. The matrix in the matrix.” And he did just that. At the event, Musk showed off several pigs that had prototypes of the neural links implanted in their heads, and machinery that was tracking those pigs’ brain activity in real time. The billionaire also announced the Food and Drug Administration had awarded the company a breakthrough device authorization, which can help expedite research on a medical device. Like building underground car tunnels and sending private rockets to Mars, this Musk-backed endeavor is incredibly ambitious, but Neuralink builds on years of research into brain-machine interfaces. A brain-machine interface is technology that allows for a device, like a computer, to interact and communicate with a brain. 




Quote for the day:

"The actions of a responsible executive are contagious." -- Joe D. Batton

Daily Tech Digest - August 28, 2019


Being able to replicate neural behaviour on an electronic chip also offers exciting avenues for research to better understand the brain and how it is affected by disorders that disrupt neural connections, such as Alzheimer’s disease and other forms of dementia. The human brain is made up of billions of neurons in connected networks. They communicate with each other by using a sequence of electrical signals to express different behaviours, such as learning through sensory organs or more complicated processes like emotions and memory. Any disruption to these signalling sequences can lead to a loss of these vital neural connections, potentially causing memory loss and dementia. Curing these disorders would require identifying the faulty neurons and restoring their signalling routine, without affecting the functioning of other neurons in the network. So by having a computer model of the brain, neuroscientists would be able to simulate brain functions and abnormalities, and work towards cures, without the need for living test subjects. Our technology could also potentially be incorporated into wearable electronics, bionic prosthetics, or smart gadgets imbued with artificial intelligence.



Securing Our Infrastructure: 3 Steps OEMs Must Take in the IoT Age

In the manufacturing world, specifically the operations technology (OT) sphere, legacy operational standards such as OPC and Modbus are still in use today but were designed more than 20 years ago using old technologies, including COM. They were not designed for communication over modern IP networks with multiple security layers and, due to a general lack of cybersecurity sophistication, traditional OT networks have most security options disabled to simplify configuration. By its nature, a large open network of connected devices opens many new attack vector threats, even if individual devices may be secure when used independently. Because the weakest point in the system determines its overall security level, a comprehensive end-to-end approach is required to secure it. The lack of industry standards within the manufacturing space makes it difficult to develop such an approach because hackers concentrate on breaching a specific element within the technology stack.


Ransomware has evolved into a serious enterprise threat


In addition to a ransomware revival, the report highlights that more than 2.2 billion stolen account credentials were made available on the cyber criminal underground in the first quarter and that 68% of targeted attacks used spear phishing for initial access. “This shows how the cyber crime economy works,” said Samani. “Credentials are sold online, other criminals buy the credentials and then use them to get into organisations and use the ransomware they are an affiliate for to infect an organisation and demand tens of thousands of dollars in ransom. “The purpose of the threat report is not just to give the hard stats, but to encourage organisations to look at everything that is going on and see it is all connected and contributes to the wider ecosystem of crime.” The findings on ransomware targeting businesses are consistent with the fact that ransomware and other forms of cyber extortion are currently the most popular forms of cyber criminal activity in the UK, according to Rob Jones


The World Is Taking The Future Of Payments Seriously. Why Isn't The United States?

Let’s start with the simplest of the three: technological history. When modern-day payment systems were first developed, the United States was at the forefront of innovation and adoption. Debit and credit cards picked up significant momentum in the second half of the twentieth century. While shopping and paying online are now a global standard, it took time to filter into society. At the center were the thousands of e-commerce websites and companies that developed in the United States, particularly in Silicon Valley, in the late 90s and early 2000s. In the United States, all cards operate on the same point-of-sale systems to streamline the process for merchants. These outdated systems have left debit and credit cards as the historic standard, which is difficult to break out of. Point-of-sale systems have made China a fascinating case study. Historically, China has been slow to embrace new technologies, particularly in the consumer sector. Until about 10 years ago, the majority of transactions were made with cash; credit and debit cards were relatively rare in China’s payment ecosystem. When payment alternatives started to develop, it was roughly around the same time smartphones began to flood the market.


Do Self-Service and Low-Code Curb Shadow IT?

It’s important to point out there's an entire spectrum of low-code/no-code tools aimed at different audiences. Some are targeted at professional developers while others are targeted to web developers or citizen developers. The latter group tends to use “no-code” tools because the mechanics of writing code have been abstracted into visual drag-and-drop tools. Fintech company NES Financial standardized on Outsystems, which is an enterprise-class low-code platform because NES Financial voluntarily complies with Systems and Organizational Controls reporting (SOC 1), the Bank Secrecy Act (BSA), United States Citizenship and Immigration Service (USCIS) and Securities Exchange Commission (SEC) regulations. "Building systems and controlling data is an art in itself. You have to be aware of new regulations, requirements, and constraints, which is a full-time job," said Izak Joubert, CTO at NES Financial. "I think the ability for a marketing organization to implement something as a shadow IT organization is great conceptually, but it has massive risks for an organization if you look at it from a bigger perspective."


Tracking The Trajectory Of Cloud Computing

Despite the lack of coherent regulations, clients can use the cloud with confidence provided they know where their data is kept, which data protection laws apply, and whether the provider meets internal security policy. The cloud is multi-tenant by design – in other words, it brings lots of clients and third parties into the same network. Knowing which other organisations exist within the network, and how much data they will be able to access, is also a good move for service users. Cloud computing is changing: it’s smarter, faster, more powerful, and more popular than ever before. As technologies and industries converge, cloud applications will increase. However, the maturity of cloud computing has not been matched by regulations. Users are often uncertain about cloud compliance, and therefore less willing to rely on cloud-based systems. Legal bodies and corporations need to come up with a prescriptive regulatory framework to enable the cloud to rise to its full potential.


Mitigating social engineering attacks with MFA


Providing a tool for employees to report phishing incidents, even just an email address for forwarding suspected phishing emails, can also help organisations. ... One technological solution that has proven successful against social engineering attacks, especially when the goal has been acquiring access details, is the implementation of two-factor authentication. Two-factor authentication (2FA) and multifactor authentication (MFA) are access management systems that require two – or more – pieces of evidence, whether it be knowledge (such as passwords), possession (a physical token, for example) or inherence (e.g. fingerprints), in order for access to be granted. The reason that 2FA/MFA is so successful is that should one of the verification stages (such as a password) become compromised, a hacker will still be unable to gain access to the organisation’s network without the other pieces of authentication.
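The “possession” factor is commonly an authenticator app generating time-based one-time passwords. As an illustration of how that second factor is verified, here is a minimal RFC 6238-style TOTP check using only Python’s standard library; the shared secret is a made-up demo value, and real deployments provision one per user:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, at_time: float, step: int = 30, digits: int = 6) -> str:
    """Compute a time-based one-time password (HMAC-SHA1, RFC 6238)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(at_time // step)                        # 30-second time window
    msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify_second_factor(secret_b32: str, submitted: str, window: int = 1) -> bool:
    """Accept the code for the current step plus/minus `window` steps of clock drift."""
    now = time.time()
    return any(
        hmac.compare_digest(totp(secret_b32, now + drift * 30), submitted)
        for drift in range(-window, window + 1)
    )

SECRET = "JBSWY3DPEHPK3PXP"                                        # demo value only
print(verify_second_factor(SECRET, totp(SECRET, time.time())))     # True
```

Even if a password is phished, an attacker without the shared secret (or the device holding it) cannot produce a valid code for the current window.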


Creating a 'Defensible' Cybersecurity Program

Business units also need to have input on the security steering committee to ensure that the security team is aligned with business goals. "It's very difficult to convince people that you are governing your security program from a business perspective if the business does not have a seat [on the steering committee]," Scholtz says. Dashboards or scorecards can be helpful for showing how security relates to the business and what the risk position is, Scholtz says. But implementing those takes time. Progress reports for executive boards can be tricky, Scholtz says. Executives don't need day-to-day operational information. Providing too much information may get executives interested in granular details that they ultimately have no control over, he points out. Scholtz's tips seem to offer a helpful start for setting up a cybersecurity program that supports business goals. But are they, indeed, practical? Let us know what you think.


Blurring the lines between RPA platforms and APIs


The capabilities of both RPA platforms and APIs are evolving to support use cases primarily handled by the others. The combination of RPA and APIs is a natural outgrowth of the modern business systems environment, particularly driven by the adoption of SaaS platforms and API-first becoming the new software mantra. Traditionally, RPA has been marketed to work with the complex mix of legacy, third-party and modern business applications that most organizations have accumulated. When delivering an RPA platform, it is nearly always best to use APIs when available, as the combination of these technologies delivers an extensive and change-resistant experience by removing the inherent change-prone UI layer from the equation. "Counter to what some may assume, the existence of an API does not negate the usefulness of RPA," Cottongim said.


A new IOT botnet is infecting Android-based set-top boxes

In a report published today and shared with ZDNet, WootCloud Labs said Ares operates by randomly scanning the internet for Android devices with open ADB ports. When it finds a vulnerable device, the Ares operators download a version of the Ares malware on the exposed device, which then acts as another scanning point for the Ares operators. Ares-infected devices will scan for both other Android systems with open ADB ports, but also for devices running Telnet services, specific to Linux-based servers and smart devices. While Ares operators are obviously trying to infect any device they can, WootCloud said it's seen the botnet infecting set-top boxes from HiSilicon, Cubetek, and QezyMedia. These attacks started in July, Srinivas Akella, Founder & Chief Technology Officer of WootCloud, told ZDNet in an email today. The exec also doesn't exclude the possibility that other types of Android systems were also infected. "To protect against the ADB being misused in these cases where it is left enabled, routers can be configured to block the ingress and egress network traffic to TCP port 5555, which is the ADB port," Akella said.
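As a small illustration of the audit this advice implies, the standard-library Python sketch below checks whether a device you own accepts TCP connections on the ADB port; the address is a placeholder to be replaced with a device on your own network:

```python
import socket

ADB_PORT = 5555  # TCP port used by the Android Debug Bridge over the network

def adb_port_open(host: str, timeout: float = 2.0) -> bool:
    """Return True if the host accepts TCP connections on the ADB port.
    Intended for auditing devices you own, such as set-top boxes on a home LAN."""
    try:
        with socket.create_connection((host, ADB_PORT), timeout=timeout):
            return True
    except OSError:
        return False

device = "192.168.1.50"  # placeholder -- substitute your own device's address
if adb_port_open(device):
    print(f"{device} exposes ADB on port {ADB_PORT}: disable ADB or block the port at the router")
else:
    print(f"{device} does not accept connections on port {ADB_PORT}")
```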



Quote for the day:


"Enthusiasm is excitement with inspiration, motivation, and a pinch of creativity." -- Bo Bennett


Daily Tech Digest - March 22, 2019

Artificial Intelligence Can Help Or Hurt Any Business


Everyone has heard about the big potential for using artificial intelligence (AI) to expand your business, but many of the small businesses I mentor are still wary of embracing it, because they don’t understand how it works, and fear losing control and unintended consequences. My advice is that AI is here, so it behooves all of us to learn how to use it properly and move forward. For example, it is a no-brainer to first take advantage of the wave of new capabilities for data collection and smarter analysis to improve productivity and marketing. What is not so obvious is how to create and roll out solutions that can directly impact customer trust or financial well-being. There have been too many recent glitches, such as evidence of devices invading our privacy.  To put this all in perspective, I was happy to see the guidance and recommendations on how to deal with artificial intelligence correctly in a new book, “The Big Nine,” by Amy Webb. As a recognized futurist and thought leader in this space, she outlines how the big nine tech titans, including Google, Microsoft, and Alibaba, should be working to solve key long-term issues.



Researchers at Microsoft and the late Microsoft co-founder Paul Allen's school of computer science at the University of Washington have built a system of liquids, tubes, syringes, and electronics around a benchtop to deliver the world's first automated DNA storage device. Using the proof-of-concept DNA storage device, the researchers demonstrated its write and read capabilities by encoding the word 'hello' in snippets of DNA and converting it back to data. The bench-top unit cost around $10,000, but the researchers believe it could be built in low volumes for a third of the cost by cutting out sensors and actuators. The unit, described in Nature, consists of computers with encoding and decoding software that translate digital ones and zeros into DNA's four bases: A, C, T, G. There's also a DNA synthesis module and a DNA preparation and sequencing module, between which sits a vessel where DNA is stored. Microsoft principal researcher Karin Strauss says the group wanted to prove there is a practical way of automating DNA data storage.
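The article doesn't detail Microsoft's codec, which also handles addressing and error correction, but the core translation step — two bits per base — can be sketched with a toy mapping (the mapping below is illustrative, not the one the researchers used):

```python
# Toy illustration of the encode/decode step only; the real pipeline adds
# addressing, redundancy and error correction.
BIT_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BIT = {base: bits for bits, base in BIT_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BIT_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    bits = "".join(BASE_TO_BIT[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"hello")      # the same word the researchers stored
print(strand)                  # CGGACGCCCGTACGTACGTT
assert decode(strand) == b"hello"
```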


Business leaders disillusioned with business transformation 


“Begin at the beginning, and go on till you come to the end: then stop.” That is not the same thing, however, as saying that a digital business transformation process should begin without a clear idea of where it is going. Indeed, this is vital. Unfortunately, the Celonis study also found that most organisations are struggling with transformation initiatives because they are diving into execution before understanding what actually needs changing. The research found that 39% of analysts are not basing their work on internal processes when executing the transformation strategy given to them by senior personnel. Celonis suggested that this highlights “that business leaders are investing in transformation initiatives because they think they should and not because they have identified a specific problem.” Businesses are also skipping square one, suggests the report, and are “still jumping straight into tactics.” It gave examples: AI, machine learning and automation. The survey found that 73% of the C-suite say these are areas where they want to maintain or increase investment. In contrast, a fraction under a third of senior leaders state that they plan to invest more in getting better visibility of their processes.


How tech brings learning and development to deskless employees


"Interestingly, there have been some really significant advancements in brain science, cognitive science," Leaman said. This research looks at how the brain remembers information. "What we know now is that people do what they remember. If they don't remember they will guess. So how do you get people to remember and not guess or just simply not do?" This perspective has shifted learning to the form of micro content, focusing on key learning points that are accessible via a mobile device, just when an employee needs it. Typical use cases include restaurant employees accessing recipe cards, manuals or operational reference material to learn about new promotions, Carr said. Or medical device sales representatives can access and learn about new product information, product launches and new drugs. A fitness club employee can look up the day's workout each morning before teaching it to the class, and a field service tech can look up quick tips on a potential problem before going into a customer's home.


Where Technology Fits in the Employee Experience

Digital transformation, Mike Graham, CEO of Epilogue Systems argues, involves a lot of people over a long period of time. Preparing for staff, budget and time exhaustion — before it happens — is critical to your team and digital transformation. External staff, such as systems integrators and software companies will vanish after the go-live date and internal staff may take themselves out of the project, or completely leave the organization. “Users must be able to effectively adopt the technology themselves in order for digital transformation to reach its full potential,” he said. Think about digital adoption beyond the critical first months. While adoption in the first three to five months after things go-live is critical, it’s a process that’s never complete. Think of all the changes that an application experiences over time: upgrades, shifts in an organization’s application landscape, integrations and APIs, and an increasingly complex digital workplace. Beyond that, there’s also the challenges of the workforce to account for: hiring, turnover, retirement, role changes and business model evolution.


Atlassian, AgileCraft join to scale Agile development


AgileCraft's value stream management technology provides joint visibility for Atlassian's own stack into Azure DevOps Server, Rally and various continuous delivery tools. "What makes AgileCraft interesting is the platform's focus on helping enterprises figure out how to replicate DevOps success by holistically looking at and correlating the business and financial side of things and DevOps process flows," said Torsten Volk, an analyst at Enterprise Management Associates, based in Boulder, Colo. Atlassian's addition of a DevOps analytics platform could replicate the success of one or two high-performing DevOps teams across the entire enterprise, Volk said. "Considering the lack of competition in this arena, I think the $166 million could prove to be money well spent," he said. A key question is whether AgileCraft customers will face pressure to move to the Atlassian stack. Most enterprises use a variety of different products at the teamwork level.


Quantum computing will break your encryption in a few years

Quantum computing-based security technology is effective because it relies on two of the best-known properties of quantum physics – the idea that observing a particle changes its behavior, and that paired or “entangled” particles share the same set of properties as each other. What this means, in essence, is that both parties to a message can share an identical cipher key, thanks to quantum entanglement. In addition, should a third party attempt to eavesdrop on that sharing, it would break the symmetry of the entangled pairs, and it would be instantly apparent that something fishy was going on. “If everything is working perfectly, everything should be in sync. But if something goes wrong, it means you’ll see a discrepancy,” said Jackson. It’s like a soap bubble, according to Brian Lowy, vice president at ID Quantique SA, a Switzerland-based quantum computing vendor – mess with it and it pops. “At some point, you’re going to have to factor [quantum computing],” he said, noting that, even now, bad actors could download encrypted information now, planning to crack its defenses once quantum computing is equal to the task.
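The article describes entanglement-based key sharing; as a rough illustration of the detection principle, here is a minimal prepare-and-measure (BB84-style) simulation in Python. The function name and parameters are mine and the model ignores real-world noise: the point is only that an intercept-resend eavesdropper raises the error rate on the compared bits from roughly 0% to roughly 25%, which the two parties can spot by sacrificing and comparing a sample of their key.

```python
import random

def simulate_exchange(n_pairs: int, eavesdropper: bool) -> float:
    """Return the error rate on the sifted key in a toy BB84-style exchange."""
    errors = compared = 0
    for _ in range(n_pairs):
        bit = random.randint(0, 1)            # sender's bit
        basis_a = random.randint(0, 1)        # sender's encoding basis

        # An eavesdropper measures in a random basis and resends the result.
        send_bit, send_basis = bit, basis_a
        if eavesdropper:
            basis_e = random.randint(0, 1)
            send_bit = bit if basis_e == basis_a else random.randint(0, 1)
            send_basis = basis_e

        basis_b = random.randint(0, 1)        # receiver's measurement basis
        recv_bit = send_bit if basis_b == send_basis else random.randint(0, 1)

        if basis_a == basis_b:                # only matching-basis bits are kept
            compared += 1
            errors += (recv_bit != bit)
    return errors / compared

random.seed(1)
print("no eavesdropper :", simulate_exchange(20_000, False))   # ~0.00
print("with eavesdropper:", simulate_exchange(20_000, True))   # ~0.25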


How to deprecate software features without bothering users

In a situation where the deprecated feature or function will be removed entirely and not replaced, developers should offer suggestions for software layers or tools that can provide a worthwhile alternative, as well as guidance and instruction to help users adopt them. For example, if a custom database is replaced by a third-party database, such as a SQL database, support staff should ideally help users connect the database to the software and migrate it to the third-party platform. Software deprecation is all about continuity. You must ensure that developers don't alienate the customer base and instead help them through impending product changes to minimize disruptions for their businesses. Providing continued service requires training for the help desk and support team, as well as helpful documentation in the form of notices, guides and knowledge base entries. Customers use your software to help run their businesses, so you must communicate changes about your product -- especially when you deprecate software features -- well in advance.
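To make that advance communication concrete, here is a minimal sketch in Python, using the standard warnings module, of how a team might flag a deprecated function and point callers at its replacement during the transition period. The deprecated decorator and the function names are illustrative, not part of any particular product.

```python
import functools
import warnings

def deprecated(replacement: str):
    """Hypothetical helper that marks a function as deprecated and names its successor."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            warnings.warn(
                f"{func.__name__}() is deprecated and will be removed in a "
                f"future release; use {replacement} instead.",
                DeprecationWarning,
                stacklevel=2,
            )
            return func(*args, **kwargs)
        return wrapper
    return decorator

@deprecated(replacement="export_to_sql()")
def export_to_custom_db(records):
    """Old export path, kept working during the transition period."""
    return list(records)

export_to_custom_db(["order-1", "order-2"])   # still works, but warns callers
```

Paired with release notes and knowledge base entries, a warning like this gives customers several releases' notice before the old path disappears.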


AI cloudops is coming, whether you like it or not

The pros are that you can have a 24/7/365 monitoring and management program on the cheap. If you believe operational staff is expensive, try hiring them for shift work. AI-based monitoring and management systems never sleep, never take time off, and never ask for a raise. Once they are up and running, they cost almost nothing beyond their license fees and infrastructure costs. And they are self-learning at the same time; in other words, the more they run, the better they get at the job. ... One con is that the cost of rolling out these systems is high, even in the cloud. Vendors that have married AI and operational tools are going to charge a premium to get them up and running and in production. While the prices are all over the place, count on paying 50 percent more than for traditional tools, including consulting services for the first year or so to get the tools learning correctly. Another con is that operations people don’t seem to like them no matter how well they perform. The number of passive-aggressive actions that I’ve seen over the years from people pushing back on AI-enabled operations tools has been huge.


How digitalization supplants old insurance models

New technologies are also threatening the long-term viability of credit-based insurance. Carriers are increasingly seeking out data and building predictive models that will prove more powerful and profitable, even during periods of economic volatility. For example, my company, ODN, has shown it is possible to extend more policies at more affordable rates to people with poor credit by pricing risk based on where people drive, rather than on who they are. To remain competitive and profitable in the long term, underwriting and actuarial teams need to pay attention to the dynamics of credit-based insurance today and plan for a future that simply may not include FICO. Carriers should be asking: What will happen to our pricing models if economic conditions or regulators make credit-based insurance irrelevant? What new technologies need to be in place to continue pricing risk and remain competitive in a world without FICO?
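As a toy illustration of that idea, here is a Python sketch of a premium calculation driven by driving exposure rather than a credit score. Every feature, weight and base rate below is invented for illustration; this is not ODN's or any carrier's actual pricing model.

```python
# Toy premium model using driving-exposure features instead of a credit score.
BASE_ANNUAL_PREMIUM = 600.0   # illustrative figure only

def usage_based_premium(annual_miles: float,
                        share_night_driving: float,
                        share_high_risk_roads: float) -> float:
    """Price risk from where and how much someone drives, not who they are."""
    exposure = annual_miles / 10_000                    # normalize mileage
    risk_multiplier = (1.0
                       + 0.30 * share_night_driving     # late-night miles cost more
                       + 0.50 * share_high_risk_roads)  # accident-prone roads cost more
    return round(BASE_ANNUAL_PREMIUM * exposure * risk_multiplier, 2)

# A low-mileage daytime driver is priced on exposure, regardless of FICO score.
print(usage_based_premium(6_000, share_night_driving=0.05, share_high_risk_roads=0.10))
```

The design point is that each input is something a carrier can observe from telematics, so the model keeps working even if credit data becomes unavailable or off-limits.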



Quote for the day:


"Leaders dig into their business to learn painful realities rather than peaceful illusion." -- Orrin Woodward


Daily Tech Digest - November 03, 2018

“It fits the scale of existing fiber technology and could be applied to increase the bandwidth or potentially the processing speed of that fiber by over 100 times within the next couple of years,” RMIT Prof. Min Gu said. “This easy scalability and the massive impact it will have on telecommunications is what’s so exciting.” Fiber isn’t going anywhere. Even if wireless becomes more important, such as in 5G networks, fiber is still needed for backhaul. The university doesn’t say what speed it has achieved or expects to achieve, other than citing the 100x figure. But, in part, it’s a new miniaturization of the equipment that’s the big deal. Previous experiments by various academic teams dating back to at least 2013 have involved larger equipment for transmission and decoding. RMIT says the former gear would not have been practical for current telco environments. RMIT, however, says the newly shrunken spiraling, speed-inducing equipment is nanoscale.



Canada's Mandatory Breach Notification Rules Now in Effect

Hunton says the OPC has clarified that statement in its final guidance. "In general, when an organization (the 'principal') provides personal information to a third-party processor (the 'processor'), the principal may reasonably be found to be in control of the personal information it has transferred to the processor, triggering the reporting and record-keeping obligations of a breach that occurs with the processor," the law firm notes. "On the other hand, if the processor uses or discloses the same personal information for other purposes, it is no longer simply processing the personal information on behalf of the principal; it is instead acting as an organization 'in control' of the information, and would thereby have the obligation to notify, report, and record." Takeaway: Organizations must assess all breaches on a case-by-case basis, as well as ensure they have the right contractual obligations in place to ensure that any third parties that handle its data take appropriate steps to secure it, Hunton says.


Google says 'exponential' growth of AI is changing nature of compute

The demand from the Google Brain team that leads research on AI is for "gigantic machines," said Young. For example, neural networks are sometimes measured by the number of "weights" they employ, variables that are applied to the neural network to shape its manipulation of data. Whereas conventional neural nets may have hundreds of thousands, or even millions, of such weights that must be computed, Google's scientists are saying "please give us a tera-weight machine," computers capable of computing a trillion weights. That's because "each time you double the size of the [neural] network, we get an improvement in accuracy." Bigger and bigger is the rule in AI. To respond, of course, Google has been developing its own line of machine learning chips, the "Tensor Processing Unit." The TPU, and parts like it, are needed because traditional CPUs and graphics chips (GPUs) can't keep up. "For a very long time, we held back and said Intel and Nvidia are really great at building high-performance systems," said Young. "We crossed that threshold five years ago."
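To make "weights" concrete, here is a minimal Python sketch that counts the trainable parameters in a plain fully connected network. The layer widths are arbitrary; the point is simply how quickly the count climbs toward the millions (let alone a trillion) as layers widen.

```python
def count_weights(layer_sizes):
    """Number of trainable parameters (weights + biases) in a plain
    fully connected network with the given layer widths."""
    total = 0
    for fan_in, fan_out in zip(layer_sizes, layer_sizes[1:]):
        total += fan_in * fan_out + fan_out   # weight matrix plus bias vector
    return total

# Even a modest network has well over half a million weights; widths are illustrative.
print(count_weights([784, 512, 512, 10]))   # -> 669706
```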


A time-saving typing tool that works anywhere in Chrome

The tool is called Text Blaze, and while it's technically still in beta, it's been working incredibly well for me both on Chrome OS and within Chrome on Windows. It's super-easy to set up, too: Once you've installed the extension and connected it to your Google account (which is what allows your snippets to sync automatically and always be available on any device where you're signed in), you just open your dashboard — by clicking the Text Blaze icon in your browser's address bar or by visiting this link — and there, you can create and manage all of your text replacement snippets. Creating a new snippet is as simple as clicking the blue "+" button in the upper-left corner of the screen. You can also edit any existing snippet (including a series of sample snippets provided when you first install the program) by clicking its title in the "My Snippets" column on the screen's left side.


How 5G aims to end network delays that slow everything down


There's evidence 5G is delivering the promised low-latency links. "We are between 1 to 2 milliseconds," Rygaard said of Nokia's tests of latency between phones and cell towers. A millisecond is a thousandth of a second, about the time a baseball is in contact with a bat that's hitting it. There will be other delays in the system, such as software actually doing something with the data that's traversing the network, but the 5G fundamentals appear to be in place. "We're seeing the very low single digit milliseconds," Fuetsch said. That's more than the 1-millisecond latency goal 5G proponents have sought for years, but it also includes communications deeper into the network, not just between a phone and cell tower. And it's a big improvement over today's 4G networks, whose latencies are more than 10 times higher, according to real-world measurements from mobile analytics company OpenSignal. On top of that, future versions of 5G will be able to guarantee that latency.


Thousands Of Swedes Are Inserting Microchips Under Their Skin


So many Swedes are lining up to get the microchips that the country's main chipping company says it can't keep up with the number of requests. More than 4,000 Swedes have adopted the technology, with one company, Biohax International, dominating the market. The chipping firm was started five years ago by Jowan Osterlund, a former professional body piercer. After spending the past two years working full time on the project, he is currently developing training materials so he can hire Swedish doctors and nurses to help take on some of his heavy workload. "Having different cards and tokens verifying your identity to a bunch of different systems just doesn't make sense," he says. "Using a chip means that the hyper-connected surroundings that you live in every day can be streamlined." ... "I see no problem for [it] becoming mainstream. I think it's something that can seriously make people's lives better," Varszegi says.


WPA3 improves general Wi-Fi encryption, thanks to Simultaneous Authentication of Equals (SAE) replacing the Pre-Shared Key (PSK) authentication method used in prior WPA versions. As a result, WPA3-Personal networks with simple passphrases aren’t so simple for hackers to crack using off-site, brute-force, dictionary-based cracking attempts, as they were with WPA and WPA2. Of course, it will still be just as easy for someone to guess a very simple password when directly attempting to connect to the Wi-Fi with a device, but that’s a less practical cracking method. The encryption with WPA3-Personal is also more individualized. Users on a WPA3-Personal network can never snoop on another user’s WPA3-Personal traffic, even when they have the Wi-Fi password and are successfully connected. Furthermore, if an outsider determines the password, it is not possible to passively observe an exchange and determine the session keys, providing forward secrecy of network traffic.
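As a heavily simplified illustration of that distinction, the Python sketch below stands in for the real PSK derivation with a plain hash (the actual scheme uses PBKDF2 and a four-way handshake). The point is only this: once an attacker has captured password-derived material from a WPA/WPA2 network, every dictionary guess can be tested offline, whereas SAE leaves nothing equivalent to test against, so each guess requires a live, rate-limitable exchange with the access point.

```python
import hashlib
from typing import Optional

def wpa2_psk(passphrase: str, ssid: str) -> str:
    """Stand-in for the WPA2 key derivation; the only point is that the
    captured material is a fixed function of the password and SSID."""
    return hashlib.sha256((ssid + passphrase).encode()).hexdigest()

def offline_dictionary_attack(captured: str, ssid: str, wordlist) -> Optional[str]:
    """With WPA/WPA2, one captured handshake lets an attacker test every
    candidate password offline, as fast as their hardware allows."""
    for guess in wordlist:
        if wpa2_psk(guess, ssid) == captured:
            return guess
    return None

wordlist = ["letmein", "password1", "sunshine", "hunter2"]
captured = wpa2_psk("sunshine", ssid="CoffeeShopWiFi")
print(offline_dictionary_attack(captured, "CoffeeShopWiFi", wordlist))  # -> sunshine

# Under SAE there is no password-derived value to test guesses against offline:
# each guess requires a fresh, active exchange that can be rate-limited and detected.
```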


HHS Tries Again: New Cyber Coordination Center Launched

H-ISAC President Denise Anderson tells ISMG her organization will continue to closely collaborate with HHS on information sharing. "The H-ISAC has been actively engaged with the HCCIC and now the HC3," she says. "We will continue to work with HHS as well as our other strategic partners in government and industry to support the sector with situational awareness, threat mitigation and incident response." The ability of HHS to respond to cyber incidents is critically important, and in the past year has been limited, says Jim Routh, chief security officer at health insurer Aetna and an H-ISAC board member. "Coordination across the sector in collaboration with DHS is essential and represents an opportunity for continuous improvement. This [HHS] announcement represents a step forward, but the healthcare sector needs more maturity in capability. The H-ISAC has always and will always support the HHS commitment toward cyber incident response." HITRUST, best known for its Common Security Framework, also has been working for several years with the federal government


The first is learning to think like a data scientist. We don’t speak about this often enough, but it is really hard to acquire good data, analyze it properly, follow the clues those analyses offer, explore the implications, and present results in a fair, compelling way. This is the essence of data science. You can’t read about this in a book — you simply have to experience the work to appreciate it. To give your team some hands-on practice, charge them with selecting a topic of their own interest (such as “whether meetings start on time”) and then have them complete the exercise described in this article. The first step will lead to a picture similar to the one below, and the rest of the exercise involves exploring the implications of that picture. ... The third skill is conducting a root cause analysis (RCA) and its prerequisite: understanding the distinction between correlation and causation. Studying the numbers can point to where most errors occur or demonstrate that two (or more) variables go up and down in tandem, but it cannot fully describe why this is.
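As a minimal sketch of why correlation alone can't finish a root cause analysis, the Python snippet below generates two metrics that rise and fall together only because a hidden common driver moves both. The variable names and coefficients are invented for illustration.

```python
import random
import statistics

random.seed(0)

# A hidden common driver (say, daily order volume) moves both observed metrics.
order_volume = [random.gauss(1000, 200) for _ in range(365)]
data_entry_errors = [0.02 * v + random.gauss(0, 1) for v in order_volume]
shipping_delays = [0.01 * v + random.gauss(0, 1) for v in order_volume]

r = statistics.correlation(data_entry_errors, shipping_delays)
print(f"correlation between errors and delays: {r:.2f}")   # strongly positive

# The two metrics move in tandem, yet neither causes the other; a root cause
# analysis has to go looking for the shared driver (here, order volume).
```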


How to set yourself apart in the future of work

There’s no way we can think as quickly, or efficiently, as a computer if the primal part of our brain–the amygdala–directs us toward protectionism. Fear has been shown to impair function in the hippocampus, a vital part of the brain that helps regulate mood and memory. It is also key to creative function. Any negative impact on this part of the brain is bad news for your career and can cause you to limit the very qualities that will be key to career resilience in the future. The experience of using voice recognition software can feel more sci-fi than AI, but it’s helpful to remember that AI is still pretty narrow. A robot that can perform surgery can’t make you a coffee. Even the most sophisticated AI cannot answer the question “is this a cat?” whereas a human toddler would know in an instant. If you’re still feeling skeptical, google the dog versus muffin test. You’d have no problem spotting the difference.



Quote for the day:


"If you think you're leading and no one is following you, then you're only taking a walk." -- Afghan Proverb