
Daily Tech Digest - August 01, 2022

4 fundamental practices in IoT software development

One of the greatest concerns in IoT is security, and how software engineers address it will only grow in importance. As devices interact with each other, businesses need to be able to securely handle the data deluge. There have already been many data breaches in which smart devices were the target, notably Osram, whose IoT lightbulbs were found to contain vulnerabilities that could give an attacker access to a user’s network and the devices connected to it. Security needs to be tackled at the start of the design phase, with requirement tradeoffs made as needed, rather than added as a mere ‘bolt-on’. This is highly correlated with software robustness. It may take a little more time to design and build robust software upfront, but secure software is more reliable and easier to maintain in the long run. A study by CAST suggests that one-third of security problems are also robustness problems, a finding that is borne out in our field experience with customers. Despite software developers’ best intentions, management is always looking for shortcuts. In the IoT ecosystem, first to market is a huge competitive driver, so security, quality and dependability may be sacrificed for speed to release.


Accountability in algorithmic injustice

Often, journalists fixate on finding broken or abusive systems, but miss out on what happens next. Yet, in the majority of cases, little to no justice is found for the victims. At most, the faulty systems are unceremoniously taken out of circulation. So, why is it so hard to get justice and accountability when algorithms go wrong? The answer goes deep into the way society interacts with technology and exposes fundamental flaws in the way our entire legal system operates. “I suppose the preliminary question is: do you even know that you’ve been shafted?” says Karen Yeung, a professor and an expert in law and technology policy at the University of Birmingham. “There’s just a basic problem of total opacity that’s really difficult to contend with.” The ADCU, for example, had to take Uber and Ola to court in the Netherlands to try to gain more insight into how the companies’ algorithms make automated decisions on everything from how much pay and deductions drivers receive, to whether or not they are fired. Even then, the court largely refused their request for information. Further, even if the details of systems are made public, that’s no guarantee people will be able to fully understand them either – and that includes those using the systems.


Data Mesh: To Mesh or not to Mesh?

Data Mesh allows teams to curate or generate data and create usable data products for other teams. It also ensures that platform teams can put their efforts into data engineering while data professionals handle domain-specific data issues. While business data professionals are responsible for the quality and reliability of the data their teams produce, they can call on platform teams when technical glitches arise. Apart from that, Data Mesh design is inclined towards business users and requires relatively little intervention from platform teams. This is unlike centralized data teams that are responsible for everything, from data frameworks and access to dealing with data-related requests. To conclude, Data Mesh, or decentralized architecture, encourages each party to excel in their area of expertise. The platform teams focus on technology, engineering, and data pipelines, while the data professionals are accountable for ensuring data quality. This holistic approach ensures end-users can perform their tasks by leveraging data insights without investing time in acquiring the results of a custom request.


Chase CIO details what entry-level job-seekers need to succeed in Fintech

Never stop learning. The skills you mastered a few years ago may no longer be relevant today, which is why it’s important to be open to constantly learning. Whether you are starting your career or have years of experience, take it upon yourself to learn new skills and technologies. ... The skills required to be a technologist have evolved, but so have the ways we work with colleagues across lines of business. One change we’ve really embraced as an organization is embarking on an agile and product transformation. We’ve taken advantage of the opportunity that came with the changing behaviors of consumers over the past few years to really embrace agile at a different scale. This matters tremendously, because when we deploy code or build an entirely new product, it helps millions of consumers reach their financial goals. The pace of change has accelerated, but the focus on making it easier for our customers to bank with Chase has not. Today, we’ve reorganized ourselves away from project-based teams into product-based teams. Each product now has a dedicated tech, product, design, and data & analytics leader to help speed up decision making and improve connectivity and collaboration.


Attacks using Office macros decline in wake of Microsoft action

"It's a hugely important step Microsoft is taking to start blocking these macros by default, especially due to how invisible macros are to the majority of users," adds Nathan Wenzler, chief security strategist at Tenable, a vulnerability scanning company. "But that doesn't mean the threat is eradicated or we shouldn't continue to remind users to be vigilant about opening files from untrusted sources." Other companies are seeing threat actors switching tactics because of Microsoft's move, too. "The adversaries are aware of it," observes Tim Bandos, executive vice president of cybersecurity at Xcitium, a maker of an endpoint security suite. "They're testing out new ways of working around it because they're clearly not as successful now that Microsoft has made this change." Users of one notorious malicious program, known as Emotet, have already begun shifting tactics, he notes. "We've seen them shift recently from leveraging macros to using URLs to OneDrive or Google Drive," he says.


Solana blockchain and the Proof of History

The consensus mechanism is a fundamental characteristic and differentiator among blockchains. Solana's consensus mechanism has several novel features, in particular the Proof of History algorithm, which enables faster processing time and lower transaction costs. How PoH works is not hard to grasp conceptually. It's a bit harder to understand how it improves processing time and transaction costs. The Solana whitepaper is a deep dive into the implementation details, but it can be easy to miss the forest for the trees. Conceptually, Proof of History provides a way to cryptographically prove the passage of time and where events fall in that timeline. This mechanism is used in tandem with another, more conventional algorithm like Proof of Work (PoW) or Proof of Stake (PoS); in Solana, Proof of History makes Proof of Stake more efficient and resilient. You can think of PoH as a cryptographic clock. It timestamps transactions with a hash that proves where in time each transaction occurred. This means the entire network can forgo verifying the temporal claims of individual nodes and defer reconciling the current state of the chain.
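The "cryptographic clock" idea can be sketched in a few lines. This is an illustrative toy, not Solana's actual implementation: a sequential SHA-256 hash chain where each tick depends on the previous state, and events are mixed into the chain to prove they occurred no later than that point in the sequence.

```python
import hashlib

def tick(state: bytes) -> bytes:
    # Each tick hashes the previous state; chain length stands in for elapsed time,
    # because the hashes can only be produced sequentially.
    return hashlib.sha256(state).digest()

def record_event(state: bytes, event: bytes) -> bytes:
    # Mixing an event into the chain proves it existed at or before this tick.
    return hashlib.sha256(state + event).digest()

state = b"genesis"
for _ in range(1000):            # 1000 ticks of the clock
    state = tick(state)
state = record_event(state, b"tx: alice -> bob")
for _ in range(1000):            # more ticks after the event
    state = tick(state)
# Any node can replay the chain deterministically and verify where the event fell.
```

Because replaying the chain is deterministic, validators can check a transaction's position in "time" without trusting any node's wall-clock claims.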


Why and How our AI needs to understand causality

Introducing causality to machine learning can make the model outputs more robust, and prevent the types of errors described earlier. But what does this look like? How can we encode causality into a model? The exact approach depends on the question we are trying to answer and the type of data we have available. ... They trained the model to ask “if I treat this disease, which symptoms would go away?” and “if I don’t treat this disease, which symptoms would remain?”. They encoded these questions as two mathematical formulae. Using these questions brings in causality: if treating a disease causes symptoms to go away, then it’s a causal relationship. They compared their causal model with a model that only looked at correlations and found that it performed better — particularly for rarer diseases and more complex cases. Despite the great potential of machine learning, and the associated excitement, we must not forget our core statistical principles. We must go beyond correlation (association) to look at causation, and build this into our models. 
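The "which symptoms would go away if I treat this disease?" question can be made concrete with a toy structural causal model. The diseases, symptoms, and mechanism below are hypothetical assumptions for illustration; treating a disease is modeled as a do-intervention that switches it off, and the symptoms that disappear are the ones causally attributable to it.

```python
# Toy structural causal model: flu causes fever; flu OR allergy causes cough.
def symptoms(flu: bool, allergy: bool) -> dict:
    return {
        "fever": flu,             # caused only by flu
        "cough": flu or allergy,  # caused by flu OR by allergy
    }

def counterfactual_treat(flu: bool, allergy: bool) -> set:
    before = symptoms(flu, allergy)
    after = symptoms(False, allergy)  # do(flu = False): intervene, don't just observe
    # Symptoms present before treatment but gone after are causally due to flu.
    return {s for s in before if before[s] and not after[s]}

print(counterfactual_treat(flu=True, allergy=False))  # {'fever', 'cough'}
print(counterfactual_treat(flu=True, allergy=True))   # {'fever'}: cough persists via allergy
```

A purely correlational model would associate cough with flu in both cases; the intervention reveals that, for the allergic patient, treating the flu would not remove the cough.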


Cyberattack prevention is cost-effective, so why aren’t businesses investing to protect?

To measure the success of an investment, you first need to quantify the cost of what you’re trying to protect. In a simplified model, the first step is to measure the given benefits of protection, which starts with an asset valuation: how valuable is this data to me? Those in charge of the budget then need to assess the risk of that data not being protected. If I don’t take the necessary measures to mitigate the risk by investing in preventative cyber-security tools, how costly could a breach be when it occurs? It is more cost-effective to validate an organisation’s controls than to spend money on more tools. By adopting specialised frameworks to counteract cyber threats, for instance running a threat-informed defence and utilising automated platforms such as Breach-and-Attack Simulation (BAS), CISOs can continuously test and validate their systems. Similar to a fire drill, BAS can locate which controls are failing, allowing organisations to remediate the gaps in their defence and making them cyber-ready before an attack occurs.
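The article doesn't prescribe a formula, but one common way to put numbers on "asset value times risk" is annualized loss expectancy (ALE): single loss expectancy (asset value × exposure factor) multiplied by the expected annual rate of occurrence. The figures below are illustrative assumptions, not data from the article.

```python
def annualized_loss_expectancy(asset_value: float,
                               exposure_factor: float,
                               annual_rate: float) -> float:
    # Single loss expectancy: fraction of the asset's value lost per incident.
    sle = asset_value * exposure_factor
    # Scale by how often such an incident is expected per year.
    return sle * annual_rate

# Hypothetical numbers: a $2M data asset, 40% of value lost per breach.
ale_before = annualized_loss_expectancy(2_000_000, 0.4, 0.5)   # breach every 2 years
ale_after = annualized_loss_expectancy(2_000_000, 0.4, 0.1)    # controls cut rate to 1 in 10
control_cost = 150_000
net_benefit = ale_before - ale_after - control_cost

print(f"ALE before controls: ${ale_before:,.0f}")   # $400,000
print(f"ALE after controls:  ${ale_after:,.0f}")    # $80,000
print(f"Net annual benefit:  ${net_benefit:,.0f}")  # $170,000
```

Framing the budget conversation this way turns "should we buy another tool?" into "does the reduction in expected loss exceed the cost of the control?", which is exactly the validation mindset the article advocates.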


Cyber Resiliency: How CIOs Can Prepare for a Cloud Outage

Beyond security issues, cloud outages can open the door to cascading disruptions affecting both routine business and mission-critical applications. “This can lead to [issues] ranging from revenue loss to more serious impacts -- such as putting lives at risk in the case of critical health care applications,” explains Ravikanth Ganta, a senior director at business consulting firm Capgemini Americas. A cloud outage’s seriousness hinges on several factors, including organization preparedness, the zones and regions affected, and the services impacted. “In many cases, businesses that build and run their applications in the cloud can endure a cloud outage with little to no impact if they architect their applications to take advantage of the automated failover capabilities readily available in the cloud,” Potter notes. Modular applications designed to leverage loosely coupled services will typically experience only a minor drop in availability or performance during a vendor outage and, in many cases, may not be affected at all. “Customers that ... haven’t architected their applications to gracefully failover or redirect traffic to unimpacted zones or regions, will face greater availability challenges when a cloud provider experiences an outage,” Potter says.
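The "gracefully fail over to unimpacted regions" pattern can be sketched as an ordered preference list of regions with fall-through on connection failure. The region names and the `call_service` stub below are hypothetical; real failover would typically live in DNS, a load balancer, or the cloud provider's own routing rather than application code.

```python
REGIONS = ["us-east-1", "us-west-2", "eu-west-1"]  # hypothetical preference order

def call_service(region: str, healthy: set) -> str:
    # Stand-in for a real network call; raises when the region is down.
    if region not in healthy:
        raise ConnectionError(f"{region} is unavailable")
    return f"response from {region}"

def with_failover(healthy: set) -> str:
    # Try each region in preference order; fall through to the next on outage.
    last_error = None
    for region in REGIONS:
        try:
            return call_service(region, healthy)
        except ConnectionError as err:
            last_error = err
    raise RuntimeError("all regions down") from last_error

# Simulated outage in the primary region: traffic lands on the first healthy one.
print(with_failover(healthy={"us-west-2", "eu-west-1"}))  # response from us-west-2
```

Loosely coupled services make this viable because each component can retry independently instead of the whole application pinning to one region.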


Why DesignOps Matters: How to Improve Your Design Processes

“A foundational aspect of DesignOps is the adoption of agile work breakdown structures (WBSs) to organize UX work from alignment with broad strategic objectives to screen-level details in a single EAP tool. While this feels foreign to most UX practitioners at first, agile WBS maps quite well to UX work. The business and operational benefits of this approach are profound, including more accurate plans, estimates, tracking and reporting.” With a single working environment for managers, designers, developers, and even stakeholders as part of the DesignOps strategy, everyone can easily align their work and tasks, test and comment on prototypes in real time, eliminate design handoffs, reduce costly iterations, keep track of progress and identify bottlenecks. ... There’s no such thing as a designer who can handle every process and task; one who tries ends up doing everything but the actual design. Digital product design is a multi-layered job that requires experienced specialists in particular fields. Just as there is a need for a separation between UX and UI design, with two distinct experts handling each, there is a need for a dedicated DesignOps person.



Quote for the day:

"The task of the leader is to get his people from where they are to where they have not been." -- Henry A. Kissinger

Daily Tech Digest - February 27, 2020

Unpatched Security Flaws Open Connected Vacuum to Takeover

Researchers have discovered several high-severity vulnerabilities in a connected vacuum cleaner. The security holes could give remote attackers the capability to launch an array of attacks — from a denial of service (DoS) attack that renders the vacuum unusable, to viewing private home footage through the vacuum’s embedded camera. The Ironpie M6, which is available for $230 on Amazon, comes equipped with a corresponding mobile app and a security camera. The vacuum cleaner is built by artificial intelligence home robot company Trifo, which first launched the IronPie at CES 2019. Researchers said Wednesday at RSA Conference 2020 in San Francisco that they had uncovered six flaws stemming from the vacuum’s mobile app and its connectivity protocol. “The most severe vulnerability allows attackers to access any video stream from any Trifo device across the world,” Erez Yalon, director of security research with Checkmarx, told Threatpost. “Through this vulnerability, every single user – whether in a home or office setting as shown in our PoC video – is at risk of a hacker obtaining a live video feed. Needless to say, this represents a total loss of privacy.”


The Amazing Ways Goodyear Uses Artificial Intelligence And IoT For Digital Transformation

Whether it's an autonomous, electric, or traditional vehicle, every car needs a solid foundation: the right tire for the specific demands of the vehicle. Goodyear uses internet of things technology in its Eagle 360 Urban tire. The tire is 3D printed with super-elastic polymer and embedded with sensors. These sensors send road and tire data back to the artificial intelligence-enhanced control panel, which can then change the tread design on the fly to respond to current road conditions and share info about conditions with the broader network. If the tire tread is damaged, the tire moves the material and begins self-repair. Goodyear’s intelligent tires are in use in a new pilot program with Redspher, a European transportation and logistics company operating in 19 countries. The fleet benefits from the tire's ability to monitor and track tire pressure, vehicle data, and road conditions. This data is then analyzed by Goodyear’s algorithms to gain insights about maintenance needs and ways to improve the safety and performance of the fleet.


Google Teaches AI To Play The Game Of Chip Design


One of the promising frontiers of research right now in chip design is using machine learning techniques to actually help with some of the tasks in the design process. We will be discussing this at our upcoming The Next AI Platform event in San Jose on March 10 with Elias Fallon, engineering director at Cadence Design Systems. The use of machine learning in chip design was also one of the topics that Jeff Dean, a senior fellow in the Research Group at Google who has helped invent many of the hyperscaler’s key technologies, talked about in his keynote address at this week’s 2020 International Solid State Circuits Conference in San Francisco. Google, as it turns out, has more than a passing interest in compute engines, being one of the large consumers of CPUs and GPUs in the world and also the designer of TPUs spanning from the edge to the datacenter for doing both machine learning inference and training. So this is not just an academic exercise for the search engine giant and public cloud contender – particularly if it intends to keep advancing its TPU roadmap and if it decides, like rival Amazon Web Services, to start designing its own custom Arm server chips or decides to do custom Arm chips for its phones and other consumer devices.


JFrog touts DevSecOps edge in CI/CD tools


Most CI/CD tools integrate with package managers for similar purposes. But JFrog could differentiate its Pipelines product based on its experience developing the Artifactory artifact repository manager, as well as its messaging. "Everyone is really doing the same thing -- transforming code into software packages and then shipping those packages to production," said Tom Petrocelli, an analyst at Amalgam Insights. "But there are security advantages as a side effect of the way [JFrog thinks]." This relates to the fact that enterprise DevOps shops in the Linux world increasingly use package managers to centralize corporate governance, explained Charles Betz, an analyst at Forrester Research. "There's a heck of a lot of digital management that revolves around artifacts when you don't own the source code, when that code is written by open source communities and vendors," Betz said.


Hidden cost of cloud puts brakes on migration projects


More than half (58%) of the IT decision-makers surveyed believe the cloud over-promised and under-delivered, while 43% admit that the cloud is more costly than they thought. Only 27% of IT decision-makers surveyed claim they have been able to reduce labour and logistical costs by moving to the cloud. Mark Cook, divisional executive officer at Capita, said: “Every migration journey is unique in both its destination and starting point. While some organisations are either ‘born’ digital or can gather the resources to transform in a relatively short space of time, the majority will have a much slower, more complex path. “Many larger organisations will have heritage technology and processes that can’t simply be lifted and converted, but will need some degree of ‘hybrid by design’,” he added. When asked what unforeseen factors had delayed cloud migration projects, 39% cited cost as the main factor, followed by workload and application rearchitecting issues (38%) and security concerns (37%).


IoT Can Put Your Data at Risk, Here’s How

The data processed by IoT devices is potentially extremely sensitive. With office and home security systems increasingly mediated by IoT (doorbells and surveillance cameras being just a couple of examples), criminal attacks can pose a serious problem. The huge volume of data habitually collected by IoT devices was exposed this year when a database owned by the Chinese firm Orvibo, who offer a smart home appliance platform, was found to have no password protection despite containing logs relating to 2 million worldwide users, including individuals and hotel chains. The data included insufficiently-protected user passwords, reset codes, precise locations, and even a recorded conversation. Botnets are another way for cybercriminals to wreak havoc using IoT devices. Botnets consist of, as their name suggests, networks of bots running on Internet-connected devices. They are primarily known for their role in DDoS (Distributed Denial of Service) attacks, in which a stream of network requests is sent to a network that a malicious entity wishes to bring down.


DesignOps — scaling design to create a productive internal environment for IBMers

DesignOps is a collective term for creating a productive workforce by addressing challenges such as growing and evolving design teams, finding and hiring people with the right skills, creating efficient workflows and improving the quality and impact of design outputs. It’s a method of optimising people, processes and workflow, and at IBM, the practice has been deployed to increase efficiency, productivity and general well-being across the whole organisation, including the thousands-strong IT team. Satisfying this many individuals and teams is no easy feat, which is why IBM has a specific department dedicated to creating great experiences for IBMers. Kristin Wisnewski — who is on the advisory board for Information Age’s Women in IT Summit in New York on March 25th 2020 at the Grand Hyatt Hotel — leads the CIO Design team at IBM as vice president; the team's purpose is to create a productive internal environment at IBM. “We’re here to create, design and improve the experience of employees in their daily jobs. Our team is made up of 140 people, and so it is a big mission to help the hundreds of thousands of employees here at IBM,” she said.


Cloud misconfigurations are a new risk for the enterprise

Cloud misconfigurations are becoming another risk for corporations. At RSA 2020, Steve Grobman, senior vice president and chief technology officer at McAfee, explained how easy it is to take advantage of cloud misconfigurations, an expensive security problem for corporations. He compared cyber security to infectious disease control: an imperfect science. ... In addition to making sure cloud configurations are secure, security teams have to address tomorrow's security risks today, Grobman said. Advances in quantum computing will be a double-edged sword, with the downside being the threat to existing encryption systems. "Nation-states will use quantum computing to break our public key encryption systems," he said. "Our adversaries are getting the data today and counting on quantum to unlock it tomorrow." Grobman said that companies need to think about how long data will need to be protected. "Even in 2020, there are documents in the National Archives in relation to the Kennedy assassination that still have redacted information due to national security concerns of today," he said.


Data Science Is A Team Sport: Oracle’s New Cloud Platform Provides The Playing Field

Unlike other data science products that focus on helping individual data scientists, Oracle Cloud Infrastructure Data Science helps improve the effectiveness of data science teams with capabilities like shared projects, model catalogs, team security policies, and reproducibility and auditability features. “Data scientists are experimenters. They want to try stuff and see how it works,” says Pavlik. “They grab sample datasets, they pull in all kinds of open source tools, and they're doing great stuff. What we want to do is let them keep doing that, but improve their productivity by automating their entire workflow and adding strong team support for collaboration to help ensure that data science projects deliver real value to businesses.” The starting point for data science to deliver value is doing more with machine learning, and being more efficient with the data and algorithms involved.  “Effective machine learning models are the foundation of successful data science projects,” Pavlik says, but the volume and variety of data facing data science teams “can stall these initiatives before they ever get off the ground.”


Getting closer to no-battery devices

The technique being exploited takes advantage of backscattering. That's a way of parasitically using radio signals inherent in everyday environments. In this case, the chip piggybacks on existing Wi-Fi transmissions to send its data. This method of sending data is power-light, because the carrier needed for the radio transmission is already created—it doesn’t need new energy for the message to be sent. Interestingly, two principal scientists involved in this backscattering project, which was announced by UC San Diego's Jacobs School of Engineering, have also been heavily involved in the development of "wake-up" radios. Wake-up is when a Wi-Fi or other radio comes alive to communicate only when it has something to transmit or receive. The technology uses two radios. One radio is for the wake-up signaling; that radio's only purpose is to listen for a signature. The second is a more heavy-duty radio for the data send. Power is saved because the main radio isn't on all the time. Dinesh Bharadia, now a professor of electrical and computer engineering at UC San Diego, was at Stanford University working on a wake-up radio that I’ve written about.
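The power argument for a wake-up radio comes down to duty cycling: the heavy-duty radio draws power only when there is something to send or receive. A back-of-the-envelope sketch makes this concrete; the figures below are illustrative assumptions, not measurements from the UC San Diego work.

```python
# Illustrative power figures (assumptions, not from the article):
MAIN_RADIO_MW = 100.0    # main Wi-Fi radio while active, in milliwatts
WAKEUP_RADIO_UW = 50.0   # always-listening wake-up receiver, in microwatts
DUTY_CYCLE = 0.01        # fraction of time the main radio actually needs to be on

# Scheme 1: main radio always on, listening for traffic.
always_on_mw = MAIN_RADIO_MW

# Scheme 2: tiny wake-up receiver always on; main radio only during its duty cycle.
with_wakeup_mw = WAKEUP_RADIO_UW / 1000.0 + MAIN_RADIO_MW * DUTY_CYCLE

print(f"always-on main radio: {always_on_mw:.2f} mW")
print(f"wake-up radio scheme: {with_wakeup_mw:.2f} mW")
print(f"savings factor: {always_on_mw / with_wakeup_mw:.0f}x")
```

Under these assumed numbers the average draw falls from 100 mW to about 1 mW, which is why pairing a microwatt-class listener with a duty-cycled main radio is a step toward battery-free, backscatter-powered devices.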



Quote for the day:


"The greatest good you can do for another is not just share your riches, but reveal to them their own." -- Benjamin Disraeli