
Daily Tech Digest - June 15, 2023

The five new foundational qualities of effective leadership

Today’s leaders have to be able to establish a compelling destination and then navigate through the fog with a compass. “You have to be ready to make a decision today, realizing that you may get new data tomorrow that means you have to reverse the decision you just made,” a veteran CEO of a Fortune 25 company told us. “You have to have the courage to follow that new information. The job’s always been ambiguous. But the environment has never been this fluid.” Boards and CEOs expect succession candidates to be adept at providing direction and key performance indicators that will signal whether course adjustments are necessary. “We’re living in an age with many more discontinuities than we had a generation or two ago,” said Mark Thompson, former CEO of the New York Times Company and now board chairman of Ancestry. “It’s not about trying to find the perfect strategies. It’s more about helping organizations to be more open, flexible, and adaptable to change.” This shift demands a more dynamic, individual leadership approach, as well as a reimagining of basic organizational processes. 


5 best practices to ensure the security of third-party APIs

Maintaining an API inventory that automatically updates as code changes is an instrumental first step for an API security program, says Jacob Garrison, a security researcher at Bionic. The inventory should distinguish between first-party and third-party APIs, and it encourages continuous monitoring for shadow IT: APIs brought on board without notifying the security team. “To ensure your inventory is robust and actionable, you should track which APIs transmit business-critical information, such as personally identifiable information and payment card data,” he says. An API inventory is complementary to third-party risk management, according to Garrison. When developers use third-party APIs, it is worth running risk assessments of the vendors themselves. ... Frank Catucci, chief technology officer and head of security research for Invicti Security, agrees that including an inventory of third-party APIs is critical. "You need to have third-party APIs be part of your overall API inventory and you have to look at them as assets that you own, that you are responsible for," he says.
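To make the idea concrete, here is a minimal sketch of what such an inventory record might look like. The field names and helper functions are illustrative assumptions, not part of Bionic's product; the point is that each record captures origin (first- vs. third-party), sensitivity of the data it handles, and whether the security team knows about it.

```python
from dataclasses import dataclass

@dataclass
class ApiRecord:
    """One entry in an API inventory (illustrative fields only)."""
    name: str
    origin: str                     # "first-party" or "third-party"
    handles_pii: bool = False       # personally identifiable information
    handles_card_data: bool = False # payment card data
    known_to_security: bool = True  # False => shadow IT candidate

def business_critical(inventory):
    """APIs that transmit PII or card data warrant the closest tracking."""
    return [a for a in inventory if a.handles_pii or a.handles_card_data]

def shadow_apis(inventory):
    """APIs onboarded without notifying the security team."""
    return [a for a in inventory if not a.known_to_security]

inventory = [
    ApiRecord("payments", "third-party", handles_card_data=True),
    ApiRecord("search", "first-party"),
    ApiRecord("geo-lookup", "third-party", known_to_security=False),
]
print([a.name for a in business_critical(inventory)])  # ['payments']
print([a.name for a in shadow_apis(inventory)])        # ['geo-lookup']
```

In practice the inventory would be populated automatically from code scans rather than by hand, which is what keeps it current as code changes.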


Generative AI’s change management challenge

“The hardest part of AI acceptance is creating a space where employees can still add value and not feel they are competing with AI to create value,” Bellefonds added. “A lot of the work we do when it comes to change management and coaching is to help employees work with AI and at the same time, change the way they add value, so that a part of their job is taken by AI but their part refocuses on higher value-adding tasks.” Exactly how those processes are rewired and the working methods changed will vary from one enterprise to another, he said. There are other ways in which employees’ concerns about AI are unevenly distributed, too. Leaders are more likely to be optimistic, and frontline workers concerned, BCG found. And while 68% of leaders believe their companies have implemented adequate measures to ensure responsible use of AI, only 29% of their frontline employees feel that way. Despite BCG’s findings of optimism in the workforce, there’s a darker side. Over one-third of respondents think their job is likely to be eliminated by AI, and almost four-fifths want governments to step in and deliver AI-specific regulations to ensure it’s used responsibly.


As Machines Take Over — What Will It Mean to Be Human?

Biocomputing is a field of study that uses biologically based molecules, such as DNA or proteins, to perform computational tasks. Imitating the genius of nature can completely shift the paradigm of understanding when it comes to the computation and storage of data. The field has shown promise in cryptography and drug discovery. However, biocomputers are still limited compared to conventional computers, since they aren't good at cooling themselves or at doing more than two things simultaneously. Advancements in AI, however, have been booming. Since 2012, interest in AI, especially in machine learning, has been renewed, leading to a dramatic increase in funding and investment. Machine learning models ingest large amounts of data and infer patterns. More recently, generative AI has become extremely popular with the release of large AI models such as Midjourney, ChatGPT and Stable Diffusion. Generative AI is a class of AI algorithms that generate new data or content closely resembling existing, human-made data.


What is SDN and where is it going?

There are three main components to a software-defined network: controller, applications, and devices. The controller has taken over the role of the control plane on each individual network device. It populates the tables that the data planes on those devices use to do their work. There are various communication protocols that can be used for this purpose, including OpenFlow, though some vendors use proprietary protocols. Communication between the controller and devices is referred to as southbound APIs. The software controller is, in turn, managed by applications, which can fulfill any number of network administration roles, including load balancers, software-defined security services, orchestration applications, or analytics applications that keep tabs on what's going on in the network. These applications communicate with the controller (northbound APIs) through well-documented REST APIs that allow applications from different vendors to communicate with ease. 
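The controller/device split described above can be sketched in a few lines. This is a toy model, not any real controller's API: the class names, the `install_flow` call standing in for a southbound protocol such as OpenFlow, and the `block_host` method standing in for a northbound application request are all illustrative assumptions.

```python
# Sketch of the SDN split: a central controller computes forwarding
# rules and pushes them into each device's data-plane table.
# Protocol details (OpenFlow messages, REST endpoints) are omitted.

class Device:
    def __init__(self, name):
        self.name = name
        self.flow_table = []  # data plane: list of (match, action) rules

    def install_flow(self, match, action):
        # Stand-in for a southbound API call from the controller.
        self.flow_table.append((match, action))

class Controller:
    def __init__(self):
        self.devices = {}

    def register(self, device):
        self.devices[device.name] = device

    def block_host(self, ip):
        # Stand-in for a northbound request, e.g. from a security
        # application: the controller translates one network-wide
        # intent into per-device flow entries.
        for dev in self.devices.values():
            dev.install_flow({"src_ip": ip}, "drop")

ctrl = Controller()
sw1, sw2 = Device("sw1"), Device("sw2")
ctrl.register(sw1)
ctrl.register(sw2)
ctrl.block_host("10.0.0.99")  # one intent becomes a rule on every switch
print(sw1.flow_table)  # [({'src_ip': '10.0.0.99'}, 'drop')]
```

The value of the architecture shows in the last call: an application expresses intent once, and the controller handles the per-device programming.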


Using Trauma-Informed Approaches in Agile Environments

Software is, by definition, very abstract. For this reason, we naturally tend to be in our heads and thoughts most of the time while at work. However, a more trauma-informed approach requires us to pay more attention to our physical state and not just to our brain and cognition. Our body and its sensations are giving us many signs, vital not just to our well-being but also to our productivity and ability to cognitively understand each other and adapt to changes. Paradoxically, in the end, paying more attention to our physical and emotional state gives us more cognitive resources to do our work. Noticing our bodily sensations in the moment, like breath or muscle tension in a particular area, can be a first step to getting out of a traumatic pattern. And a generally higher level of body awareness can help us fall less often into such patterns in the first place. Put simply, our body awareness anchors us in the here and now, making it easier for us to recognize past patterns as inadequate for the current situation.


How Pyramid Thinking Can Revolutionize Your Data Strategy

Before devising a corporate data strategy, the main things you need to know are the strategy and objectives of your organization as a whole. Data can be a truly transformative tool, but even the sharpest knife needs to be used accurately to get the best results -- which is why you need to know the end goal before you can understand how data can help you achieve it. This end goal forms the very peak of the pyramid and it is by looking downwards from it that you can understand the role that data can play. For organizations struggling to pinpoint that goal (as oftentimes happens when the business strategy isn’t well-defined and documented), it is worth considering key business problems and the consequent opportunities for improvement. ... Identifying business goals gives you the basis upon which to build your data strategy, and with that you can begin to be more specific about the change you are looking to make. An actionable and measurable formula helps you shape those changes with clarity, such as “we want to do x by measuring/tracking/analyzing y in order to do z.”


Network spending priorities for second-half 2023

Security is the area where most users expect to spend more, but at the same time an area where they believe their spending is most likely to be sub-optimal. Three-quarters of buyers think they already spend too much on security because they’ve layered things on without considering the whole picture. You hear terms like “holistic approach” or “rethinking” a lot in their comments, but at the same time, less than an eighth of the users expect to redo their security strategies in any way.  ... The reason for the seemingly mindless AI enthusiasm is a simple reversal of an old saying: “Where there’s hope, there’s life.” AI could (theoretically) reduce operator errors. It could (hopefully) improve network capacity planning. It could (presumably) help secure applications and data and spot malefactors. All these things are recurring problems that seem to defy solution, and AI offers a hope that a solution might be near at hand. What’s not to love, provisionally of course?


Biodiversity Means Business

Technology can play a key role in navigating biodiversity issues. Predictive analytics, machine learning, digital twins, blockchain and the Internet of Things can deliver insight, visibility and measurability into sourcing, supply chains and environmental impacts. However, Katic emphasizes that these tools must be used to drive real change. “They must support a paradigm shift to new, sustainable models of development, rather than entrenching business as usual. They must deliver enhanced transparency and accountability,” she says. Ultimately, companies must embed biodiversity deep into their business strategies and daily operations, Katic says. This includes the use of science-based methods that revolve around the UN’s Sustainable Development Goals and its Global Biodiversity Framework. It can also incorporate tools such as the S&P’s scoring system, part of its UN-linked GlobalSustainable1 initiative, which provides dependency scores, ecosystem footprint insights, and other biodiversity data that can guide decision-making. In addition, the SBTN framework can serve as a valuable resource. More than 200 organizations helped shape the initial set of methods, tools, and guidance.


5 roadblocks to Rust adoption in embedded systems

Rust is not a trivial language to learn. While it shares common ideas and concepts with many of the languages that came before it, including C, the learning curve is steeper. When a company looks to adopt a new language, it hires engineers who already know the technology or is forced to train its team. Teams interested in using Rust for embedded work will find themselves in a small, niche community, and within it, not many qualified embedded software engineers know Rust. That means paying a premium for the few developers who do, or investing in training the existing internal team. Training a team to use Rust isn’t a bad idea; every company and developer should be investing in themselves constantly, and our field changes so rapidly that you’ll quickly get left behind if you don’t. However, switching from one programming language to another must provide a return on investment for the company, especially when switching to an immature language like Rust.



Quote for the day:

"Don't focus so much on who is following you, that you forget to lead." -- E'yen A. Gardner

Daily Tech Digest - September 15, 2022

AI is playing a bigger role in cybersecurity, but the bad guys may benefit the most

“Security experts have noted that AI-generated phishing emails actually have higher rates of being opened — [for example] tricking possible victims to click on them and thus generate attacks — than manually crafted phishing emails,” Finch said. “AI can also be used to design malware that is constantly changing, to avoid detection by automated defensive tools.” Constantly changing malware signatures can help attackers evade static defenses such as firewalls and perimeter detection systems. Similarly, AI-powered malware can sit inside a system, collecting data and observing user behavior up until it’s ready to launch another phase of an attack or send out information it has collected with relatively low risk of detection. ... But Finch said, “Given the economics of cyberattacks — it’s generally easier and cheaper to launch attacks than to build effective defenses — I’d say AI will be on balance more hurtful than helpful. Caveat that, however, with the fact that really good AI is difficult to build and requires a lot of specially trained people to make it work well. Run of the mill criminals are not going to have access to the greatest AI minds in the world.”


Cybersecurity’s Too Important To Have A Dysfunctional Team

With such difficulty recruiting and maintaining staff, one option businesses should consider is training and reskilling programmes for existing staff to help bridge the gap. Current cybersecurity professionals can solidify what they already know and stay up to date on the latest learnings. Along with cybersecurity professionals, other technology professionals can be trained and recruited into these roles. Technology professionals are likely to have an affinity for the types of skills needed to succeed in cybersecurity. People from non-technical backgrounds may still be able to learn what is needed to perform in these roles, especially if businesses are willing to invest and cover the cost of the training. When there is a skills shortage, as is currently the case, and when vacancies outstrip the available talent, organisations need to be prepared to be imaginative in finding solutions. Alongside this, arming all teams, regardless of their skills and experience, with the right tools and support is essential. Working with knowledgeable and trusted partners can help outsource some of the work and offset any skills gaps as the external partner becomes an extension of the in-house team.


How Sweden goes about innovating

The innovation agency functions much like its counterparts in other countries, similarly to the Finnish Funding Agency for Technology and Innovation (Tekes) in neighbouring Finland, and to the part of the US National Science Foundation (NSF) that does seed funding on the other side of the Atlantic. The Swedish government gives Vinnova more than €300m each year to invest through grants to different kinds of actors, which might be small companies, research institutes, large competence centres, or consortia of companies working together on projects. Vinnova invests this money across 10 different themes, including sustainable industry and digital transformation. To report on the social and economic effects of its funding, the agency produces two impact studies annually. It has also published a document that describes its approach to tracking the impact of investments. “It’s never the case that we’re alone in the responsibility for success or failure,” says Göran Marklund, head of strategic intelligence and deputy director-general at Vinnova. 


Bringing AI to inventory optimization

Chasing today’s consumer patterns is a losing game, he believes. “It’s important to take a long-term view so that the next time the pattern shifts, you’ll be ready,” he said. The antuit.ai solution works by combining the historical data that supply chains have always used as well as new data becoming available, doing it at a scale perhaps not previously used, and then utilizing emerging technologies like AI and machine learning to process that data, make decisions and then learn from the execution of those decisions. “If I’m a retailer buying from CPG companies to service hundreds of stores, I have to make inventory decisions such as what port to land, what distribution centers to send it to, how to allocate it to the stores down to the shelf level and at what price to sell it,” Lakshmanan explained. “Part of my data equation is knowing what has historically sold, at what price, what promotions I ran, how much inventory did I have and whether there were any external factors, like was it raining. Now, if I know it’s going to rain next week, I have backward and forward-looking data that I can put through an algorithm to determine things like what is the likely demand at a store in Plano, Texas.”


Ambient computing has arrived: Here's what it looks like, in my house

Ambient computing is ignorable computing. It's there, but it's in the background, doing the job we've built it to do. One definition is a computer you use without knowing that you're using it. That's close to Eno's definition of his music -- ignorable and interesting. A lot of what we do with smart speakers is an introduction to ambient computing. It's not the complete ambient experience, as it relies on only your voice. But you're using a computer without sitting down at a keyboard, talking into thin air. Things get more interesting when that smart speaker becomes the interface to a smart home, where it can respond to queries and drive actions, turning on lights or changing the temperature in a room. But what if that speaker wasn't there at all, with control coming from a smart home that takes advantage of sensors to operate without any conscious interaction on your part? You walk into a room and the lights come on, because sensors detect your presence and because another set of sensors indicate that the current light level in the room is lower than your preferences.
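The lights-on-entry scenario reduces to a small rule combining two sensor readings. The function below is a simplified sketch of that rule; the sensor inputs and the preference threshold are illustrative assumptions, and a real smart home would wire them to actual occupancy and lux sensors.

```python
def should_turn_on_lights(presence: bool, lux: float, preferred_lux: float) -> bool:
    """Turn lights on only when someone is present AND the room is
    darker than they like -- no conscious interaction required."""
    return presence and lux < preferred_lux

# Occupant walks into a dim room: lights come on.
print(should_turn_on_lights(presence=True, lux=80, preferred_lux=300))   # True
# Empty room stays dark, and a bright room isn't lit further.
print(should_turn_on_lights(presence=False, lux=80, preferred_lux=300))  # False
print(should_turn_on_lights(presence=True, lux=500, preferred_lux=300))  # False
```

The point of the example is the absence of any user-facing call: the computing disappears into the conjunction of sensor states.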


Most enterprises looking to consolidate security vendors

Cost optimization should not be a driver, Gartner VP analyst John Watts said. Those looking to cut costs must reduce products, licenses and features, or ultimately renegotiate contracts. Consolidation also carries a drawback: 24% of organizations pursuing it saw their risk posture worsen rather than improve. But if cost savings do result from consolidation, CISOs can invest them in preventing attack surface expansion. “This trend captures a dramatic increase in attack surface emerging from changes in the use of digital systems, including new hybrid work, accelerating use of public cloud, more tightly interconnected supply chains, expansion of public-facing digital assets and greater use of operational technology (cyber physical systems—CPS). Security teams may need to expand licensing, add new features, or point solutions to address this trend,” Watts says to CSO. The time invested should also not be taken for granted: Gartner found that vendor consolidation can take a long time, with nearly two-thirds of organizations saying they have been consolidating for three years.


Software-defined perimeter: What it is and how it works

An SDP is specifically designed to prevent infrastructure elements from being viewed externally. Hardware, such as routers, servers, printers, and virtually anything else connected to the enterprise network that is also linked to the internet, is hidden from all unauthenticated and unauthorized users, regardless of whether the infrastructure is in the cloud or on-premises. "This keeps illegitimate users from accessing the network itself by authenticating first and allowing access second," says John Henley, principal consultant, cybersecurity, with technology research advisory firm ISG. "SDP not only authenticates the user, but also the device being used. When compared with traditional fixed-perimeter approaches such as firewalls, SDP provides greatly enhanced security." Because SDPs automatically limit authenticated users’ access to narrowly defined network segments, the rest of the network is protected should an authorized identity be compromised by an attacker. "This also offers protection against lateral attacks, since even if an attacker gained access, they would not be able to scan to locate other services," Skipper says.


Assessing the Security Risks of Emerging Tech in Healthcare

How some of these newer technologies are implemented into existing healthcare environments is also a critical security consideration, other experts say. "Smart hospitals have a blend of old technologies and newer innovations, improving the experience for both the patients and the clinicians," says Sri Bharadwaj, chief operating and information officer of Longevity Health Plan and chair-elect of the Association for Executives in Healthcare Information Security, a healthcare CISO professional organization. The key is to realize that legacy technology that is embedded in "newer shiny objects" still has the same security risks that have to be mitigated through strong administrative and technical controls to provide a robust complement to the newer technology, he says. ... "One thing to always keep in mind is that as security leaders our job is to perform due diligence and assess the risk of all services and technologies. We are also to find ways to help mitigate the risk, where possible, and raise the risk awareness to the organization," she says.


7 tell-tale signs of fake agile

When the focus shifts to granular facets of agile, like Scrum ceremonies, instead of actual content and context, agile’s true principles are lost, says Prashant Kelker, lead partner for digital sourcing and solutions, Americas, at global technology research and advisory firm ISG. Agility is about shipping as well as development. “Developing software using agile methodologies is not really working if one ships only twice a year,” Kelker warns, by way of example. “Agility works through frequent feedback from the market, be it internal or external.” Too often organizations focus on going through the motions without an eye toward achieving business results. Agility is not only about adhering to a methodology or implementing particular technologies; it’s about business goals and value realization. “Insist on key results every six months that are aligned to business goals,” Kelker says. When a team lacks a dedicated product owner and/or Scrum master, it will struggle to implement the consistent agile practices needed to continuously improve and meet predictable delivery goals. CIOs need to ensure they have dedicated team members, and that the product owner and Scrum master thoroughly understand their roles.


Top 10 Microservices Design Principles

Microservices-based applications should have high cohesion and low coupling. The idea behind this concept is that each service should do one thing and do it well, which means the services should be highly cohesive. These services should also not depend on each other, which means they should have low coupling. The cohesion of a module refers to how closely related its functions are. A high level of cohesion implies that the functions within a module are inextricably related and can be understood as a whole; low cohesion suggests that they are not closely related and cannot be understood as a set. The higher the cohesion, the better. Coupling, by contrast, measures how much knowledge one module has of another. A high level of coupling indicates that modules know a great deal about each other, with little encapsulation between them; a low level of coupling indicates that modules are well encapsulated from one another. When the components in an application are loosely coupled, the application is also easier to test.
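The two properties can be illustrated with a toy pair of services. The class and method names below are invented for the example: each class is cohesive (everything in it serves one purpose), and coupling is low because the notifier depends only on plain values, not on the invoicing service's internals.

```python
# High cohesion: everything in this class is about invoicing, nothing else.
class InvoiceService:
    def __init__(self, tax_rate: float):
        self.tax_rate = tax_rate

    def total(self, line_items):
        """Sum (price, quantity) pairs and apply tax."""
        subtotal = sum(price * qty for price, qty in line_items)
        return round(subtotal * (1 + self.tax_rate), 2)

# Low coupling: Notifier knows nothing about InvoiceService's internals;
# it depends only on the plain values handed to it, so either class can
# change (or be tested) independently of the other.
class Notifier:
    def send(self, customer: str, amount: float) -> str:
        return f"Billed {customer}: ${amount}"

invoices = InvoiceService(tax_rate=0.08)
amount = invoices.total([(10.0, 2), (5.0, 1)])  # 25.0 * 1.08 = 27.0
print(Notifier().send("acme", amount))  # Billed acme: $27.0
```

Because `Notifier` never reaches into `InvoiceService`, swapping the tax logic or testing the notifier in isolation requires no changes to the other module, which is exactly the testability benefit the paragraph describes.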



Quote for the day:

"To be a good leader, you don't have to know what you're doing; you just have to act like you know what you're doing." -- Jordan Carl Curtis

Daily Tech Digest - October 09, 2019

Blockchain: Why the revolution is still a decade away


According to Adrian Lee, who researched the report, this was caused by a "lack of industry consensus" on key features of the technology, such as product concept, application requirements or target market. In other words, blockchain has been a victim of its own hype. Its potential benefits raised huge expectations, but in reality it is not mature enough yet to be efficiently implemented at scale. Avivah Litan, research vice-president at Gartner, compares this to the adoption of the internet: users don't have to worry about understanding protocols such as DNS or TCP/IP. This is why browsing the web is scalable, and it is why it became so mainstream. But if an enterprise wants to implement blockchain, it's a whole different story. Individual companies have to worry about picking a platform, coming up with a smart contract language, or using a specific system interface and consensus algorithms. ... Litan doesn't see this happening before 2028, which is when she expects the technology to be fully scalable.



ISO 27001. PCI DSS. GDPR. When it comes to business and security standards, it's easy to get lost in the alphabet soup of acronyms. How can you discern which ones are right for your organization? Start by asking some high-level questions as to what you hope to accomplish by adopting them – and how adhering to standards can help your growth, says Khushbu Pratap, a senior principal analyst at Gartner who covers risk and compliance. "The most important questions to ask [are]: Are your customers asking for it, and do your stakeholders think a particular standard is important?" says Pratap. Assuming the answers are yes, there are additional factors to think through before moving ahead with a strategy for compliance. The seven practical tips outlined in this feature will help. Heavily regulated organizations typically have special teams that work on these standards, but even for them, use this list as a chance to take a step back and better target your standards compliance and certification teams.


For writing more secure code, culture remains another challenge. Stu Hirst, principal cloud security engineer at British online food order and delivery service Just Eat, speaking at last week's ScotSoft conference in Edinburgh, Scotland, advocated literally showing developers the risks that poorly written or carelessly reused code can create, for example, by showing them how it can be hacked. He says such discussions are essential for fostering a culture in which coders are coding securely, without trying to impose punitive measures. ... Earlier this year, the CISO of a European financial services firm told me that his organization's approach has been to maintain its own repository of code snippets that have been vetted and trusted, from which in-house developers can draw, thus saving time and contributing to more secure and stable software builds. The organization also regularly evaluates open source offerings, and it isn't afraid to tear up code built in-house when a better open source alternative becomes available. 


The Magic Of Smart Mirrors: AI, AR & The IoT

Coty’s version of the smart mirror is the CES 2019 Innovation Awards Honoree—Wella Professionals Smart Mirror. This mirror allows stylists to provide more personalized consultations. Like the apps discussed above, the Wella Professionals Smart Mirror can do a live AR hair color try-on and can provide a 360-degree view of the style so the client can see what it will look like from all angles. In addition, using facial recognition technology, it can retrieve past styles for each customer, allowing the stylist and client to really assess what worked and what didn't. ... It also connects to a mobile app so the stylist and customer can stay in contact between appointments. Memory Mirror, a digital mirror created by MemoMi, combines a full-length mirror with high-tech components including a 70-inch LCD, computer and HD camera that can record videos so you can save, share and review your try-on sessions. Neiman Marcus installed MemoMi’s mirrors in 34 locations. Another mirror altering the retail experience is the Oak Mirror by Oak Labs. It serves as a digital assistant in a dressing room, allowing customers to request other colors, styles, or accessories from a sales assistant.


Canada’s Blockchain Sector Wants Legal Clarity


The report – one of the first to take a comprehensive snapshot of Canada’s blockchain ecosystem – sheds new light on the country’s nascent crypto firms, who appear largely bullish on their own future and are increasingly eager to know if their government feels the same. ... Though separate from U.S. regulators and from other global regulatory bodies, Canada’s government has been hesitant to establish crypto regulations that might conflict with other countries’ laws, said Michael Gord, CEO of Toronto-based MLG Blockchain consulting group. Instead, Gord described a regulatory gray zone that confounds his consulting group and the legal teams he turns to for advice: “Often digital asset regulations in Canada are so ambiguous that lawyers cannot give us a yes or no answer. The regulations have not been defined enough for them to be able to.” Neither the U.S. nor Canada has developed comprehensive definitions for digital assets, and Gord doubts the Canadians will jump ahead: “Even if [Canadian regulators] were to want to create clear regulation, there’s a lot of pressure from the SEC” to follow its lead, he said.


How to prepare tomorrow’s workforce? Focus less on devices and more on digital thinking

Mastery of technology skills + knowledge.
In most liberal arts institutions, students are situated in a brick-and-mortar, face-forward teaching environment that says, “read this book, do this essay, or submit this paper.” In their own personal lives, they are digital natives, using an iPhone and technology to do just about everything – from communicating to ordering food. They must push that world aside, however, to conform to teaching methods and teachers that are not digitally literate. The solution is not just to introduce more digital devices and technical training into a classroom to get faculty and students to think more digitally about what they are doing, but to improve their overall digital literacy, or ability to live, work, think and communicate in a society that is driven by the Internet, social media, mobile devices and other digital technologies. In short, change the education and learning formula to be more closely aligned with the demands of today’s digital world.


74% of global workers say the tech industry needs more regulation


Overall, nearly three-quarters (74%) of global workers said the tech industry needs more regulations. Snow surveyed 3,000 professionals across the US, Europe, and the Asia-Pacific region to determine how employees felt about data privacy regulation standards. As technology enables more organizations to harbor personal consumer data, standards must be put in place to make sure this information isn't exploited. ... Millennials were more likely to feel like their data is protected by regulations (44%) than baby boomers (21%), the report found. Some 55% of tech company vice presidents and 52% of directors also said they feel more protected from data breaches, while only 27% of entry-level employees said the same. The rise in data regulation has resulted in more pop-up and opt-in messages for employees, but opinions are split down the middle on whether these messages are disruptive to their workday or not.  "But at the same time, the increase in regulation makes administratively navigating the internet much more difficult, and some might find this to be an annoying and tedious user experience," Larson said.


How the Software-Defined Perimeter Is Redefining Access Control

An SDP or zero-trust model can be used within the modern perimeter-less enterprise to help secure remote, mobile, and cloud users as well as workloads. SDP isn't just about having a secure tunnel — it's about validation and authorization. Instead of just trusting that a tunnel is secure, there are checks to validate posture, robust policies that grant access, segmentation policies to restrict access and multiple control points. The increasing adoption of zero-trust security technologies by organizations of all sizes is an evolving trend. As organizations look to reduce risk and minimize their potential attack surface, having more points of control is often a key goal. Security professionals also typically recommend that organizations minimize the number of privileged users and grant access based on the principle of least privilege. Rather than just simply giving a VPN user full local access, system admins should restrict access based on policy and device authorization, which is a core attribute of the zero-trust model. 
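The default-deny, policy-plus-device-posture logic described above can be sketched in a few lines. This is a simplified illustration, not any vendor's implementation; the policy fields, segment names, and `authorize` function are all invented for the example.

```python
# Sketch of SDP-style access control: default-deny, with access granted
# per explicit policy and per device posture -- never "full VPN access".

POLICIES = [
    {"role": "admin", "segment": "mgmt"},
    {"role": "dev",   "segment": "staging"},
]

def authorize(role: str, device_trusted: bool, segment: str) -> bool:
    """Grant access only when the device posture check passes AND an
    explicit policy matches both the role and the requested segment."""
    if not device_trusted:  # device authorization is checked first
        return False
    return any(p["role"] == role and p["segment"] == segment for p in POLICIES)

print(authorize("dev", True, "staging"))   # True: policy match, trusted device
print(authorize("dev", True, "mgmt"))      # False: no policy grants dev this segment
print(authorize("admin", False, "mgmt"))   # False: untrusted device is always denied
```

The second denial is the least-privilege property in miniature: a valid user on a trusted device still cannot reach a segment no policy grants them, which is what limits lateral movement if that identity is compromised.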


How to build a better cybersecurity defense with deception technologies


Deception technology addresses these key challenges with early and accurate detection coupled with automation to accelerate incident response. The solution tricks threat actors into revealing their presence with authentic, high-interaction decoys that blend seamlessly into the production environment. As soon as an attacker attempts to scan the network, steal credentials, or move laterally, the deception platform raises a high-fidelity alert, reducing dwell times. From there, defenders can remediate or safely let the attack play out and collect company-specific threat intelligence to strengthen their defenses. ... One way to be more proactive is to assume the attacker will get in, and plan a defensive strategy that leverages the entire network to detect them early, while gathering adversary intelligence to better defend against future attacks. In the perimeter-less society that we find ourselves in, with the rapid adoption of cloud infrastructure and ubiquitous global access, traditional security can't scale to keep up with where organizations now operate.
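The detection logic behind decoys is simple: a decoy is never part of a legitimate workflow, so any interaction with it is a high-fidelity signal. A minimal sketch, with hypothetical decoy hosts and planted credentials:

```python
# A decoy never appears in legitimate workflows, so any interaction with
# it can be treated as a high-fidelity alert. Hosts, credentials, and
# events below are all hypothetical.

DECOY_HOSTS = {"10.0.9.40", "10.0.9.41"}   # decoys blended into the subnet
HONEY_CREDS = {("svc_backup", "Hx81!")}    # planted honey-credentials

def inspect(event):
    """Return an alert string if the event touches a decoy, else None."""
    if event.get("dst_ip") in DECOY_HOSTS:
        return "ALERT decoy-host-access src=" + event["src_ip"]
    if (event.get("user"), event.get("password")) in HONEY_CREDS:
        return "ALERT honey-credential-use src=" + event["src_ip"]
    return None

print(inspect({"src_ip": "10.0.3.7", "dst_ip": "10.0.9.40"}))  # alert
print(inspect({"src_ip": "10.0.3.7", "dst_ip": "10.0.1.5"}))   # None
```

Real platforms add interaction depth and intelligence gathering on top of this, but the low false-positive rate comes from exactly this property: legitimate traffic has no reason to touch the decoy.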


Hype vs reality: Is the tech industry on the cusp of another ‘AI winter’?


The amplification benefits that AI can bring to the IT work that humans are responsible for within organisations was one area called out by Chandrasekaran during the panel as a sign of the good that the technology can do. Although a lot of the reporting on AI focuses on how its proliferation within enterprises could lead to job cuts, the converse is often true, he said. “When we [Cisco] look at any IT organisation, they are growing,” he said. “They are hiring hundreds of people to run the network, or the digitisation that’s happening. What we see is that the [AI] tooling is basically to free them up from dealing with the complexity that comes along, so that they can actually get their job done. “We look at all this automation, and… the idea is to free people so that they don’t become completely buried with the burden that’s coming along with the number of devices coming on board.”



Quote for the day:


"Leaders are people who believe so passionately that they can seduce other people into sharing their dream." -- Warren G. Bennis


Daily Tech Digest - August 26, 2019

Samsung Galaxy Note 10 DeX Windows 10
Even though the Galaxy Note 10 Plus isn't the laptop replacement I've been looking for, it could be the primary computing device for workers who spend most of their time either in the field or moving between branch offices. I can easily see salespeople using the S Pen to click through the slides of a client presentation on a Note 10 Plus that's connected to a conference room TV. Regional managers who travel between stores could work directly from their Note 10 Plus, provided their company had an external keyboard/mouse/display combo or loaner computer available at each site. And true field workers who rarely need to type on a keyboard during the day (like officers with the Chicago Police Department, which is running a pilot program with Samsung's DeX in Vehicle solution) could definitely use the Note 10 Plus for most tasks, if their companies take the time to ruggedize the phone... at $1,099 a device, you don't want to drop this thing on a factory floor or have it fall off the back of a truck on a construction site.



NASA Astronaut Accused Of Hacking Bank Account From Space

The New York Times report details how Summer Worden, Anne McClain's estranged spouse, put her skills as a former U.S. Air Force intelligence officer to work when she suspected McClain had been accessing her bank account. Having contacted her bank for details of the locations of logins to the account, Worden discovered that one of the computers from which her login credentials had been used was registered to NASA. McClain was aboard the International Space Station at the time, due to be part of the ill-fated all-female spacewalk, and putting two and two together led Worden to the conclusion that she had found her bank account hacker. McClain, who has since returned to Earth following her six months in space, has admitted that she did, indeed, access the account while aboard the International Space Station. The newspaper report stated that, under oath and via a lawyer, McClain insisted she was making sure there were sufficient funds in the account to care appropriately for the child they had been raising together.


Gartner Hype Cycle deems software-defined networking obsolete


The Gartner report is blunt and refreshing. For instance, check out this part: "Don't get caught up in SDN hype and claims that commercial products are 'SDN' or be persuaded that SDN is the answer to all networking problems since clearly this has not transpired." The same could be said for other hyped networking technologies. Instead, Gartner advised, enterprises should focus on solving specific problems within their networks and evaluate networking services based on their ability to deliver operational value. On a positive note, SDN shook up the networking industry by challenging established vendors and affecting subsequent market developments. SDN, for instance, spurred the rising use of white box switches, open source hardware and the development of independent network switch software providers. Fortuitously, for enterprises, traditional networking vendors also shifted their focus to innovate around network operations and management.


The Death of Agile and Beyond

Despite the cry from agilists that agile is dead or failing, it remains popular and is becoming increasingly "fashionable" among senior executives. Surveys by Deloitte and McKinsey show that more than 90% of executives believe that "becoming agile" is a high priority. And of course, any high-priority aspiration often comes with a mandated time constraint. The first problem with these aspirations is the imposition; they rob people of the opportunity to choose agile as a way of being. However, the bigger problem is that these aspirations are missing a key element: the sense of why. Think of impact mapping for enterprise agility; impact mapping is a way of mapping any goal using four ordered questions: why, who, how and what. Why is the most important aspect; in the case of the need to be agile, answering "Why do we aspire to be Agile?" properly and keeping these reasons in the forefront of the discussion invites teams into agility instead of imposing it on them. However, in most mandated enterprise agile transformations the conversation focuses on the who, how and what.


Software-defined perimeter – the essence of trust

Today, the IP address is no longer sufficient to define the level of trust for a user. IP addresses lack user knowledge to assign and validate the trust. There is no contextual information taken into consideration. This is often referred to as the IP address conundrum. Therefore, as an anchor for the network location and policy, we need to look beyond the ports and IP addresses. Network policies have traditionally focused on what systems can communicate with each other. The permit or deny is a very binary framework to use in today's dynamic environment. It has resulted in a policy that is either too rigidly defined or too loosely defined. This is where the software-defined perimeter finds the middle-ground. ... The considerable benefit of using an identity provider is that it acts as a gateway for users to authenticate against the same centralized trust. However, VPNs or other gateway services require a different database with a different management process. This can create an overhead to either add or delete the users from different databases. Having everything controlled by one central identity provider is the key to managing a single set of trust controls. Essentially, in SDP, a user validates against an externally facing IDP and then the user is authenticated against the identity store.
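The single-identity-store idea can be sketched roughly as follows; the class and names are illustrative, not any particular IDP's API. The point is that one add or revoke operation changes trust for every gateway that consults the provider:

```python
# One central identity store: every SDP gateway consults the same
# provider, so adding or revoking a user is a single operation instead
# of an update in each gateway's own database. Names are illustrative.

class IdentityProvider:
    def __init__(self):
        self._users = {}  # username -> set of resources the user may reach

    def add_user(self, name, entitlements):
        self._users[name] = set(entitlements)

    def remove_user(self, name):
        # One revocation is immediately visible to every gateway.
        self._users.pop(name, None)

    def authenticate(self, name):
        return name in self._users

    def authorize(self, name, resource):
        return resource in self._users.get(name, set())

idp = IdentityProvider()
idp.add_user("alice", {"crm", "wiki"})

# Two different gateways ask the same trust source:
print(idp.authenticate("alice"), idp.authorize("alice", "crm"))  # True True
idp.remove_user("alice")
print(idp.authenticate("alice"))  # False
```

Contrast this with per-gateway user databases, where the same revocation must be replayed against every VPN concentrator and gateway service separately.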


Adopting Agile Principles In Health Care


A core tenet of our approach is that for each innovation, Inception Health establishes an Agile team composed of clinicians, engineers, managers, data scientists, and user representatives. Each team establishes an iterative cycle to improve outcomes and the value to patients, to the health professionals, and to the system overall. While the core team comprises a handful of employees, several hundreds of people from member health care systems have participated in these Agile projects. By embedding Agile principles in the integration process of innovation in the member health care systems, Inception Health has been able to integrate innovations and iterate quickly. In the past two years, Inception Health has implemented 26 innovation projects at Froedtert and the Medical College of Wisconsin Health Network, including online tools for behavioral health, diabetes management, patient engagement, campus wayfinding, and remote monitoring. To enable clinicians to prescribe digital applications at the point of care, Inception Health partnered with a company called Xealth to create a digital health formulary, tying in third-party digital health applications with the electronic health record and clinical workflows.


Hacker Claims He Can ‘Turn Off 25,000 Cars’ At The Push Of A Button

Ken Munro, cybersecurity researcher and partner at Pen Test Partners, first described the hack to Forbes at the DEF CON convention in Las Vegas. He found that it was possible to turn the immobilizer on and the car off by sending a simple request via a browser. Once he'd entered the command, it took less than a second for the immobilizer to be triggered. It was as if Munro was acting as one of the SmarTrack call center employees who were permitted to turn the immobilizer on. SmarTrack systems just weren't correctly checking that the commands were being sent by an authorized user, Munro said. Munro warned that it would be impossible for anyone to start the car again with the immobilizer fitted. The only option would be to have the tech removed entirely, he added. "We now control the immobiliser, so only we can de-immobilize the car." And, if the hacker turned the immobilizer on when the car is moving, it would simply prevent the car from running as soon as the engine stopped. As Munro noted, that could be "quite nasty" if the car has an auto start and stop function. ... Munro was also critical of Thatcham Research, the industry body which had given accreditation to the SmarTrack devices, saying it was safe to use.


Choosing SIP vs. PRI: What are the differences?

Because SIP trunks are software-centric compared to PRI, they are far more elastic and scalable. Adding or reducing the number of calls a SIP trunk handles usually only takes a change in configuration on both sides of the trunk. The real limitation in the case of a SIP trunk is the bandwidth between trunk endpoints. That leads us to some drawbacks of SIP trunking. For one, many SIP trunk architectures allow a SIP trunk to ride across the same internet link that employees use to surf the internet, stream video and perform other internet-based tasks. This creates a situation where voice traffic riding across the SIP trunk can be negatively affected if there is insufficient bandwidth to handle both the calls traversing the SIP trunk and standard internet traffic. Thus, it's important to watch internet throughput closely so bottlenecks don't occur. While businesses can opt for running SIP trunks directly over the internet, telecommunications providers prefer to offer dedicated data lines directly to a customer's premises to ensure the quality and stability of their SIP trunks.
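The bandwidth limitation is easy to quantify with a back-of-the-envelope calculation. Assuming G.711 voice with 20 ms packetization over Ethernet (real figures vary with codec, packetization interval, and transport overheads):

```python
# Back-of-the-envelope SIP trunk sizing for G.711 voice with 20 ms
# packetization over Ethernet. Real numbers vary with codec,
# packetization interval, and transport overheads.

PAYLOAD_BYTES = 160               # G.711: 64 kbit/s * 20 ms
HEADER_BYTES = 12 + 8 + 20 + 18   # RTP + UDP + IPv4 + Ethernet
PACKETS_PER_SEC = 50              # one packet every 20 ms

def kbps_per_call():
    return (PAYLOAD_BYTES + HEADER_BYTES) * 8 * PACKETS_PER_SEC / 1000

def trunk_kbps(concurrent_calls):
    return kbps_per_call() * concurrent_calls

print(kbps_per_call())  # 87.2 kbit/s per call
print(trunk_kbps(30))   # roughly 2.6 Mbit/s for 30 concurrent calls
```

Figures like these are why monitoring internet throughput matters: thirty concurrent calls can quietly consume a meaningful slice of a small office's uplink.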


The end of project management?

As IT moves more to a product-management-run organization, what are the impacts? CIOs say that the addition of product management to the mix has two impacts: increased internal customer delight and increased street cred for the CIO. When IT products are appropriately managed via product management, the impacts for the business should be digital products that are useful, usable, and get used. And CIOs suggest this is the case for both internal- and external-facing products. Here the business gets better-aligned tools from a customer experience/user experience perspective. From this process, CIOs get to point to distinct products making an impact on the business. This is especially the case for customer-facing products where financial impact can be drawn from them. This makes IT more than just a cost center that the CFO can't understand. From an organizational design perspective, teams should increasingly be based on products, not technical function. As the glue that ties disciplines to product, CIOs see the potential for clarity and transparency coming from product management and a renewed focus on data, analytics, and elevated maturity for CX, business technology, and soft skills.


Cryptography & the Hype Over Quantum Computing

So, what should we be doing now about the potential "quantum threat"? First, the cryptography research community should be focused on post-quantum secure cryptography. The good news is that this effort has been going on for years and is ongoing. The role of this research community is to make sure that we have the cryptography we need in the decades to come, and they are taking the issue seriously. (As a side note, symmetric encryption and message authentication codes are not broken by quantum computers, to the best of our knowledge.) Second, the cryptography research community should start thinking about standardization so that businesses are ready if the quantum threat does prove real. Once again, the good news is that NIST has already begun the process. But all of this is about what the "community" should do. What should you — as someone who uses cryptography to secure your business — do? Let's start with what you shouldn't be doing. You shouldn't buy post-quantum encryption and the like before standardization is complete.
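The side note about symmetric cryptography is worth illustrating: Grover's algorithm only roughly halves effective symmetric key strength, so a 256-bit key retains about 128 bits of post-quantum security. A sketch using Python's standard library (the message is, of course, made up):

```python
import hashlib
import hmac
import secrets

# Symmetric primitives are believed to survive quantum attack: Grover's
# algorithm only halves effective key strength, so a 256-bit key keeps
# roughly 128 bits of post-quantum security. The message is made up.

key = secrets.token_bytes(32)  # 256-bit symmetric key
msg = b"transfer $100 to account 42"

tag = hmac.new(key, msg, hashlib.sha256).digest()

def verify(key, msg, tag):
    """Recompute the MAC and compare in constant time."""
    expected = hmac.new(key, msg, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

print(verify(key, msg, tag))          # True
print(verify(key, b"tampered", tag))  # False
```

It is the public-key primitives (RSA, elliptic-curve Diffie-Hellman) that Shor's algorithm would break, which is why the post-quantum standardization effort is focused there.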



Quote for the day:


"One of the advantages of being disorganized is that one is always having surprising discoveries." -- A.A. Milne


Daily Tech Digest - June 24, 2019

Software Defined Perimeter (SDP): The deployment

SDP architectures are user-centric, meaning they validate the user and the device before permitting any access. Access policies are created based on user attributes. By comparison, traditional networking systems are based solely on IP addresses, which carry no details about the user or their device. Assessing contextual information is a key aspect of SDP. Tying everything to IP is problematic, as we don't have a valid hook to hang things on for security policy enforcement. We need to assess more than the IP address. For device and user assessment, we need to look deeper and go beyond IP, not only as an anchor for network location but also for trust. Ideally, a viable SDP solution must analyze the user, role, device, location, time, network and application, along with the endpoint security state. Also, by leveraging elements such as directory group membership and IAM-assigned attributes and user roles, an organization can define and control access to network resources. This can be performed in a way that's meaningful to the business, security, and compliance teams.
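A multi-attribute assessment like the one described can be sketched as a weighted scoring of contextual signals. The signals, weights, and threshold below are purely illustrative:

```python
# Trust is assessed from several contextual signals rather than the
# source IP alone. The signals, weights, and threshold are illustrative.

WEIGHTS = {
    "mfa_passed":     30,
    "device_managed": 25,
    "patch_current":  20,
    "known_location": 15,
    "business_hours": 10,
}
THRESHOLD = 70  # minimum score out of 100 required for access

def trust_score(signals):
    """Sum the weights of the signals that are present and true."""
    return sum(w for name, w in WEIGHTS.items() if signals.get(name))

def grant_access(signals):
    return trust_score(signals) >= THRESHOLD

request = {"mfa_passed": True, "device_managed": True,
           "patch_current": True, "known_location": False}
print(trust_score(request), grant_access(request))  # 75 True
```

The same user from an unmanaged, unpatched device would fall below the threshold, which is the behavior a binary permit/deny IP rule cannot express.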



Troy Hunt: Why Data Breaches Persist

"Anecdotally, it just feels like we're seeing a massive increase recently," he says. "I do wonder how much of it is due to legislation in various parts of the world around mandatory disclosure as well. Maybe we're just seeing more stuff come to the surface that otherwise may not have been exposed." But the potential for even bigger breaches also continues to rise, he says. "I don't see any good reason why data breaches should be reducing, certainly not in numbers," Hunt says. "I reckon there are a bunch of factors ... that are amplifying certainly the rate of breaches and also the scale of them." Such factors, he says, include the ever-increasing amounts of data being generated by organizations and individuals, the increasing use of the cloud - and the ease of losing control of data in the cloud - as well as the many more internet of things devices being brought into the world. In a video interview at the recent Infosecurity Europe conference, Hunt discusses: Long-term forecasts about data breach quantity and severity; Why breach perpetrators so often continue to be children; and How so much "smart" technology aimed at children continues to be beset by abysmal security.


Explore 4 key areas of enterprise network transformation


The top issue among the IT professionals surveyed was a lack of time to complete business initiative projects -- 43% of respondents said they struggle with this. In addition, 42% of respondents said they struggle to troubleshoot across the network as a whole. These blind spots can impede NetOps, network performance quality and, therefore, network transformation. Overall, a poorly performing network negatively affected business performance as a whole, respondents said. As such, respondents said they would prioritize the following areas of network performance: application performance, remote site performance, and endpoint and wireless performance. These improvements were among the most common goals for networking and IT professionals, according to the study. To support these network transformation goals, 37% of teams said they hope to upgrade their network performance management service. Teams can address several network performance issues with improved end-to-end visibility of their network and more insight into specific network issues. 


The Importance of Metrics to Agile Teams

Many programmes fail simply because teams could not agree or gain buy-in on meaningful sets of metrics or objectives. By its very nature, Agile encourages a myriad of different methodologies and workflows which vary by team and company. However, this does not mean that it's impossible to achieve consensus on metrics for SI. We believe the trick is to keep metrics simple and deterministic. Complex metrics will not be commonly understood and can be hard to measure consistently, which can lead to distrust. And deterministic metrics are key, as improving them will actually deliver a better outcome. As an example – you may measure Lead Times as an overall proxy of Time to Value, but Lead Time is a measure of the outcome. It's also important to measure the things that drive/determine Lead Times, levers that teams can actively manage in order to drive improvements in the overarching metric (e.g. determinant metrics like Flow Efficiency). The deterministic metrics we advocate are designed to underpin team SI, in order to steadily improve Agile engineering effectiveness.
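The distinction between an outcome metric (Lead Time) and a determinant metric (Flow Efficiency) can be made concrete. The timestamps below are invented; flow efficiency is the share of lead time spent actively working rather than waiting:

```python
from datetime import datetime

# Lead time: elapsed time from request to delivery (the outcome metric).
# Flow efficiency: share of lead time spent actively working rather than
# waiting (a determinant metric). All timestamps below are invented.

requested = datetime(2019, 6, 3, 9, 0)
delivered = datetime(2019, 6, 7, 17, 0)

active = [  # (start, end) intervals of actual work on the item
    (datetime(2019, 6, 3, 10, 0), datetime(2019, 6, 3, 16, 0)),
    (datetime(2019, 6, 6, 9, 0),  datetime(2019, 6, 6, 15, 0)),
]

lead_hours = (delivered - requested).total_seconds() / 3600
active_hours = sum((end - start).total_seconds() / 3600
                   for start, end in active)
flow_efficiency = active_hours / lead_hours

print(f"lead time: {lead_hours:.0f} h")           # lead time: 104 h
print(f"flow efficiency: {flow_efficiency:.0%}")  # flow efficiency: 12%
```

A flow efficiency this low says most of the lead time was queueing, and attacking the wait states is a lever the team can actually manage.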


Microsoft’s road to multicloud support


An important part of Microsoft’s multicloud strategy is Azure Stack, which is preconfigured hardware to run Azure services that can be deployed locally. However, Kubernetes support on-premise via Azure Stack is behind the support for Kubernetes on the public Azure cloud. “We have Kubernetes on Azure Stack through a project called AKS Engine, which is in preview now,” says Monroy. He claims that AKS Engine will be generally available “soon”, adding: “We have a lot of customers who are using this today.” Serverless containers offer developers a way to achieve multicloud portability. In the Microsoft world, Azure AKS virtual nodes can be deployed to run workloads in Azure Container instances. “There is no lock-in, nothing Azure-specific – you just annotate your workloads and say ‘I want to opt in to this scaling capability’ and we’re able to provide per-second billing,” says Monroy. “If you take that same workload and you run it on a different cloud, it’s going to run.” But AKS virtual nodes are not yet available for Azure in the UK – although they are available elsewhere in Europe.


Data Governance and Data Architecture: There is No Silver Bullet


Having tools and technology facilitates the process of understanding the data, where it’s stored, how it’s organized, what the processes are, and how it’s all tied together, “but it’s not the ‘easy’ button that does everything for you.” Some companies have been trying to rely on metadata repositories alone, but the real key, he said, is in modeling. “A picture’s worth a thousand words, right?” Having the metadata and being able to do analytics and queries is helpful, but without pictures that explain how all the elements are related, and understanding the data lineage and life cycle, “You don’t have a chance.” Keeping higher-level business goals in mind is essential, but implementation should be focused on the fundamentals. “Metadata is a big piece of that too. A lot of the metadata is focused up at that higher level. Are your metadata management tools really getting down to the lower level?” Data and process modeling in particular are more important now than they’ve ever been before, he said, but that modeling should be coupled with the reverse engineering capabilities and all the tools and processes needed to do proper governance.


Disposable Technology: A Concept Whose Time Has Come


Modern digital companies like Google, Facebook, Twitter, Apple, Netflix, Amazon, and AirBnB have taken a technology architecture approach that increasingly treats the technology infrastructure as “disposable” using open source technologies. And the reason for this open approach, in my humble opinion, is two-fold: Firstly, building upon open source technologies provides the flexibility, agility and mobility for companies to move to the next best technology without the constraints  ... Modern digital companies are basing their technology infrastructure on open source technologies that not only prevents vendor architectural lock-in but also allows them to advance the technology capabilities at their pace and at the pace of the business; and Secondly and more importantly, these digital companies understand that the technology isn’t the source of business value and differentiation. They understand that the source of business value and differentiation is: the data that these organizations are masterfully amassing via every customer engagement and every usage of the product or service; and the customer, product and operational insights that leads to new Intellectual Property (IP) monetization and commercialization opportunities.


Blue Prism acquires UK’s Thoughtonomy to expand its RPA platform with more AI

Robotic process automation — which lets organizations shift repetitive back-office tasks to machines to complete — has been a hot area of growth in the world of enterprise IT, and now one of the companies that’s making waves in the area has acquired a smaller startup to continue extending its capabilities. Blue Prism, which helped coin the term RPA when it was founded back in 2001, has announced that it is buying Thoughtonomy, which has built a cloud-based AI engine that delivers RPA-based solutions on a SaaS framework. Blue Prism is publicly traded on the London Stock Exchange — where its market cap is around £1.3 billion ($1.6 billion), and in a statement to the market alongside its half-year earnings, it said it would be paying up to £80 million ($100 million) for the firm. The deal is coming in a combination of cash and stock: £12.5 million payable on completion of the deal, £23 million in shares payable on completion of the deal, up to £20 million payable a year after the deal closes, up to £4.5 million in cash after 18 months, and a final £20 million on the second anniversary of the deal closing, in shares.


Codes Tell the Story: A Fruitful Supply Chain Flourishes


Sharing anecdotes from the process, McMillan gave the audience several practical tips. She noted that Usage and Procedure Logging (UPL) provided invaluable insights for the migration. “This tells you not only which objects you’re touching, but also which business processes they’re calling: Warehouse or inventory management? We used this to figure out what’s really being used,” with respect to custom coding. She said the results were very promising: “What we found out, in production, is that almost 60% of the custom code developed in the last 5-10 years was not being used! I can’t tell you how many of those custom scripts were used just once, and never touched again.” This was fantastic news, because custom code can cause serious headaches when undergoing a migration of this magnitude. Every last bit of custom code needs to be vetted, which can be very time-consuming and error-prone. “Some of the most tedious parts were really challenging,” McMillan said, “having to go through object by object took a lot of time; certain tables that SAP made obsolete; fields whose type has changed. When you do the migration, you can’t code the same way you used to code.”


Obscuring Complexity

How can MDSD obscure the complexity of your application code? It is tricky but it can be done. The generator outputs the code that implements the API resources, so the developers don't have to worry about coding that. However, if you use the generator as a one-time code wizard and commit the output to your version-controlled source code repository (e.g. git), then all you did was save some initial coding time. You didn't really hide anything, since the developers will have to study and maintain the generated code. To truly obscure the complexity of this code, you have to commit the model into your version-controlled source code repository, but not the generated source code. You need to generate that output source from the model every time you build the code. You will need to add that generator step to all your build pipelines. Maven users will want to configure the swagger-codegen-maven-plugin in their pom file. That plugin is a module in the swagger-codegen project. What if you do have to make changes to the generated source code? That is why you will have to assume ownership of the templates and also commit them to your version-controlled source code repository.
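The commit-the-model, generate-at-build-time pattern can be sketched in miniature. The model and the stub template below are hypothetical stand-ins for an OpenAPI spec and swagger-codegen's real templates:

```python
# Build-time generation: the model is committed to version control, the
# generated source is not; it is re-rendered on every build. The model
# and the stub template are hypothetical stand-ins for an OpenAPI spec
# and swagger-codegen's templates.

MODEL = {  # this is what gets committed to the repo
    "resources": [
        {"name": "orders",   "methods": ["GET", "POST"]},
        {"name": "invoices", "methods": ["GET"]},
    ]
}

STUB = (
    "def {method}_{name}(request):\n"
    "    raise NotImplementedError  # overridden by handwritten code\n"
)

def generate(model):
    """Render API stubs from the model; output lands in build/, not git."""
    parts = []
    for res in model["resources"]:
        for m in res["methods"]:
            parts.append(STUB.format(method=m.lower(), name=res["name"]))
    return "\n".join(parts)

print(generate(MODEL))
```

Because only the model is versioned, changing the API means editing the model and rebuilding; developers never study or maintain the generated stubs, which is the whole point of obscuring that complexity.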



Quote for the day:


"Do not compromise yourself. You are all you have got." -- Janis Joplin


Daily Tech Digest - June 07, 2019

Autonomous versus automated: What each means and why it matters

Automated systems work best in well-defined environments with clear functions to perform. These systems can be built efficiently, and operate much faster than a human. One area, specific to security, that comes to mind is in validating an infrastructure template. As infrastructure increasingly becomes software defined, a CI/CD like process is needed to validate the configurations. This can be viewed as a pre-deployment compliance check to make sure the infrastructure is provisioned correctly and that human errors are caught. Autonomous systems are most effective in an ever-evolving landscape such as new attack vectors and increased attack surfaces. These systems need access to datasets from which to learn from and new algorithms to analyze the data differently as the AI space matures. These systems come at a cost, however, as many are heavily focused on R&D with increasing investments made over time. Due to the increased cost and complexity, these systems are overkill for solving solutions that are just as easily addressed by automation based systems. Over time, autonomous systems will require less training data, and the complexity is already being reduced by a combination of open source projects and cloud provider offerings, but they will continue to be more complex and expensive relative to automated systems.
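A pre-deployment compliance check of the kind described can be sketched as a set of rules run over a template before provisioning. The template schema and rules below are illustrative, not any particular provider's format:

```python
# Pre-deployment compliance check: validate an infrastructure template
# before it is provisioned, catching human errors in CI/CD. The template
# schema and the rules here are illustrative.

def check_template(template):
    """Return a list of policy violations found in the template."""
    violations = []
    for rule in template.get("firewall_rules", []):
        if rule.get("source") == "0.0.0.0/0" and rule.get("port") == 22:
            violations.append(f"{rule['name']}: SSH open to the internet")
    for bucket in template.get("storage", []):
        if bucket.get("public"):
            violations.append(f"{bucket['name']}: storage publicly readable")
    return violations

template = {
    "firewall_rules": [
        {"name": "admin-ssh", "port": 22, "source": "0.0.0.0/0"},
        {"name": "web", "port": 443, "source": "0.0.0.0/0"},
    ],
    "storage": [{"name": "logs", "public": True}],
}

for v in check_template(template):
    print("FAIL:", v)  # flags the SSH rule and the public bucket
```

Wiring a check like this into the pipeline fails the build before a misconfiguration ever reaches production, which is exactly the well-defined, fast-feedback niche where automation beats an autonomous system on cost.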



Making the most of micro-moments with Dr. Shamsi Iqbal

The word “distraction” has a negative connotation to it and I want to look at it differently because sometimes you do need to step away from work and you do need to take breaks and you do need to just refresh your perspectives and I believe that that actually makes you more productive in the long run. So, I think that the problem is deeper here. So, we need to take breaks. We need to do other stuff, but we have difficulties in prioritizing what is important for us, what we need to get done, what moves us forward in the responsibilities that we have. And we often get lost. And I think that’s where technology can help us. I mean, if I’m not able to help myself because I am just distractible and when I go down that rat hole of distractions, then maybe yes, I do need something that pulls me back out. And so, that’s how we’re coming at this problem because I personally don’t feel that if you take a break and you go and chat with a colleague about mundane things, or if I go on Facebook or Twitter, unless I’m spending hours on it, I don’t see that to be a problem.


Legacy IT systems a significant security challenge


As legacy IT systems age, said Ford, the security risks increase, compounded by the fact that many of these systems are critical to the business and often cannot be decommissioned or replaced because of high costs, complexity or lack of suitable alternatives. “Legacy IT systems are often at the heart of cyber breach incidents, and because decommissioning is not usually an option, information security professionals need to manage the risk by working closely with key business stakeholders to identify all critical systems and the systems that support them,” he said.  The next step, said Ford, is to understand which are the most critical systems. “The role of security professionals is to assess the likelihood and potential impact of a cyber attack, while the role of business [professionals] is to identify what systems and processes are the most critical,” he said. Once security professionals understand what systems are critical, Ford said they would be able to prioritise and plan which ones to update and patch to make them secure. “This should be the objective of all information security professionals as business risk managers.”


Instagram's ecommerce move reveals retailers need blockchain to keep up

Believe it or not, many of the very retailers who promote themselves on Instagram as the latest viral craze still use pen and paper for their internal logistics systems. The reason is simple: instead of modernizing to keep up with consumer trends and technological advancements, suppliers tend to stick with what they know. This results in disastrous outcomes for consumers whose purchases get lost in shipping frenzies, particularly around the holidays. For example, in 2014, the U.S. Postal Service reported that about 88 million undeliverable items were directed to the USPS Mail Recovery Center in Atlanta, Georgia. Of those tens of millions of items, only about 3% ended up in the correct customer's hands – the rest either got destroyed, donated or auctioned off. The most frustrating part of this current cycle of mismanagement is that real solutions already exist to help companies improve successful rates of delivery. By incorporating blockchain technology into the shipment process, retailers can create a fully integrated and streamlined system across their entire supply chain.


Cloud Hadoop Competition Hits MapR, Cloudera

"MapR has formidable competition on premises from a much larger Cloudera now, and faces increased pressure from cloud providers offering their own Hadoop-based solutions. Their proprietary versions of open source components now appear more risky as a result, and lead to more questions about their suitability for long term plans," Adrian told InformationWeek this week. "Gartner has talked to a number of concerned [MapR] customers, some quite large, who believe in the technology, and some made additional investments during the past year, but the outlook is not encouraging." Among the company's missteps was the transition from direct sales to an indirect model, which is tricky when you are dealing with complicated technology sales to enterprise-sized companies, according to Adrian. In spite of its own difficulties, Cloudera may be positioned to take advantage of MapR's troubles. Cloudera's Reilly said that the merger with Hortonworks has enabled the company to get more resources and scale to develop cloud architecture to "quickly re-platform our business. MapR could not get the resources or scale. Their customer base is an opportunity for us and part of our growing pipeline."


Juniper: Security could help drive interest in SDN


Juniper’s study found that 87 percent of businesses are still doing most or some of their network management at the device level. What all of this shows is that customers are obviously interested in SDN but are still grappling with the best ways to get there, Bushong said. The Juniper study also found users interested in SDN because of the potential for a security boost. SDN enables a variety of security benefits. A customer can split up a network connection between an end user and the data center and have different security settings for the various types of network traffic. A network could have one public-facing, low-security segment that does not touch any sensitive information. Another segment could have much more fine-grained remote-access control with software-based firewall and encryption policies on it, allowing sensitive data to traverse it safely. SDN users can roll out security policies across the network from the data center to the edge much more rapidly than in traditional network environments.
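The segmentation idea described above can be sketched in a few lines: traffic is classified by what it touches, and each segment carries its own security policy. This is an illustrative model only, not any vendor's controller API; the service names and policy fields are hypothetical:

```python
# Per-segment security policies, as an SDN controller might hold them.
SEGMENT_POLICIES = {
    "public":    {"firewall": "basic",        "encryption": False, "remote_access": "none"},
    "sensitive": {"firewall": "fine_grained", "encryption": True,  "remote_access": "per_user"},
}

# Hypothetical set of services considered sensitive for this example.
SENSITIVE_SERVICES = {"payroll", "customer_db"}

def classify_flow(dst_service):
    """Place flows touching sensitive services on the hardened segment."""
    return "sensitive" if dst_service in SENSITIVE_SERVICES else "public"

def policy_for(dst_service):
    """Return the security policy the controller would push for this flow."""
    return SEGMENT_POLICIES[classify_flow(dst_service)]

# A flow to the payroll service gets encryption and fine-grained access control;
# a flow to a public web front end gets the lighter public-segment policy.
payroll_policy = policy_for("payroll")
web_policy = policy_for("storefront")
```

Because the policy lives centrally rather than on each device, updating `SEGMENT_POLICIES` once is what lets an SDN roll a change out from data center to edge far faster than box-by-box configuration.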


"This method was inherently biased," he said, and "failed to capture niche interests like mushroom picking." That led to the creation of Amazon's first recommendation engine. Wilke outlined the technical details of the matrix-based completion methods that Amazon tested, which eventually led to its first commercial deep learning model. Throughout, "We didn't sequester our scientists," he said. Instead, data scientists were integrated into teams focused on the product and customer experience. "They start with the customer experience, not the machine learning algorithm," he said. Similarly, for the development of its in-store shopping experience, Amazon Go VP Dilip Kumar stressed, "If you start with a genuine customer problem, you can use the power of machine learning... to build a stellar customer experience." To create the concept of the Amazon Go store -- "take what you want and just go," according to Kumar -- Amazon had to choose technologies to eliminate the checkout process. It settled on computer vision. The first problem to solve, he said, was identifying the customer account and their precise location in the store. Amazon utilized geometry and deep learning to not just predict customer account location but accurately associate interactions with the right customer account.
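The matrix-completion methods Wilke refers to treat the user-item ratings table as a sparse matrix and learn low-rank factors that fill in the missing entries. The following is a toy sketch of that idea using plain gradient descent (Amazon's actual systems are far more sophisticated; the data and hyperparameters here are invented for illustration):

```python
import numpy as np

# Toy user-item ratings matrix; 0 marks an unobserved entry to predict.
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [0, 0, 5, 4],
], dtype=float)
mask = R > 0                            # which entries were actually observed

rng = np.random.default_rng(0)
k = 2                                   # number of latent factors
U = rng.normal(scale=0.1, size=(R.shape[0], k))   # user factors
V = rng.normal(scale=0.1, size=(R.shape[1], k))   # item factors

lr, reg = 0.01, 0.02
for _ in range(5000):
    err = mask * (R - U @ V.T)          # error computed only on observed entries
    U += lr * (err @ V - reg * U)       # gradient step on user factors
    V += lr * (err.T @ U - reg * V)     # gradient step on item factors

pred = U @ V.T                          # completed matrix: the zeros are now predictions
rmse = np.sqrt(((mask * (R - pred)) ** 2).sum() / mask.sum())
```

The unobserved cells of `pred` are the recommendations: items a user has not rated but whose predicted score is high.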


Nearly two-thirds of businesses hit by credential abuse


“Both internal employees and third-party vendors need privileged access to be able to do their jobs effectively, but need this access granted in a way that doesn’t compromise security or impede productivity,” said Morey Haber, CTO and CISO of BeyondTrust. “In the face of growing threats, there has never been a greater need to implement organisation-wide strategies and systems to manage and control privileged access in a way that fits the needs of the user.” Globally, the businesses surveyed reported an average of 182 third-party suppliers logging in to their systems every week. In UK organisations, 46% said they have more than 100 suppliers logging in regularly, underlining the scope of risk exposure. The UK data shows that businesses still tend to be too trusting, with 83% admitting they trust third-party suppliers accessing their networks, slightly up from last year’s report. However, trust in employee privileged access was cited at 87%, down from 91% a year ago.


Cloud adoption drives the evolution of application delivery controllers

Cloud adoption drives the evolution of application delivery controllers
This raises the question of what features ADC buyers want for a cloud environment versus traditional ones. The survey asked specifically what features would be most appealing in future purchases, and the top response was automation, followed by central management, application analytics, on-demand scaling (which is a form of automation), and visibility. The desire to automate was a positive sign for the evolution of buyer mindset. Just a few years ago, the mere mention of automation would have sent IT pros into a panic. The reality is that IT can't operate effectively without automation, and technology professionals are starting to understand that. The reason automation is needed is that manual changes are holding businesses back. The survey asked how the speed of ADC changes impacts the speed at which applications are rolled out, and a whopping 60% said it creates significant or minor delays. In an era of DevOps and continuous innovation, multiple minor delays create a drag on the business and can cause it to fall behind its more agile competitors.
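The automation buyers are asking for usually takes a declarative shape: operators describe the desired ADC state, and tooling computes and applies the change set instead of someone hand-editing each device. A minimal sketch of that reconciliation step, with purely hypothetical pool-member addresses:

```python
def plan_changes(current_members, desired_members):
    """Compute the add/remove actions needed to reconcile an ADC server pool
    from its current state to the declared desired state."""
    current, desired = set(current_members), set(desired_members)
    return {
        "add":    sorted(desired - current),
        "remove": sorted(current - desired),
    }

# A rolling replacement of one backend, expressed as desired state:
plan = plan_changes(
    current_members=["10.0.0.1:443", "10.0.0.2:443"],
    desired_members=["10.0.0.2:443", "10.0.0.3:443"],
)
```

Because the plan is computed rather than typed, the same change can be reviewed, tested, and pushed through a CI/CD pipeline — which is precisely what removes the rollout delays the survey respondents complained about.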


Why Should We Care About Technology Ethics? The Updated ACM Code of Ethics

The original purpose of business is to serve society. If you don't serve society, it's less likely that someone will buy your product. And these days there's been a huge push from society towards requiring more ethical business practices. We've also seen pushback from employees within several well-known large companies when it comes to ethical issues, so there's internal as well as external push for more ethical technologies. We're seeing these sorts of demands for more environmental considerations, more sustainability considerations, and more concern for the societal impact of technologies, too. People are worried about their data, they're worried about their privacy, they're worried about their kids, they're worried about all kinds of ethical issues that impact them. The fact that a lot of these companies have been able to operate in a relatively grey area for so long has meant that we've actually seen where these cases can go. There's now demand for governments to regulate more heavily, as can be seen with the GDPR.



Quote for the day:


"Leadership, on the other hand, is about creating change you believe in." -- Seth Godin