Daily Tech Digest - September 09, 2021

How a National Digital Twin could help catapult sustainability in the UK

Digital twins remain an underfunded and underdeveloped area in the UK. This is largely due to an awareness issue. Until recently, digital twins have largely sat in the remit of academia, and therefore much of the theory hasn’t turned into action. Any innovation that has been brought to the table has mainly remained siloed between organisations and sectors. Countering this requires strong, central guidance on what can be achieved through digital twins. The Government is primed to take on this leading role, particularly the Department for Business, Energy & Industrial Strategy (BEIS). In an ideal scenario, we’d see it set up small scrum teams of digital twin experts to support, educate and consult organisations across the private and public sectors to, first, develop business cases and proof of value, and second, get them to a place where they can develop their own information management strategy to support the digital twin. This cohesive education will help to underpin a National Digital Twin strategy. Hand-in-hand with the awareness issue is a lack of digital maturity and of understanding of how to get to that point.


Technical Debt Isn't Technical: What Companies Can Do to Reduce Technical Debt

The biggest problem is that, unlike a dirty kitchen, technical debt is mostly invisible to our non-technical stakeholders. They can only see the slowdown it causes, and by the time they do, it’s often already too late. It’s all about new features, constantly adding new code on already fragile foundations. Another problem is that too much tech debt forces engineering teams into fire-fighting mode. Tech debt impacts the whole company, but for engineers, more tech debt means more bugs, more performance issues, more downtime, slow delivery, lack of predictability in sprints, and therefore less time spent building cool stuff. ... Controlling technical debt is a prerequisite to delivering value regularly, just like an organized and clean kitchen is a prerequisite to delivering delicious food regularly. That doesn’t mean you shouldn’t have technical debt. You will always have some mess and that’s healthy too. The goal isn’t to have zero mess; the goal is to get rid of the mess that slows you down and prevents you from running a great kitchen.


When a scammer calls: 3 strategies to protect customers from call spoofing

Humans are invariably going to be the weakest link in the chain; not even the most robust technology can prevent a victim from unwittingly handing over their private credentials. That said, while many financial institutions are investing in educational programs to teach their customers basic principles around protecting their accounts, they need to make this a continuous, ongoing initiative. Likewise, these efforts should extend to customer-facing workers, and especially contact center employees, who are ultimately responsible for authenticating a customer’s identity. ... Phone-based scams almost always culminate with the victim transmitting funds, buying untraceable gift cards, or sharing critical data that can be used to create synthetic identities to open new accounts. For financial institutions this means they need to be able to establish a behavioral baseline for their customers, to distinguish normal interactions from anomalous activities that could be hallmarks of potential fraud.
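The behavioral-baseline idea can be sketched as a simple statistical check: score each new transaction against the customer’s own history and flag large deviations. This is an illustrative toy, not from the article – the function name, data, and threshold are hypothetical, and real systems model far more signals than transaction amount alone:

```python
from statistics import mean, stdev

def flag_anomaly(history, new_amount, threshold=3.0):
    """Flag a transaction whose amount deviates more than `threshold`
    standard deviations from this customer's own baseline."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return new_amount != mu
    z = abs(new_amount - mu) / sigma
    return z > threshold

# A customer who normally moves $45-55 per transfer:
baseline = [45.0, 52.0, 48.0, 55.0, 50.0, 47.0]
print(flag_anomaly(baseline, 51.0))    # → False (typical transfer)
print(flag_anomaly(baseline, 5000.0))  # → True (gift-card-sized outlier)
```

The same per-customer framing extends naturally to call frequency, device fingerprints, or time-of-day patterns.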


Agile Enterprise Architecture Framework: Enabler for Enterprise Agility

The Agile EA Framework (AEAF) helps in breaking barriers between IT and business, ideally with increasing levels of co-location by unit and with fast-forming teams that coalesce for new projects. The initial goal of the architect is to bring out a Minimum Viable Product (MVP), improve upon it, and evolve it with each iteration. It would also consider real-time customer feedback while adding more features through the iterations. The overall idea is to adopt just enough architecture to deliver the MVP, thus avoiding any big upfront design. The AEAF helps in defining an architecture using an iterative life cycle, allowing the architectural design to evolve gradually as the problem and the constraints are better understood. The architecture and the gradual building of the system must go hand in hand, with subsequent iterations addressing architectural issues and decisions to arrive at a flexible architecture. The following diagram depicts the AEAF and the constituent steps associated with it.


6 Hobbies You Should Have if You’re Interested in Cybersecurity

Ethical hacking (or "white-hat hacking") occurs when people get permission to try and break into a company’s systems. They then report their methods and how quickly they accomplished the task. Ethical hackers would ideally find problems before malicious parties do, giving companies time to act. Some people specializing in ethical hacking recommend having a wide but shallow knowledge pool. This equips them to find issues in cloud software, and so identify vulnerabilities that help malware flourish. ... Hack the Box is a platform for cybersecurity enthusiasts that combines hacking with gamification. The online modules cater to individuals, universities, and companies, providing content to help people hone their penetration testing skills. Think of Hack the Box as a springboard for people interested in hacking who aren’t sure where to start. Besides offering an educational component, there’s a community aspect. For example, people can discuss their methods and get recommendations for different techniques to apply in the future.


SEC Warns of Fraudulent Cryptocurrency Schemes

Several security and blockchain experts draw a direct line between this fraudulent activity and increasingly sophisticated social engineering attempts, or blatantly false advertising that may lead to poor or unsafe crypto investments. James McQuiggan, education director for the Florida Cyber Alliance and security awareness advocate for the firm KnowBe4, says, "Cybercriminals will always find emotional lures to exploit users through social engineering. Asking yourself the question, 'Is this too good to be true?' is the first step to determine if the organization is worthwhile." Further, Julio Barragan, director of cryptocurrency intelligence at the firm CipherTrace, warns against ongoing scams in which victims are lured by a convincing fraudster sending them direct messages on social media or through a friend's hacked account, promoting massive gains. Neil Jones, cybersecurity evangelist for the firm Egnyte says: "Significant change [in the space] will only occur when cryptocurrency platforms become subject to the same standardized IT requirements as traditional investment platforms ..."


Are you stuck in a “logic box”?

The point of the logic box is to help develop self-awareness, an essential skill of leadership that is becoming more important as we negotiate our VUCA—volatile, uncertain, complex, and ambiguous—world. Leaders and their subordinates must always examine the basic premises of a key decision and interrogate its surface validity. This came up in a recent conversation I had with Dambisa Moyo, a widely published economist who is a board member at Chevron and 3M. One of the most important qualities she looks for when assessing leaders is their ability to use different mental models for analyzing choices, an idea that she attributed to Buffett’s partner at Berkshire Hathaway, Charlie Munger. “It’s this idea of road-testing their thinking using different paradigms,” she said. “So, if, say, an investment looks quite attractive from a financial perspective, it might look less attractive through a geopolitical or environmental lens. Given the world that we live in now, people who think about complex problems in a more versatile way have an advantage.”


Protecting your company from fourth-party risk

Since fourth parties are not generally obligated to share information with partners of their clients, organizations are now adapting their TPRM programs to address fourth-party concerns. Fortunately, there are steps companies can take to gain greater visibility into – and protection from – downstream risk. Despite growing awareness of the threat of fourth-party risk, clear guidelines and uniform processes for fourth parties have not been established, resulting in disjointed, ad-hoc processes. Most of these processes are manual, requiring significant investment in time and labor, and opening the possibility of error and oversight. ... The first step is for companies to understand how their third parties are monitoring their vendors. This includes direct monitoring (i.e., what are they doing to monitor their third parties) and general vendor management (i.e., do they have their own vendor management program and how effective is it). Companies can ask these questions through periodic performance reviews as well as through their annual risk and due diligence reassessments.


Putting people at the heart of digital marketing

A strong marketing team is made up of people with a diverse range of skills – from strategists and data analysts, who identify strengths, map trends and focus plans, to creatives and ‘doers’, who design and deliver beautifully tailored campaigns. A good marketer needs to understand how technology can help to enhance, personalise and deliver these campaigns through the appropriate channels – but also needs to be able to think beyond the barriers of what technology can provide. Technology makes it easy to execute, analyse and measure a marketing strategy at the push of a button, and while this is helpful – especially at scale – where we see the most effective personalised marketing is in teams whose marketers are not afraid to ask questions. They need to be able to query the ‘why’, ‘how’ and ‘who’ behind every marketing decision – whether technology- or human-driven – to ensure it is relevant, beneficial and being delivered to the right people in the best possible way. Good marketers know this and understand that if we want customers to continue to agree to share their data, we need to earn their trust.


How to Enable Team Learning and Boost Performance

Very often, a team with a performance problem lacks knowledge of the strategy. They do not feel like they are doing meaningful work. As a leader, you should define a framework within which you regularly communicate goals and connect them to strategy. You also need to be open to collecting feedback from your team on whether they feel the goals are achievable. It might be that you have clear goals but communicate them only once per year; unfortunately, that is probably too rare. Based on your context, you need to define the best cadence to remind the team and yourself about the goals. For teams working in complex, fast-changing environments, you need to review the goals at least once every three months, maybe even more often. For example, you can schedule release planning or delivery planning sessions with your team. Every three months, review the delivery roadmap and release plans with your team. Compare them with your team's current velocity and capacity. Discuss the expectations and collect feedback from your team. Afterwards, use sprint review and sprint planning sessions to track progress towards the goal.



Quote for the day:

"A positive attitude will not solve all your problems. But it will annoy enough people to make it worth the effort" -- Herm Albright

Daily Tech Digest - September 07, 2021

Tech jobs are changing. But don't expect a boom in IT salaries just yet

While companies may not be planning large wage incentives for staff, Robert Half found that many were readdressing the benefits packages they offer, with the inclusion of perks such as flexible hours, remote-working options and allowances for home office equipment. Clamp suggests that this focus on the employee experience, rather than substantial pay increases, is what's likely to shape compensation packages in the months ahead. "We think it's part of the employee proposition, and part of the experience that is now pretty common among larger employers, and perhaps smaller ones too -- giving people fulfilment of their work," he says. Meerah Rajavel, CIO at Citrix, agrees. "When it comes to attracting and retaining talent, companies need to look beyond pay," Rajavel tells ZDNet. "Benefits programs should focus on total rewards that support employees in a holistic way, providing not only for their financial security, but their physical, intellectual, social, and environmental well-being." Rajavel points out that pay has always been at a premium in the tech space, but adds that the speed at which the market is currently moving is putting pressure on companies to up the ante.


Becoming a Cybersecurity or Privacy Lawyer: Tips for Young Attorneys

A keen interest in technology is helpful, however, as lawyers in this space need to stay abreast of rapid developments in both the law and the underlying space. And taking some classes in IT can be useful to develop a functional tech vocabulary, as you may often find yourself tasked with translating between IT professionals and business leaders within your client’s organizations. If you are already a practicing lawyer, seek out relevant CLE content from the Pennsylvania Bar Association, Practising Law Institute, Privacy + Security Forum, or other providers; these providers offer annual seminars that provide valuable crossover between tech and legal content. ... “The cyber field is always evolving, from risk vectors, to newly enacted laws (or courts’ interpretation of them), to techniques employed by threat actors. Privacy also is in a state of continual change and updates. Collaboration and dialogue with your peers is an important component of the practice, and the Committee offers an opportunity for young lawyers to do just that,” says Joshua Mooney


Big Banks Benefiting Most From COVID-19 Digital Shifts

One challenge that smaller financial institutions face is that they have older customer bases, which impacts the penetration of digital banking solutions. But there is more than just an age differential. Even taking age out of the equation the largest banks outperform smaller institutions. For instance, midsize banks were found to lag in several digital product usage metrics, such as: Paying bills via online and mobile; Internal funds transfers via mobile app; Using P2P payments in the mobile app; and Receiving alerts via mobile app. Of greater concern is that consumers who do use either online banking or mobile banking are less satisfied with both the design and functionality of the websites and mobile tools. They also report lower satisfaction with the range of services that can be performed with the mobile apps. Beyond redesigning the online banking website or mobile banking app, organizations should focus on the lowest-hanging fruit for increased engagement. This would include linking P2P payments to one of the many available services.


Your hybrid cloud model is just a phase

Hybrid cloud, however, is not a long-term solution. It forms part of a pathway towards a reality in which the public and private sectors alike will use a fully integrated public cloud – whether from international providers like Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform, or from public sovereign cloud providers – offering a broad set of infrastructure services, such as computing power, storage options, networking, and databases, delivered on-demand. The need for this is more pressing than ever before, with challenges including governance, data and security threats rapidly rising as key focus areas that organisational personnel and the public need to be educated about. This transitional phase should last between five and ten years. As this process takes place, there is likely to be resistance from those with lingering concerns – such as the governance issue I noted above. Alleviating these concerns will mean zeroing in on the things that will permit organisations and public sector entities to evolve in the way they want.


Urban mining: the hidden value of e-waste

“E-waste is the world’s fastest growing waste stream,” said Fred White, commercial manager at Argo Natural Resources. “Just looking at the market size, it’s quite significant, and the rate of growth is enormous – it’s projected to grow by 40% over the next 10 years. A lot of recycling capacity needs to come online to deal with that growth. “We see it as a big opportunity. Global demand for electronic goods is soaring – how many phones and laptops do you have today, compared to 10-15 years ago? And how long do you keep those phones?” Argo is commercialising Deep Eutectic Solvents (DES), a chemistry that has been under research and development at the University of Leicester for nearly 20 years. DES consists of non-toxic, environmentally benign and chemically stable ionic liquids that can be used to extract a wide range of metals. “DES is a platform chemistry of millions of different combinations of salts and simple organic compounds,” White explained. “They can be combined in certain ways to do a wide variety of things.”


The IoT Technologies Making Industry 4.0 Real

IoT devices need internet connectivity to work. However, even the strongest network is bound to experience overload at some point. No matter how sophisticated technology gets, constantly being connected to a network is a fundamental weakness, especially on an industrial scale. More companies these days favor IoT devices that use intermittent connectivity protocols, as opposed to constant Wi-Fi or cellular connections, as a way of overcoming this challenge. The logistics industry provides a great case study for the positives of intermittent connectivity. Traditionally, data logger devices that connect using radio-frequency identification (RFID) transmitters or even USB cables have been used to collect condition and location information on stored and shipped materials. But plugging in all those loggers intermittently is extremely labor-intensive, and RFID syncs with unreliable towers that are dependent on expensive proprietary systems. Finnish firm Logmore's dynamic e-ink QR code solution is an example of how to use intermittent connectivity at scale. IoT sensors attached to the tags collect information, which refreshes a QR code on a small display.


IoT Attacks Skyrocket, Doubling in 6 Months

With millions still working from home, cybercriminals are targeting corporate resources via home networks and in-home smart devices too, according to Red Canary’s Grant Oviatt. They know organizations haven’t quite gotten used to the new perimeter — or lack thereof. “Throughout the past 12 months, the lack of [incident] preparedness has become increasingly evident, especially with the influx of personal devices logging onto corporate networks, the resulting reduced endpoint visibility, expanded attack surface and surge in attack vectors,” he said in a recent Infosec Insider column for Threatpost. In real-world attacks, the end result of attacks on IoT gear is evolving, Kaspersky found: Infected devices being used to steal personal or corporate data as mentioned, and mine cryptocurrencies, on top of traditional DDoS attacks in which the devices are added to a botnet. For instance, the Lemon Duck botnet targets victims’ computer resources to mine the Monero virtual currency, and it has self-propagating capabilities and a modular framework that allows it to infect additional systems to become part of the botnet too.


Adoption of Cloud Native Architecture, Part 3: Service Orchestration and Service Mesh

All applications and services include all the non-functional code inside them. There are plenty of disadvantages to this type of design. There is a lot of duplicate implementation and proliferation of the same functionality in each application and service, resulting in longer application development (time to market) and exponentially higher maintenance costs. With all these common functions embedded inside each app and service, all are tightly coupled with the specific technologies and frameworks used for each of those functions – for example, Spring Cloud Gateway for routing and Zipkin or Jaeger for tracing. Any upgrade to the underlying technologies will require every application and service to be modified, rebuilt, and redeployed, causing downtime and outages for users. Because of these challenges, distributed systems are becoming complex. These applications need to be redesigned and refactored to avoid siloed development and the proliferation of one-off solutions.
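The duplication problem can be made concrete with a toy sketch: without a mesh, retry/backoff logic like the below ends up re-implemented inside every service, each time coupled to a particular library or framework, while a service mesh moves it into the infrastructure layer (a sidecar proxy) so application code no longer carries it. The names here (`call_with_retry`, `flaky_service`) are illustrative, not from the article:

```python
import time

def call_with_retry(fn, attempts=3, backoff=0.05):
    """Retry with exponential backoff - the kind of non-functional code
    that, without a mesh, every service embeds and maintains itself."""
    for i in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if i == attempts - 1:
                raise  # out of attempts: surface the failure
            time.sleep(backoff * 2 ** i)

# Hypothetical flaky downstream dependency: fails twice, then succeeds.
calls = {"n": 0}
def flaky_service():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("downstream unavailable")
    return "ok"

print(call_with_retry(flaky_service))  # → "ok" after two retries
```

In a mesh such as Istio, an equivalent retry policy is declared once in proxy configuration and applied uniformly, which is exactly the de-duplication the excerpt is arguing for.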


How tech is a vital weapon against cyber information warfare

Using data ethically and securely is critically important in a digital age, where growing amounts are being created every day. Doing so is no longer just an optional extra, but a human right all of its own. But too many businesses still have a lax approach to data security, and it’s inadvertently aiding cyber criminal efforts. The long list of fines handed out by the ICO is testament to the fact there isn’t enough being done to protect citizens. While reputational damage and fines can be big deterrents, data breaches are still a regular occurrence. Data protection’s plight relies on businesses taking a proactive stance on this, but once again, technology can step in here and play an important enabling role. Irrespective of your business size, you need to look for modern data protection solutions that factor in data security, compliance and customer privacy requirements from the very start. Read customer testimonials, conduct your own research and look to respected awards bodies to help in that decision, rather than just relying on a vendor’s word that their solutions are secure.


Tailoring SD-WAN to fit your needs

Most SD-WANs simply look at packet types or maybe TCP/UDP port numbers, which assumes that all voice packets or all packets for a particular application have the same priority. In many cases, users prioritize specific worker-to-application relationships, not all users of a given application, so prioritization may offer less value than you think. If you have specific reasons for selecting an SD-WAN that has higher header overhead or one that can’t prioritize as you’d like, you can reduce the impact of both these issues by using access links with higher bandwidth if they’re available. If not, and you need to use access bandwidth efficiently, then take the time to assess your vendor options in light of the overhead and prioritization issues. That also goes for security. If an SD-WAN can recognize specific worker-to-application relationships, it can not only prioritize the important ones, but also recognize which of all the possible worker-to-application relationships are actually permitted. That means that the SD-WAN can actually create better security.
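The difference between port-based and relationship-aware prioritization can be sketched as two policy lookups. This is a hypothetical illustration, not any vendor's API: the port table ranks all traffic of an application alike, while the (worker, application) table can both prioritize and deny individual relationships, which is the security benefit noted above:

```python
# Port-based classification: every flow of an app gets the same priority.
PORT_PRIORITY = {5060: "high", 443: "normal"}  # e.g., SIP vs HTTPS

# Relationship-aware policy: rank (worker, application) pairs individually.
USER_APP_PRIORITY = {
    ("trader-desk", "market-data"): "high",
    ("guest-wifi", "market-data"): "low",
}

def classify_by_port(flow):
    return PORT_PRIORITY.get(flow["dst_port"], "low")

def classify_by_user_app(flow):
    # Unlisted pairs are not merely deprioritized - they can be denied,
    # so per-relationship awareness doubles as security policy.
    return USER_APP_PRIORITY.get((flow["user"], flow["app"]), "deny")

flow = {"user": "guest-wifi", "app": "market-data", "dst_port": 443}
print(classify_by_port(flow))      # → "normal" (port alone over-prioritizes)
print(classify_by_user_app(flow))  # → "low" (relationship-aware)
```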



Quote for the day:

"The leadership team is the most important asset of the company and can be its worst liability" -- Med Jones

Daily Tech Digest - September 06, 2021

We are in an age of rapid technological progress. But many are not ‘progressing’

Even risk-averse companies that readily adapt and invest in new technologies and processes encounter hurdles. One example is what is known as the ‘productivity paradox’, which is when anticipated gains in productivity and ROI are not fully realized straightaway. When Apple, Microsoft and Dell Computer arrived on the scene in the 1980s, computer usage was limited to early adopters or those who could afford a personal computer. They did not receive widespread consumer acceptance until the mid-1990s; now, computers and smart devices are an indispensable part of society. The benefits of the computer age are difficult to gauge in simple fashion. MIT’s Nobel Prize-winning economist Robert Solow stated during the internet boom of the 1990s: “You could see the computer age everywhere but in the productivity statistics.” Why? One explanation is that GDP is an imperfect measure for capturing meaningful data and translating technology’s impact on productivity, sustainability and overall well-being. The same can be said of the Gini coefficient used to measure income distribution and economic inequality among a huge swath of the population.


Zero-Trust Model Gains Luster Following Azure Security Flaw

In light of this coming tsunami, enterprises need to rethink their security strategies to embrace zero-trust and identity-based authentication. Both of those strategies are ones that experts recommend for dealing with risks like those posed by the ChaosDB vulnerability. And they will help prepare enterprises for future problems of the same kind, where much of the underlying architecture and processes are out of their control. "The cloud provider can become a single point of failure," said Dan Petro, lead researcher at security testing firm Bishop Fox. And as the industry moves even further toward serverless infrastructure, vulnerabilities like ChaosDB are likely to increase in occurrence and severity, he told Data Center Knowledge. "Anytime we have these highly visible, high-profile weaknesses, attackers are going to notice that, and it's going to inspire similar attacks, similar offensive research," said Mark Orlando, co-founder and CEO at Bionic Cyber; security operations instructor at the SANS Institute; and former security team manager at the Pentagon, the White House and the Department of Energy.


The common vulnerabilities leaving industrial systems open to attack

According to the research, industrial systems are especially open to attack when there’s a low level of protection around an external network perimeter that is accessible from the internet. Device misconfigurations and flaws in network segmentation and traffic filtering are also leaving the industrial sector particularly vulnerable. Lastly, the report also cites the use of outdated software and dictionary passwords as risky vulnerabilities. To uncover these insights, the researchers set out to actually imitate hackers and see what path they’d take to gain access. “When analyzing the security of companies’ infrastructure, Positive Technologies experts look for vulnerabilities and demonstrate the feasibility of attacks by simulating the actions of real hackers,” reads the report. “In our experience, most industrial companies have a very low level of protection against attacks.” Once inside the internal network, Positive Technologies found that attackers can obtain user credentials and full control over the infrastructure in 100% of cases. 


8 must-ask security analyst interview questions

For those who excel in cybersecurity, their interest in the topic is not a 9-to-5 thing; it’s a passion that pervades their everyday lives. To find out if that’s the case, Lindemoen likes to ask about the candidates’ home network setup. “I look for whether they’re using WPA2 vs. WPA and WEP and whether they set up a separate network for when guests use their home wireless network,” he says. “They’re simple things, but it provides some insight into how they think about security in their personal lives.” Lindemoen also asks about which cybersecurity conferences they’d most like to attend if they could, and why. Rather than naming a well-known conference, “they might mention one that’s in a niche they’re focused on or are truly passionate about.” Participation in capture-the-flag (CTF) and other cyber calisthenics events and activities is another good barometer, Glavach says. Because these programs are free, they can be even better about revealing passion than costly certifications are. “If there’s a candidate with no certifications but they participated in CTFs similar to a DEFCON CTF or a SANS Holiday Hack, that shows me they’re very committed,” he says.


10 Most Practical Data Science Skills You Should Know in 2022

It’s one thing to build a visually stunning dashboard or an intricate model with over 95% accuracy. But if you can’t communicate the value of your projects to others, you won’t get the recognition you deserve and, ultimately, you won’t be as successful in your career as you should be. Storytelling refers to “how” you communicate your insights and models. Conceptually, if you were to think about a picture book, the insights/models are the pictures and the “storytelling” is the narrative that connects all of the pictures. Storytelling and communication are severely undervalued skills in the tech world. From what I’ve seen in my career, this skill is what separates juniors from seniors and managers. ... A/B testing is a form of experimentation where you compare two different groups to see which performs better on a given metric. A/B testing is arguably the most practical and widely used statistical concept in the corporate world. Why? A/B testing allows you to compound hundreds or thousands of small improvements, resulting in significant changes and improvements over time.
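A minimal A/B test can be run as a two-sided, two-proportion z-test using only the standard library. The conversion numbers below are made up for illustration:

```python
from math import sqrt, erf

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates
    between variant A (conv_a / n_a) and variant B (conv_b / n_b)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B lifts conversion from 10.0% to 11.5% over 10,000 users each:
z, p = two_proportion_ztest(1000, 10000, 1150, 10000)
print(round(z, 2), p < 0.05)  # z ≈ 3.4, significant at the 5% level
```

Each “small improvement” the excerpt mentions is typically one such test; the compounding comes from shipping only the variants that clear significance.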


How To Address Bias-Variance Tradeoff in Machine Learning

Bias and variance are inversely related, and it is practically impossible to have an ML model with both low bias and low variance. When we modify an ML algorithm to better fit a given data set, this leads to low bias but increases the variance: the model fits the data set more closely while the chance of inaccurate predictions on new data grows. The same applies when creating a low-variance model with higher bias: although this reduces the risk of inaccurate predictions, the model will not properly match the data set. Hence it is a delicate balance between bias and variance. But a higher variance does not by itself indicate a bad ML algorithm; machine learning algorithms should be designed so that they are able to handle some variance. Underfitting occurs when a model is unable to capture the underlying pattern of the data. Such models usually exhibit high bias and low variance. It happens when we have very little data to build a model, or when we try to build a model with linear features from nonlinear data.
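The tradeoff can be illustrated with two extreme models on the same noisy nonlinear data: a constant predictor (high bias, underfits and ignores the pattern) versus a 1-nearest-neighbour predictor (high variance, memorises every noisy training point). The data and models are illustrative, not from the article:

```python
import random

random.seed(0)
f = lambda x: x * x  # true nonlinear signal
train = [(k / 10, f(k / 10) + random.gauss(0, 0.05)) for k in range(20)]
test  = [(k / 10 + 0.05, f(k / 10 + 0.05)) for k in range(20)]

def mse(model, data):
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

# High bias: predict the global mean everywhere (ignores x entirely).
mean_y = sum(y for _, y in train) / len(train)
constant_model = lambda x: mean_y

# High variance: 1-nearest-neighbour reproduces every noisy point exactly.
def nn_model(x):
    return min(train, key=lambda p: abs(p[0] - x))[1]

print("constant train/test MSE:", mse(constant_model, train), mse(constant_model, test))
print("1-NN     train/test MSE:", mse(nn_model, train), mse(nn_model, test))
```

The constant model has large error everywhere (underfitting), while 1-NN achieves zero training error yet still errs on unseen points because it has fitted the noise, which is exactly the balance the excerpt describes.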


The benefits of Bare-Metal-as-a-Service for fintech

Dedicated servers are a better fit for resource-heavy apps. In the world of financial services, there are a lot of transactions going on. Virtual machines are not the best choice for such an environment, since the “virtualisation tax” prevents you from using 100% of their capacity. Another issue is the distribution of the platform’s resources between users – when one of them uses too much of the server’s capacity, their neighbours pay for it. ... Bare metal solutions are often harder to order than a virtual machine, and you must wait longer for the server to be prepared for operation. Another issue is the management of the disparate infrastructure of dedicated servers, virtual machines and clouds when purchased from different providers. G-Core Labs’ new offering, Bare-Metal-as-a-Service, solves these problems. With this service, a user can get a ready-for-use dedicated server as easily as a virtual one. Just select the right features, connect a private or public network, or several networks at once, and in a few minutes, the physical server will be ready for use.


Israel’s fintech community readies for ‘dramatic’ changes in banking sector

The first calls for establishing “a unique regulatory sandbox” for fintech companies in which regulators will monitor their activities while hedging their risks, and allowing them to introduce products into the Israeli market to benefit consumers. The regulatory system proposal was coordinated by an inter-ministerial team led by the Justice and Finance ministries and included representatives from the Securities Authority, the Bank of Israel (BOI), the Capital Market Authority, the Anti-Money Laundering and Terrorist Financing Authority, and the Tax Authority. The second proposal — the one watched closely by Israeli fintech startups and the legacy banks — requires banks and financial entities to transfer information about their customers, with the customers’ approval, to technology firms that can provide these customers with information about the financial services they consume, how much exactly they are paying for them and how much they could save if they switch to another financial services provider.


5 Surefire Things That’ll Get You Targeted by Ransomware

Using a password manager has become a common practice for many, but it seems like there are a lot of people who unfortunately still don’t understand the risks. There are some valid concerns with using password managers in general—like losing access to your master file, having it fall into the wrong hands, or the issue with hosted services where your passwords are hosted by a third party. But all of those are minor compared to the issues that you’re bringing about by reusing passwords as an alternative. Sure, it’s convenient. But as soon as one of your accounts is compromised, you’re going to run into a lot of trouble on many fronts. And this happens more often than you might think; companies get attacked regularly, and credentials are leaked as a result. ... As an extension to the above, watch out for the kinds of contacts you make online. People might not be who they claim, and you should always keep an eye open for potential shady intentions. When you combine this with some of the above points, things can get quite scary. Some people might target you because they’ve gathered information about you from other sources, and they can make the whole interaction seem very natural and legitimate.


Utilising digital skills to tackle climate change

Upskilling is crucial to the major transition that the energy industry is currently going through. A 2020 report by EY on oil and gas digital transformation found that 43% of respondents cite “too few workers with the right skills in the current workforce” as a major challenge to digital technology adoption. Upskilling will not only equip workers with new skills but also enable organisations to reach their digital transformation goals. By embracing the rapid pace of innovation through upskilling, employers can take a proactive and agile approach to keeping workforces engaged and employees focused on their own personal development. That’s not to say the skills current workers hold are not useful for today’s needs, as many in the energy industries possess transferable skills. Workers typically possess foundational knowledge in STEM fields, along with soft skills, which can be integrated seamlessly into newer applications. For example, skills from the oil, gas and coal sectors can be brought into the growing renewable energy sector, where job opportunities are rising sharply.



Quote for the day:

"Becoming a leader is synonymous with becoming yourself. It is precisely that simple, and it is also that difficult." -- Warren G. Bennis

Daily Tech Digest - September 05, 2021

Digital State IDs Start Rollouts Despite Privacy Concerns

To assuage security fears that come with storing people’s identity on its devices, Apple is asserting that state DLs and IDs stored in Wallet on iPhone and Apple Watch will “take full advantage of the privacy and security” built into the devices. Apple’s mobile ID implementation supports ISO 18013-5 mDL, the mobile driver’s license standard being used by the government for storing digital identities. Apple played an active role in developing the standard, which the company said sets clear guidelines for the industry about how to protect consumers’ privacy when presenting an ID or driver’s license through a mobile device. Moreover, Apple devices will encrypt ID data to protect it against potential theft by threat actors, with DLs and IDs stored in Wallet presented digitally through encrypted communication directly between the device and the identity reader, the company said. This precludes the need for users to unlock, show or hand over their device to someone. Additionally, the use of Face ID and Touch ID will ensure that only the person who added the ID to the device can present it or view it, according to Apple.


6 cybersecurity training best practices for SMBs

SMB owners and staff may know what cybersecurity risks are making the rounds—phishing, for example—but do they understand why these risks matter to the organization and themselves? Do they know what's required to reduce the risk? "It's important to note that raising security awareness is the goal," Poriete said. "Security communication, culture and training are different types of methods that can be used to help SMEs get there." Each company has to decide whether to develop the training in-house or find a consultant specializing in cybersecurity to recommend or create a training program specific to the company's needs. ... Learning about cybersecurity can be complex, and instructors provide too much information more often than not. The person responsible for training must avoid overloading employees with information they're unlikely to remember. "Training shouldn't be a one-off exercise but a regular activity to help maintain employees' level of awareness," Poriete said. "Think short, sharp exercises so as not to interrupt their core work or create security fatigue."


How Uber is Leveraging Apache Kafka For More Than 300 Micro Services

Uber has overcome the pub-sub message queueing system issues by implementing features via a client-side SDK, and the team chose a proxy-based approach. The engineering team supports multiple programming languages, with services written in Go, Java, Python, and NodeJS. While traditionally each language would require its own client library, Consumer Proxy makes it possible to implement the consumer logic once and expose it to services written in any language. This approach also makes it easier for the team to manage the 1,000-plus microservices that Uber runs. Since the message-pushing protocols remain unchanged, the Kafka team can upgrade the proxy at any time without affecting other services. Consumer Proxy also helps limit the blast radius of rebalancing storms caused by rolling restarts: it rebalances the consumer group by decoupling message-consuming nodes from the message-processing services, and the service can eliminate the effects of rebalancing storms by implementing its own group rebalance logic.
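The decoupling idea can be sketched in miniature. The following is an illustrative Python sketch, not Uber's actual Consumer Proxy (the `ConsumerProxy` class and its methods are invented for the example): a single consuming loop hands messages off to a pool of processing workers, so a slow handler never stalls consumption.

```python
import queue
from concurrent.futures import ThreadPoolExecutor

class ConsumerProxy:
    """Illustrative sketch: one consuming loop pulls messages and dispatches
    them to a worker pool. In a real deployment, this keeps the poll loop
    responsive so the node is not ejected from its consumer group."""

    def __init__(self, handler, workers=4):
        self.handler = handler
        self.pool = ThreadPoolExecutor(max_workers=workers)
        self.results = queue.Queue()

    def consume(self, source):
        # The "consumer" thread only pulls and dispatches; processing
        # happens concurrently on the worker pool.
        for msg in source:
            self.pool.submit(self._handle, msg)
        self.pool.shutdown(wait=True)
        return sorted(self.results.queue)

    def _handle(self, msg):
        try:
            self.results.put(self.handler(msg))
        except Exception:
            self.results.put(("failed", msg))  # dead-letter in a real system

proxy = ConsumerProxy(handler=lambda m: m * m)
print(proxy.consume(range(10)))  # → [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

Because processing failures are captured per message rather than crashing the consumer, the consuming side stays stable while handlers misbehave, which is the property the article attributes to the proxy-based design.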


Deleting unethical data sets isn’t good enough

Scraping the web for images and text was once considered an inventive strategy for collecting real-world data. Now laws like GDPR (Europe’s data protection regulation) and rising public concern about data privacy and surveillance have made the practice legally risky and unseemly. As a result, AI researchers have increasingly retracted the data sets they created this way. But a new study shows that this has done little to keep the problematic data from proliferating and being used. The authors selected three of the most commonly cited data sets containing faces or people, two of which had been retracted; they traced the ways each had been copied, used, and repurposed in close to 1,000 papers. In the case of MS-Celeb-1M, copies still exist on third-party sites and in derivative data sets built atop the original. Open-source models pre-trained on the data remain readily available as well. The data set and its derivatives were also cited in hundreds of papers published between six and 18 months after retraction. DukeMTMC, a data set containing images of people walking on Duke University’s campus and retracted in the same month as MS-Celeb-1M, similarly persists in derivative data sets and hundreds of paper citations.


Cleveland Clinic develops bionic arm that restores ‘natural behaviors’

It enables patients to send nerve impulses from their brains to the prosthetic when they want to use or move it, and to receive physical information from the environment and relay it back to their brain through their nerves. The artificial arm’s bi-directional feedback and control enabled study participants to perform tasks with a similar degree of accuracy as non-disabled people. “Perhaps what we were most excited to learn was that they made judgments, decisions and calculated and corrected for their mistakes like a person without an amputation,” said Dr Marasco, who leads the Laboratory for Bionic Integration. “With the new bionic limb, people behaved like they had a natural hand. Normally, these brain behaviors are very different between people with and without upper limb prosthetics.” Dr Marasco also has an appointment in Cleveland Clinic’s Charles Shor Epilepsy Center and the Cleveland VA Medical Center’s Advanced Platform Technology Center.


Can healthcare avoid another AI winter?

"AI winter" refers to a period of disillusionment with AI, marked by reduced investments and progress, which follow periods of high enthusiasm and interest in AI technology. There have been two AI winters: one between the mid-1980s and early 1990s and another in the late 1970s and early 1980s, in which expert systems and practical artificial neural networks rose to prominence. However, it became clear that these expert systems had limitations that prevented them from living up to expectations. This resulted in the second AI winter, a period of decreased AI research funding and a decline in general interest in AI. According to the Gartner Hype Cycle, we now are at risk of another AI winter in healthcare due to several AI solutions falling short of their initial hype, including natural language processing, deep learning and machine learning, which is decreasing trust in AI by users. Recent examples that highlight the growing concern over inappropriate and disappointing AI solutions include racial bias in algorithms supporting healthcare decision-making, unexpected poor performance in cancer diagnostic support or inferior performance when deploying AI solutions in real-world environments.


These 'technology scouts' are hunting for the next big thing in tech. Here's how they do it

Setting up a strategy for discovering emerging technologies might seem like a daunting task, especially for smaller organisations, but a growing number of tools are now being built to help. Mergeflow, for example, is a Germany-based startup that automates the process of hunting for innovation. "People come to us because they know that there is something somewhere," Florian Wolf, the founder of Mergeflow, tells ZDNet. "It's pretty much all in the web, but you can't collect and analyse all of the data by yourself. It takes too long. You need automation to do that." Mergeflow's software, which was used by BMW to build the company's tech radar, scans thousands of scientific and technological publications, patents, news, market analyses, investor activities and other data every day. Users can search for a concept or a category and immediately access hundreds of potential innovations that are related to their query. The company's algorithm also looks at startups and companies working on each specific innovation to find out how mature they are, based on data like venture funding or collaborations with other researchers and inventors.


How to manage the growing costs of cyber security

Technological solutions aren’t the be all and end all of cyber security, but they do play a major role in an organisation’s defences. This is truer now than ever, as organisations find innovative ways to use tech. Cloud services have shifted into the mainstream in recent years, and they will only become more popular as businesses embrace remote working. Consider the fact that employees are now spread across the country or even across the globe, meaning countless new organisational endpoints, each of which is vulnerable to an attack and must be protected. These defences rely on continuous, end-to-end monitoring and the ability to analyse threat data from multiple sources in real time. Threat monitoring tools should work in combination with a variety of other technologies – including anti-malware, encryption tools and firewalls as part of a holistic approach to security. But that’s only one part of the equation. For these tools to be effective, organisations need experts to implement them correctly and respond appropriately to the data they gather.

IT Leadership: 10 Ways to Unleash Enterprise Innovation

Innovation never sleeps. It evolves, it accelerates, it takes different forms. In fact, organizations that want to unleash innovation are wise to discover what stifles it so they can remove the constraints. For example, innovation historically resided in research and development (R&D) departments, but now organizational leaders are more inclined to behave as though innovation can come from anywhere. In fact, some organizations believe in democratizing innovation so much that they encourage experimentation, host competitions and may even provide financial incentives. According to Jeff Wong, global chief innovation officer at multinational professional services network EY, CEOs are realizing they can't rely on a traditional innovation team when the context of a company's competition has changed. For example, retail banks used to compete against each other by stealing each other's accounts, but the same tactic won't work when the new competition is cryptocurrencies or a social network that offers stored value or investment alternatives.


Applying Genetic Engineering to your Organization Culture

Organizational culture is the organization’s behavioural blueprint; we can also call it Organization DNA. It includes the unspoken instructions for how one should behave as part of the organization: the boundaries of human behaviour in the working environment. This concept of hidden behavioural codes that are unique to each organization has been demonstrated many times; when an employee from one organization joins another, they sense those codes and change their behaviour. One of the greatest challenges in finding a suitable mechanism for manipulating the behavioural codes was that most genetic engineering concepts did not work at scale. Many of the methods needed very specific indicators in order to locate the specific cells that were candidates for manipulation. Deepening the investigation, we came across a field of science called epigenetics. This field explores the environmental influence on DNA replication, and has scientifically proved that cell development is influenced not only by the DNA blueprint, but also by the cell's environment.



Quote for the day:

"A leadership disposition guides you to take the path of most resistance and turn it into the path of least resistance." -- Dov Seidman

Daily Tech Digest - September 04, 2021

AMD files teleportation patent to supercharge quantum computing

AMD has filed a patent for 'teleportation,' meaning things could be about to get much more efficient around here. With the incredible technological feats humanity achieves on a daily basis, and Nvidia's Jensen going off on one last year about GeForce holodecks and time machines, it's easy to slip into a headspace that lets us believe genuine human teleportation is just around the corner. "Finally," you sigh, mouthing the headline to yourself. "Goodbye work commute, hello popping to Japan for authentic ramen on my lunch break." ... Essentially, the 'out-of-order' execution method AMD is looking to lay claim to ensures that qubits which would otherwise be left idle—waiting for their calculation step to come around—are able to execute independent of a prior result. Where usually they would need to wait for previous qubits to provide instructions, they can calculate simultaneously, with no need to wait in line. So, no, we're not going to be zipping through wormholes just yet. But if AMD's designs come through, we could be looking at a much more efficient, scalable and stable quantum computing architecture than we have now.
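The scheduling idea itself is classical and easy to illustrate. Here is a toy Python sketch (not AMD's patented method; the dependency-graph representation is an assumption made for the example) showing how operations whose inputs are already available can be issued together rather than waiting their turn in strict program order:

```python
def out_of_order_schedule(ops):
    """Toy scheduler: 'ops' maps an operation name to the set of operations
    whose results it needs. Each round, every operation whose dependencies
    are complete is issued, instead of waiting in strict program order."""
    done, rounds = set(), []
    while len(done) < len(ops):
        ready = sorted(op for op, deps in ops.items()
                       if op not in done and deps <= done)
        if not ready:
            raise ValueError("cyclic dependency")
        rounds.append(ready)
        done.update(ready)
    return rounds

# Gate c depends on a; d depends on b; e needs both c and d.
circuit = {"a": set(), "b": set(), "c": {"a"}, "d": {"b"}, "e": {"c", "d"}}
print(out_of_order_schedule(circuit))  # → [['a', 'b'], ['c', 'd'], ['e']]
```

In program order the five gates would take five steps; dependency-driven issue completes them in three rounds, which is the kind of idle-time reduction the patent describes for qubits.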


The Internet of Things Requires a Connected Data Infrastructure

Not long ago, a terabyte of information was an enormous amount and might be the foundation for solid decision-making. These days, it won’t cut it. For example, looking at a terabyte of data might yield a decision that’s 70% accurate. But leaving 30% to chance is unacceptable when it comes to real-time vehicle safety. On the other hand, having the ability to ingest and process 40 terabytes — from all sources, edge to core — can result in an accuracy rate well exceeding 90%. Something jumps in front of your car — is it a person, a dog, a trash bag, a child’s ball? Real-time systems need to determine the level of risk and react within milliseconds, so real-time processing has to be done closer to where the decisions are being made. In terms of IoT, a lot of questions can be answered by using a digital twin. A digital twin creates additional layers of insight, provides a better understanding of what’s happening in any given situation, and helps decide on the most appropriate course of immediate action. Digital twins take insight not just from the raw sensors — the edge compute nodes — but from a combination of real-time data at the edge and historical data at the core.
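The edge-plus-core blend can be sketched very simply. The following Python sketch is purely illustrative (the `DigitalTwin` class, the readings, and the threshold are all invented for the example): historical data from the core establishes a baseline, and a live edge reading is judged against it in real time.

```python
class DigitalTwin:
    """Minimal sketch: a twin that blends live edge readings with a
    historical baseline accumulated at the core to classify risk."""

    def __init__(self, history):
        self.history = list(history)              # data accumulated at the core
        self.baseline = sum(history) / len(history)

    def assess(self, live_reading, threshold=1.5):
        # Compare the real-time edge value against the historical norm.
        ratio = live_reading / self.baseline
        return "brake" if ratio > threshold else "continue"

twin = DigitalTwin(history=[10, 12, 11, 9, 10])   # e.g. past proximity scores
print(twin.assess(25))   # → brake
print(twin.assess(11))   # → continue
```

A real deployment would keep the baseline model at the core and push only the lightweight `assess` step to the edge node, which is the division of labour the article describes.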


Can Your Organization Benefit from Edge Data Centers?

Organizations considering a move to edge computing should begin their journey by inventorying their applications and infrastructure. It's also a good idea to assess current and future user requirements, focusing on where data is created and what actions need to be performed on that data. "Generally speaking, the more susceptible data is to latency, bandwidth, or security issues, the more likely the business is to benefit from edge capabilities," said Vipin Jain, CTO of edge computing startup Pensando. “Focus on a small number of pilot projects and partner with integrators/ISVs with experience in similar deployments." Fugate recommended examining business functions and processes and linking them to the application and infrastructure services they depend on. "This will ensure that there isn’t one key centralized service that could stop critical business functions," he said. "The idea is to determine what functions must survive regardless of an infrastructure or connectivity failure." Fugate also advised determining how to effectively manage and secure distributed edge platforms.

How to Speed Up Your Digital Transformation

The complexity-in-use is often overlooked in digitalization projects because those in charge think that accounting for task and system complexity independent of one another is enough. In our case, at the beginning of the transformation, tasks and processes were considered relatively stable and independent from the new system. As a result, the loan-editing clerks were unable to complete business-critical tasks for weeks, and management needed to completely reinvent their change management approach to turn the project around and overcome operational problems in the high complexity-in-use area. They brought in more people to reduce the backlog, developed new training materials, and even changed the newly implemented system — a problem-solving technique organizations with smaller budgets wouldn’t find easy to deploy. In the end, our study partner managed this herculean task, but it took them months to get the struggling departments back on track.


Ecosystems at The Edge: Where the Data Center Becomes a Marketplace

Rapidly evolving edge computing architectures are often seen as a way for businesses to enable new applications that require low latency and place computing close to the origin of data. While those are important use cases, what is less often discussed is the opportunity for businesses to leverage the edge to spawn ecosystems that generate new revenue. To realize this value, companies must think of the edge as more than just a collection point for data from intelligent devices. They should broaden their vision to see the edge as a new business hub. These small data centers can evolve into full-fledged service providers that attract local businesses, generate e-commerce transactions and enable interconnections that never touch the central cloud. Edge computing is an expansion of cloud infrastructure that moves data collection, processing and services closer to the point at which data is created or used. It is the fastest-growing segment of the cloud category with the total market expected to expand 37% annually through 2027, according to Grand View Research.


NSA: We 'don't know when or even if' a quantum computer will ever be able to break today's public-key encryption

In the NSA's summary, a CRQC – should one ever exist – "would be capable of undermining the widely deployed public key algorithms used for asymmetric key exchanges and digital signatures" – and what a relief it is that no one has one of these machines yet. The post-quantum encryption industry has long sought to portray quantum computing as an immediate threat to today's encryption, as El Reg detailed in 2019. "The current widely used cryptography and hashing algorithms are based on certain mathematical calculations taking an impractical amount of time to solve," explained Martin Lee, a technical lead at Cisco's Talos infosec arm. "With the advent of quantum computers, we risk that these calculations will become easy to perform, and that our cryptographic software will no longer protect systems." Given that nations and labs are working toward building crypto-busting quantum computers, the NSA said it was working on "quantum-resistant public key" algorithms for private suppliers to the US government to use, having had its Post-Quantum Standardization Effort running since 2016.
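Lee's point can be made concrete with a toy sketch. This is illustrative only, not real cryptography (the primes here are absurdly small): multiplying two primes is instant, while recovering them by search is the "impractical amount of time" at real key sizes, and Shor's algorithm on a CRQC would collapse that asymmetry.

```python
# Toy illustration: RSA-style security rests on the asymmetry that
# multiplying two primes is easy but recovering them from the product is hard.
p, q = 10007, 10009
n = p * q            # the public value: trivial to compute

def factor(n):
    """Naive trial division. At real key sizes (a 2048-bit n), this search
    is classically infeasible; Shor's algorithm on a large enough quantum
    computer would make it easy, which is the threat the NSA is weighing."""
    i = 2
    while i * i <= n:
        if n % i == 0:
            return i, n // i
        i += 1
    return None

print(factor(n))  # → (10007, 10009), but only because n is tiny
```

The quantum-resistant algorithms the NSA mentions are built on different mathematical problems, ones for which no known quantum speed-up exists.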

There are multiple ways that AI could become a detriment to society. Machine learning, a subfield of AI, learns from vast quantities of data and hence carries the risk of perpetuating data bias. AI use cases including facial recognition and predictive analytics could adversely impact protected classes in areas such as loan rejection, criminal justice and racial bias, leading to unfair outcomes for certain people. ... AI is only as good as the data that is used to train it. From an industry perspective, this is problematic given there is often a lack of training data for true failures in critical systems. This becomes dangerous when a wrong prediction leads to potentially life-threatening events such as manufacturing accidents or oil spills. This is why a focus on hybrid AI and “explainable AI” is necessary. ... Unfortunately, cybercriminals have historically been better and faster adopters of technology than the rest of us. AI can become a detriment to society when deepfakes and deep learning models are used as vehicles for social engineering by scammers to steal money, sensitive data and confidential intellectual property by pretending to be people and entities we trust.


Reviewing the Eight Fallacies of Distributed Computing

The challenges of distributed systems, and the broad science around the techniques and mechanisms used to build them, are now well researched. The thing you learn when addressing these challenges in the real world, however, is that academic understanding only gets you so far. Building distributed systems involves engineering pragmatism and trade-offs, and the best solutions are the ones you discover by experience and experiment. ... However, the engineering reality is that multiple kinds of failures can, and will, occur at the same time. The ideal solution now depends on the statistical distribution of failures; or on analysis of error budgets, and the specific service impact of certain errors. The recovery mechanisms can themselves fail due to system unreliability, and the probability of those failures might impact the solution. And of course, you have the dangers of complexity: solutions that are theoretically sound, but complex, might be far more complicated to manage or understand whenever an incident takes place than simpler mechanisms that are theoretically not as complete.
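One concrete example of such a trade-off is the humble retry. The sketch below (generic Python, not from any particular system; the attempt count and backoff constants are placeholder assumptions) shows exponential backoff with jitter, and its comments flag exactly the pragmatic questions the article raises: how many attempts, and at what cost in added complexity.

```python
import random

def with_retries(call, attempts=4, base_delay=0.1, rng=random.Random(0)):
    """Retry with exponential backoff and jitter. Whether 4 attempts is
    right depends on the observed failure distribution and the error
    budget; the recovery mechanism itself is extra machinery that can
    fail or confuse an incident investigation."""
    delays = []
    for attempt in range(attempts):
        try:
            return call(), delays
        except ConnectionError:
            # Jitter desynchronises clients so retries don't arrive in
            # storms. A real implementation would time.sleep(delay) here;
            # this sketch only records the computed delays.
            delays.append(base_delay * (2 ** attempt) * rng.uniform(0.5, 1.5))
    raise RuntimeError("all retries exhausted")

# Simulated flaky dependency: fails twice, then succeeds.
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("transient network failure")
    return "ok"

result, backoffs = with_retries(flaky)
print(result, len(backoffs))  # → ok 2
```

Even this small mechanism illustrates the article's closing warning: it handles one failure mode cleanly, but says nothing about simultaneous failures, and every parameter is a judgment call rather than a theoretically derived constant.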


Machine Learning Algorithm Sidesteps the Scientific Method

We might be most familiar with machine learning algorithms as they are used in recommendation engines, facial recognition, and natural language processing applications. In the field of physics, however, machine learning algorithms are typically used to model complex processes such as plasma disruptions in magnetic fusion devices, or the dynamic motions of fluids. In the case of this work by the Princeton team, the algorithm skips the interim step of being explicitly programmed with the conventions of physics. “The algorithms developed are robust against variations of the governing laws of physics because the method does not require any knowledge of the laws of physics other than the fundamental assumption that the governing laws are field theories,” said the team. “When the effects of special relativity or general relativity are important, the algorithms are expected to be valid as well.” The researchers’ approach was inspired in part by Oxford philosopher Nick Bostrom’s philosophical thought experiment that the universe may be a computer simulation.


What's the Real Difference Between Leadership and Management?

Leaders, like entrepreneurs, are constantly looking for ways to add to their world of expertise. They tend to enjoy reading, researching and connecting with like-minded individuals; they constantly aim to grow. They are usually open-minded and seek opportunities that challenge them to expand their level of thinking, which in turn leads to developing more solutions to problems that may arise. Managers, many times, rely on existing knowledge and skills by repeating proven strategies or behaviors that may have worked in the past to help maintain a steady track record within their field of success with clients. ... Leaders create trust and bonds between their mentees that go beyond expression or definition. Their mentees become raving fanatics willing to go above and beyond the usual scope of supporting their leader in achieving his or her mission. In the long run, the overwhelming support from his or her fanatics helps increase the value and credibility of the leader. On the other hand, managers direct, delegate, enforce and advise either an individual or group that typically represents a brand or organization looking for direction. Followers do as they are told and rarely ask questions. 



Quote for the day:

"Most people don't know how AWESOME they are, until you tell them. Be sure to tell them." -- Kelvin Ringold

Daily Tech Digest - September 03, 2021

What is a Botnet – Botnet Definition and How to Defend Against Attacks

Building a successful botnet requires thinking through the goal: a sustainable business plan, a target audience (whose devices are going to be infected, and what lure would appeal to them?), and processes to ensure that distribution and internal operations are secure. Then, a prospective botnet herder needs to start with a VPN service which takes anonymous forms of payment (possibly several services to rotate between). These services need to be unlikely to quickly hand over customer records and logs to any law enforcement agencies (a 'bulletproof' service). The next step is getting access to 'bulletproof' hosting (either a somewhat legitimate business which is *inefficient* at processing legal complaints, or one specifically aimed at malware operators). Then, the herder needs domains from a registrar which is unlikely to hand over customer information to law enforcement and which accepts anonymous methods of payment. Optionally, a herder can further disguise their activity with a technique like fast flux, which can be either single or double flux.


Soft Skills For Solution Architects — Moving Beyond Technical Competence

Solution Architects’ ability to reimagine solution design, business processes, and the customer journey, along with their business acumen, will be one of their most important differentiators. You need to be innovative enough to design and deliver business functions while keeping business constraints, like time, budget, quality, and available human resources, in mind. Solution Architects need to challenge the existing processes and assumptions of the industry and reimagine new processes and flows for customer journeys. Additionally, they need to possess the ability to emphasize customer experience over technology. Solution Architects need to shift their mindset and ensure that the product or service the business offers is focused on decoding the needs and demands of its stakeholders, rather than touting a technology that is difficult to navigate. ... In the past, the Solution Architect role was seen as a bridge between the Infra Architect, Network Architect, Security Architect, Storage Architect, Application Architect, and Database Architect.


Low-Code and Open Source as a Strategy

Yes, there is a “but”. For instance, our system needs an existing database. The end application will also be database-centric, implying it’s for the most part only interesting for CRUD systems, where CRUD stands for Create, Read, Update and Delete. However, the last figures I saw on this were that there are 26 million software developers in the world. These numbers are a bit old, and are probably much larger today. Regardless, the ratio is probably still the same, and the ratio tells us that 80% of these software developers work as “enterprise software developers.” An enterprise software developer is a developer working for a non-software company, where software is a secondary function. ... This implies that if you adopt Low-Code and Open Source as a strategy for your enterprise, you can optimize the way your software developers work by (at least) 5x, probably much more, simply because at least 80% of the work they would otherwise do manually is as simple as clicking a button and waiting a second for the automation process to deliver its result.
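The back-of-the-envelope arithmetic behind that "5x" figure is worth spelling out: if roughly 80% of the manual work is automated away, only 20% of the original effort remains.

```python
# If ~80% of routine work is automated, 20% of the original effort remains,
# so throughput rises by a factor of 1 / 0.2 = 5.
automated_share = 0.80
remaining_share = 1 - automated_share
speedup = 1 / remaining_share
print(round(speedup, 1))  # → 5.0
```

This is why the author calls 5x a floor: any automation beyond the 80% of boilerplate CRUD work pushes the multiplier higher.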


5 Rock-Solid Leadership Strategies That Drive Success

As a leader, one of the most important actions you can take is being fully engaged in your company. All too often, leaders lose touch with the nuts and bolts of their businesses. Many millennials tend to be over-delegators, delegating almost every component of their business to the point that they are unable to make the right high-level decisions, because they lack a clear understanding of what is happening at the ground level. The front-line workers of an organization tend to be the ones who interact directly with customers. When leaders rely on their executive team to find out front-line information, much can get lost in translation. A fully engaged leader knows exactly what is happening on the front line of his or her company and doesn’t hide in an ivory tower relying on others to get a pulse on the business. Full engagement in your company requires discipline as well as humility. A fully engaged CEO is one who regularly communicates directly with front-line workers and listens carefully.


Bluetooth Bugs Open Billions of Devices to DoS, Code Execution

One of the DoS bugs (CVE-2021-34147) exists because of a failure in the SoC to free resources upon receiving an invalid LMP_timing_accuracy_response from a connected BT device (i.e., a “slave,” in the paper’s terminology): “The attacker can exhaust the SoC by (a) paging, (b) sending the malformed packet, and (c) disconnecting without sending LMP_detach,” researchers wrote. “These steps are repeated with a different BT address (i.e., BDAddress) until the SoC is exhausted from accepting new connections. On exhaustion, the SoC fails to recover itself and disrupts current active connections, triggering firmware crashes sporadically.” The researchers were able to forcibly disconnect slave BT devices from Windows and Linux laptops, and cause BT headset disruptions on Pocophone F1 and Oppo Reno 5G smartphones. Another DoS bug (CVE pending) affects only devices using the Intel AX200 SoC. It’s triggered when an oversized LMP_timing_accuracy_request (i.e., bigger than 17 bytes) is sent to an AX200 slave.


9 notable government cybersecurity initiatives of 2021

In January, the US Department of Defense (DoD) released the Cybersecurity Maturity Model Certification (CMMC), a unified standard for implementing cybersecurity across the defense industrial base (DIB), which includes over 300,000 companies in the supply chain. The CMMC reviews and combines various cybersecurity standards and best practices, mapping controls and processes across several maturity levels that range from basic to advanced cyber hygiene. “For a given CMMC level, the associated controls and processes, when implemented, will reduce risk against a specific set of cyber threats,” reads the Office of the Under Secretary of Defense for Acquisition & Sustainment website. “The CMMC effort builds upon existing regulation (DFARS 252.204-7012) that is based on trust by adding a verification component with respect to cybersecurity requirements.” The CMMC is designed to be cost-effective and affordable for all organizations, with authorized and accredited CMMC third parties conducting assessments and issuing CMMC certificates to DIB companies at the appropriate level.


In-Memory Database Architecture: Ten Years of Experience Summarized

Tarantool also has an ACID transactions mechanism. Single-threaded access to data lets us achieve the ‘serializable’ isolation level: every write, read, or modification against the arena happens sequentially, and exclusively, in one thread. Two fibers cannot be executed in parallel. For interactive transactions, there is a separate MVCC engine. It makes it possible to execute interactive transactions in serializable mode; however, potential conflicts between transactions need to be handled additionally. Apart from the Lua access engine, Tarantool has SQL. We have often used Tarantool as a relational database, and we realized that we had designed it according to relational principles: we used spaces where SQL uses tables, each row is represented by a tuple, and we defined a schema for our spaces. It became clear that we could take any SQL engine, map the primitives, and execute SQL on top of Tarantool. In Tarantool, SQL can be invoked from Lua: we can either use SQL directly or call from SQL what was defined in Lua.
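Why single-threaded, cooperative fibers give serializable behavior can be sketched in a few lines. The scheduler below is a toy model, not Tarantool's implementation: Python generators stand in for fibers, and control switches only at explicit yield points, so each read-modify-write between yields runs exclusively and no updates are lost.

```python
# Toy cooperative scheduler: generators stand in for Tarantool fibers.
# Only one "fiber" runs at a time; a fiber gives up control only at an
# explicit yield, so work between yields is effectively atomic.

from collections import deque

def fiber(space, key, increments):
    """Increment space[key] repeatedly, yielding between transactions."""
    for _ in range(increments):
        # Read-modify-write with no yield in between: no other fiber
        # can interleave here, so no lost updates are possible.
        space[key] = space.get(key, 0) + 1
        yield  # cooperative yield: the scheduler may switch fibers now

def run(fibers):
    """Round-robin scheduler: resume each fiber until all finish."""
    queue = deque(fibers)
    while queue:
        f = queue.popleft()
        try:
            next(f)
            queue.append(f)
        except StopIteration:
            pass

space = {}
run([fiber(space, "counter", 1000),
     fiber(space, "counter", 1000)])
print(space["counter"])  # 2000: the updates serialize, none are lost
```

With preemptive threads, the same two loops could interleave mid-increment and lose updates without a lock; the cooperative model makes that interleaving impossible by construction, which is the property Tarantool's arena access relies on.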


Low code cuts down on dev time, increases testing headaches

Ironically, the draw of low-code for many companies is that it allows anyone to build applications, not just developers. But when bugs arise, citizen developers might not have the expertise needed to resolve those issues. “Low-code solutions that are super accessible for the end-user often feature code that’s highly optimized or complicated for an inexperienced coder to read,” said Max de Lavenne, CEO of Buildable, a custom software development firm. “Low-code builds will likely use display or optimization techniques that leverage HTML and CSS to their full extent, which could be more than the average programmer could read. This is especially true for low-code used in database engineering and API connections. So while you don’t need a specialized person to test low-code builds, you do want to bring your A-team.” According to Isaac Gould, research manager at Nucleus Research, a technology analyst firm, a citizen developer should be able to handle testing of simple workflows. Eran Kinsbruner, DevOps chief evangelist at testing company Perforce Software, noted that there could be issues when more advanced tests are needed. 


Digital transformation – it’s a people problem

Reinbold says that it is vital to “shrink the change you’re trying to accomplish” once momentum towards change has been achieved: “I’ve seen way too many efforts declare some grandiose, ‘burn the boats’ type of initiative like, ‘Everybody, for all time, is going to do this thing and only this thing’. “And as you might imagine, the amount of pushback to something like that is absolutely proportional to the size of the change that is being asked for. It might be necessary, but in order to get traction, you have to build positive momentum.” His advice? Start with the uncontroversial stuff: “Ratify your process, whatever the means is – for getting that thing accepted and communicated and monitored and policed – whatever that tiny thing is, have it be uncontroversial, because you’re still figuring out how all of this works. ... The next step would be to script the critical moves. Your transformation efforts may make great viewing at 50,000 feet, but for employees in the trenches who might not understand where they are and where they need to be, the work they’re doing towards change could be confusing – and it might not make sense in their view.


Critical infrastructure today: Complex challenges and rising threats

Critical infrastructure systems face twin burdens: they often have fewer resources to invest in cybersecurity, and the critical nature of their operations attracts adversaries and focuses attention on any disruption. Combined with the increasing connectivity of these resources and assets, this leaves organizations in a tough spot, targeted ever more often by adversaries ranging from criminal elements to state-directed entities. Low margins for error, high visibility when systems fail or are compromised, and poor resourcing combine to make a complex defensive picture. ... Overall, current efforts appear to move the sector in the right direction by increasing focus and making resources available for defense. Where matters get tricky is the distinction between government-directed efforts and privately owned infrastructure operators. Ultimately, government action short of legal mandates or similar measures will only go so far in addressing issues absent action from critical infrastructure asset owners and operators. 



Quote for the day:

"The ability to summon positive emotions during periods of intense stress lies at the heart of effective leadership." -- Jim Loehr