Daily Tech Digest - June 30, 2020

3 Reasons Why Traceability Should Be a DevOps Priority

One thing that has to be central to your strategy is traceability. You may have come across the term a few times before. It’s commonly used elsewhere in the business world, especially with regard to supply chains. Basically, what it means is keeping track of a commodity or product at every stage of the production process. Records of the product’s entire manufacturing and distribution history are kept so that the sources of any problems can later be determined and dealt with. Traceability thereby ensures that suppliers can act quickly and decisively in the event of a product recall, for example. Another advantage of traceability is that it provides additional transparency, which helps to maintain consumer confidence. As consumers are becoming increasingly aware of how products are sourced and manufactured, this is now an important consideration. It reassures consumers that manufacturers and suppliers are aware of their concerns and that they’re looking out for their best interests. You can see already, then, how much of this also applies to mobile DevOps. Traceability in DevOps is about ensuring clarity, accountability and the best possible end product for the consumer. 


After this COVID winter comes an AI spring

Companies emerging from this recession will adapt processes to “vaccinate” their systems against the next pandemic. In response to supply-chain disruptions, Volkswagen is considering increasing its 3D printing capabilities in Germany, which would give the automaker a redundant parts source. The government-run Development Bank of Japan will subsidize the costs of companies that move production back to Japan. Bringing production back onshore while controlling costs will require significant investment in robotics and AI. Even companies that don’t have their own production capacity, such as online retailers, plan to use AI to improve the reliability of complex global supply chains. So a surge in demand for AI talent is inevitable. ... One relatively new risk that managers must tolerate pertains to data. Even companies that are not yet exploiting their data effectively now recognize it as a valuable resource. As startups deploy AI software systems that prove more accurate and cost-effective than human beings, their early-adopter customers must be more willing to trust them with proprietary data. That will allow AI companies to train new products and make them even smarter.


Tackling Fragmentation in Serverless Data Pipelines

Within the AWS ecosystem, a number of services stitched together provide this experience. On the analytics team at Equinox Media where I sit, we’ve embraced this architectural pattern to its fullest — forgoing self-maintained, provisioned servers for data processing and opting instead for a parade of SQS queues, SNS topics, Kinesis streams and, of course, Lambda functions. As a result, diagrams of our data pipelines bear a visual resemblance to a sixth grader’s Rube Goldberg project. And as the metaphor suggests, this paradigm presents new organizational challenges in keeping maintenance costs low. When adopting the serverless platform, one thing you’ll quickly notice is a proliferation in the number of code repositories your team maintains. This is the result of a common development pattern that calls for a 1:1 ratio of Lambda functions to repos. And while there are benefits to having your business logic fragmented into digestible, bite-sized chunks of code, there are a number of supporting services that are best not replicated and distributed among them.
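
To make the fragmentation concrete, here is a minimal Python sketch (not Equinox's actual code) of one link in such a pipeline: a Lambda function consuming an SQS batch and forwarding records to a Kinesis stream. It assumes boto3 is available, and names like STREAM_NAME and enrich are hypothetical; the point is that shared logic like enrich belongs in one common library rather than being copied into every function's repo.

```python
import json
import os

import boto3

kinesis = boto3.client("kinesis")
STREAM_NAME = os.environ.get("STREAM_NAME", "analytics-events")


def enrich(payload: dict) -> dict:
    """Placeholder for shared business logic that is best kept in a
    common library rather than replicated across every repo."""
    payload["processed"] = True
    return payload


def handler(event, context):
    # SQS invokes Lambda with a batch of messages under "Records".
    for record in event["Records"]:
        payload = enrich(json.loads(record["body"]))
        kinesis.put_record(
            StreamName=STREAM_NAME,
            Data=json.dumps(payload),
            PartitionKey=payload.get("id", "default"),
        )
    return {"batch_size": len(event["Records"])}
```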


Create Symbiotic Relationships with AI in Business

When humans have specific types of problems, we’ve built and trained machines to solve them. Examples include machine learning (ML) algorithms that can identify cancer in brain images, algorithms that can determine the best placements or designs for online ads, and deep learning systems that can predict customer churn in business. At the moment, we can only imagine how much more productive we will become as we form symbiotic relationships with AI. Routine tasks that currently take hours or days could be abbreviated to 10 or 15 minutes with the aid of a digital partner. From simple exercises like finding a new restaurant to more expert tasks such as cancer detection, we will increasingly rely on machines for everyday tasks. Dependence on machines might begin as a “second pair of eyes” or “a second opinion,” but our commitment to machines (and AI) will evolve into full-on digital collaborators. ... Machine learning could bring about a revolution in how we solve problems to which the principle of “optimal stopping” applies.
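
The excerpt leaves “optimal stopping” undefined; its best-known instance is the secretary problem, where the classic rule is to observe roughly the first n/e candidates without committing, then accept the first one that beats everything seen so far. A minimal Python sketch of that rule:

```python
import math
import random


def secretary_choice(candidates):
    """Classic optimal-stopping rule: skip the first n/e candidates,
    then take the first one better than all of them (or the last one
    if nothing better ever appears)."""
    n = len(candidates)
    cutoff = max(1, round(n / math.e))
    best_seen = max(candidates[:cutoff])
    for value in candidates[cutoff:]:
        if value > best_seen:
            return value
    return candidates[-1]


scores = [random.random() for _ in range(100)]
print("chose:", secretary_choice(scores), "best was:", max(scores))
```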


Battling Cybercriminals on the ‘Digital Frontline’

People have a degree of protection when they are sitting amongst their colleagues. When suspicious emails come in, it is far easier to speak to a colleague and verify their authenticity. However, as people are now working from home, isolated and often alone, that becomes much harder. Where web and email have been the traditional vectors for these kinds of attacks, we are now seeing phishing attempts across multiple platforms, including social media and SMS. Every nation is being targeted and phishing emails appear in almost every language. In many ways, this is the largest set of cyber campaigns we have ever seen. Many of these emails offer falsified information or promises of help related to the pandemic. In one campaign found by Proofpoint, they even promise cures – which is something that malicious actors know the public are interested in and are likely to immediately pay attention to. These attackers are after personal information from anyone and everyone, such as login credentials, name, date of birth and government ID details, or want to trick victims into installing malware on systems. A mixture of old, reskinned and relatively new malware is being used to attack users. We are looking at a cybercrime gold rush.


Where Tech Meets Community – Harnessing Tech For Good

Indeed, it is when talent, technology and collaboration come together that incredible advances can be achieved, and at scale. This is exemplified in the solidarity of the technology sector to make a difference, bringing people closer across work, learning and entertainment despite lockdowns, and combating the virus through telemedicine and AI-assisted diagnosis, alongside helping to accelerate the research and drug development innovation curve. A notable example is the rapid establishment of the HPC Consortium, involving 11 tech firms assisting federal government, industry and academic leaders across the world with access to expertise and high-performance computing capacity. With a mobilization such as this, it is no surprise that by early April 2020, 50 potential vaccines and nearly 100 possible treatment drugs were in development, a feat that would have been unimaginable just a few weeks ago. Emergency initiatives and innovations like this can also lay the ground for long-term change, from business and education to healthcare and government.


Prepare for the rise of the IT automation architect

IT automation architects are typically found in DevOps organizations. It's fruitless to focus on a comprehensive automation strategy without a cooperative, integrated DevOps structure already in place. Because of the specialized nature of the job, architects are typically found in larger enterprises or those, like many cloud-native startups, that have mature DevOps practices. There's a wide variety of job titles and associated skills found under the DevOps umbrella. For example, a recent DevOps skills report from the DevOps Institute, a learning association for DevOps professionals, identified more than a dozen DevOps job titles for which organizations are hiring. "DevOps engineer/manager" was the most common title, cited by 51% of survey respondents -- who included IT professionals, DevOps practitioners, HR managers and consultants. "Automation architect" was the ninth most-cited job title, at 15%. When the same group of survey respondents was asked to rate the importance of various skills to DevOps work, proficiency at automation ranked at the top, with 66% citing it as very important and only 1% listing it as optional or unimportant.


How the COVID-19 Pandemic Will Propel Humanity 20 Years Ahead in Tech

Once executives around the world realize that their employees not only can work in online-first environments but are thriving and even more productive, with greater opportunities for collaboration with their peers, they will embrace this “new” way of doing business. That, in turn, will unlock many benefits of scale and productivity that were unimaginable in the previous decades. The key driver of change will be that, now, every vendor or business partner can be assumed to be an online-first operator, and dozens, even hundreds, of legacy barriers will disappear practically overnight. Essentially, every business on the planet not only can, but will, run like a Silicon Valley startup. Imagine: instead of attending five conferences a year, we can attend and collaborate at 50 virtual conferences while being more efficient with our time, given the removal of all that unnecessary travel. Imagine if, instead of a few business development conversations in a given quarter, we are able to have one hundred, now that the vast majority of our peers are in the same Slack or Telegram groups. Imagine that instead of a few dozen local restaurants, we now have the choice to order from thousands.


Massive complexity endangers enterprise endpoint environments

In addition to heightening risk exposure, the failure of critical endpoint controls to deliver their maximum intended value is also undermining security investments and, ultimately, wasting endpoint security spend. According to Gartner, “Boards and senior executives are asking the wrong questions about cybersecurity, leading to poor investment decisions. It is well-known to most executives that cybersecurity is falling short. There is a consistent drumbeat directed at CIOs and CISOs to address the limitations, and this has driven a number of behaviors and investments that will also fall short.” “What has become clear with the insights uncovered in this year’s report is that simply increasing security spend annually is not guaranteed to make us more secure,” said Christy Wyatt, President and CEO of Absolute. “It is time for enterprises to increase the rigor around measuring the effectiveness of the investments they’ve made. By incorporating resilience as a key metric for endpoint health, and ensuring they have the ability to view and measure Endpoint Resilience, enterprise leaders can maximize their return on security investments.”


Q&A on the Book Becoming an Effective Software Engineering Manager

It's all about getting oriented and understanding the team, the work they're doing, and the company. I typically use a process which can be followed when landing somewhere new. It involves creating a snapshot of the situation in which you can begin to work with your team. This snapshot is formed of three things: your own observations, your manager's observations, and your team's observations. Your observations are what you see as you settle in and collect information from your team and your manager. We outline a number of techniques for new managers to ask questions to discover what's really going on inside the team, what they're working on, and where there may be ambiguities or frictions. These involve informal conversations, booking in weekly one-to-one meetings, and diving deeper into what they're building and why. Then, as well as doing this downward, we also do this upward by having the new manager ask their manager about the same things. Do they think differently than what the team reports? Why? Are they prioritizing well? If not, why not?



Quote for the day:

"There's a fine line between stubbornness and the positive side of that, which is dogged determination." -- @JebBush

Daily Tech Digest - June 29, 2020

EU Commission: The GDPR has been an overall success

The GDPR has proved flexible enough to support digital solutions in unforeseen circumstances such as the Covid-19 crisis. The report also concludes that harmonisation across the Member States is increasing, although there is a certain level of fragmentation that must be continually monitored. It also finds that businesses are developing a compliance culture and increasingly use strong data protection as a competitive advantage. The GDPR has acted as a catalyst for many countries and states around the world – e.g., Chile, South Korea, Brazil, Japan, Kenya, India, Tunisia, Indonesia, Taiwan and the state of California – to consider how to modernise their privacy rules, the EC noted. It also pointed out that the regulation provides data protection authorities with many corrective powers to enforce it (administrative fines, orders to comply with data subjects’ requests, bans on processing or the suspension of data flows, etc.). There is room for improvement, though. “For example, we need more uniformity in the application of the rules across the Union: this is important for citizens and for businesses, especially SMEs. We need also to ensure that citizens can make full use of their rights,” noted Didier Reynders, Commissioner for Justice.


AI experts say research into algorithms that claim to predict criminality must end

A coalition of AI researchers, data scientists, and sociologists has called on the academic world to stop publishing studies that claim to predict an individual’s criminality using algorithms trained on data like facial scans and criminal statistics. Such work is not only scientifically illiterate, says the Coalition for Critical Technology, but perpetuates a cycle of prejudice against Black people and people of color. Numerous studies show the justice system treats these groups more harshly than white people, so any software trained on this data simply amplifies and entrenches societal bias and racism. “Let’s be clear: there is no way to develop a system that can predict or identify ‘criminality’ that is not racially biased — because the category of ‘criminality’ itself is racially biased,” write the group. “Research of this nature — and its accompanying claims to accuracy — rest on the assumption that data regarding criminal arrest and conviction can serve as reliable, neutral indicators of underlying criminal activity. Yet these records are far from neutral.” An open letter written by the Coalition was drafted in response to news that Springer, the world’s largest publisher of academic books, planned to publish just such a study.


Embedding ESG into banks’ strategies

Bank CEOs know they need to act. In fact, in a global survey by KPMG International in autumn last year (before COVID-19), almost three-quarters of banking CEOs said they believed their future growth would be largely determined by their ability to anticipate and navigate the shift to a low-carbon, clean-technology economy. However, most are struggling to come to grips with what that really means for their bank going forward. Take the transition risk, for example. Bank executives understand the "new reality" will require them to pivot their finance towards greener and more sustainable companies and investments. But they also know they can’t just flick a switch; they still have significant books of business wrapped up in loans and instruments to ‘brown’ assets. As long as those brown assets continue to generate profits for the bank, bank executives will need to balance their duty to finance the ESG transition against their fiduciary duties to shareholders. Banks, regulators and politicians are also struggling to understand all of the potential unintended consequences of their shift towards more ESG-related business strategies. Declining to renew loans on existing coal mines, for example, may improve a bank’s carbon disclosures.


Facebook, IoTeX, R3 Among New Members of Confidential Computing Consortium

“Confidential computing brings privacy-preserving smart devices to the next level by not only allowing users to own their private data, but also to use it in a privacy-preserving way,” Raullen Chai, CEO of IoTeX, told CoinDesk in an email. “This has major implications for consumer-facing industries such as health care and smart homes, as well as enterprise for private multi-party data sharing and interactions.” Chai, based in San Francisco, said there are two immediate use cases where confidential computing could make an impact on everyday people’s privacy.  One is facial recognition in public spaces, an area that is under intense debate and scrutiny, particularly as protests against police brutality continue in the U.S. There are traditionally two sides to this debate, said Chai. On one side are privacy-conscious people who don’t want images of their faces scanned and analyzed by governments and other actors. On the other are governments (and their supporters) who, broadly, are prepared to sacrifice people’s privacy in the name of the public good. Confidential computing has something to offer each side.


Technology for a no-touch world

Now that machines can understand us almost as well as another human, we’ll see the technology take us back to a virtual version of the old days. We’ll be able to walk into an elevator and simply say, in any language, “Tenth floor, please.” Vending machines were invented to automate things such as candy and ticket stands, which were operated by clerks who people could speak to. In the coming years, we’ll again ask for what we want instead of pushing a button, but we won’t be talking to a person. Paris-based Thales, for example, is marketing its Transcity voice-recognition ticket machine to train stations: Travelers speak to tell it where they want to go, and it prints their ticket. Next-generation ATMs will veer toward becoming virtual tellers, according to Doug Brown, an executive at ATM maker NCR, who spoke about the technology in a recent news article. ... The effort to get machines to recognize faces also goes back to the 1960s, when an inventor named Woody Bledsoe, possibly funded by the CIA, laid down some of the field’s foundational research and dreamed of wearing glasses that would tell him the names of everyone he met. But as with speech technology, computers then didn’t have enough power or data or clever enough programming to make facial recognition work.


Is working from home the death knell for offices?

What will happen once the coronavirus is brought under control, either by a vaccine or by an effective treatment? Will companies encourage staff to continue working from home and cut the amount of office space they occupy? Recent headlines suggest some companies have already decided to downsize their offices. Companies have a big incentive to cut their office use. Bills for rent, service charges, and utilities are all meaningful costs. Staff also benefit by spending less on commuting and having more time at home. The average one-way commute in both the UK and US takes half an hour. Furthermore, even a small fall in the number of cars can lead to a significant improvement in traffic flow, and fewer cars means cleaner air. Yet, if remote working is such an obvious win-win for both businesses and their staff, why were companies so slow to adopt it before the coronavirus? The pressure on companies to economise is nothing new. Email and video conferencing have existed for 25 years, albeit the technology could be unreliable in its early years. Part of the answer may have to do with control, and concerns that less conscientious staff would take advantage of remote working. Being seen in the office can also help employees.


A Changing World Requires a Changing View of Security

Moving security hygiene further up the to-do list has to be paramount, or all the effort to innovate and progress will be wasted. It really won’t take much to be breached. A DDoS attack can create large volumes of ‘garbage’ traffic to saturate the pipe and attack the intricacies of the VPN protocol. A flow of as little as 1 Mbps can knock a VPN service offline. No business will want to risk a breach that interferes with trading, nor can they afford any data exposure. It’s therefore really important to look back at what has been achieved, fine-tune the processes and solutions in play, and adapt the associated risk models. Some companies won’t be able to think about this right now, such is the urgency to keep the business operating. But they must return to it, or employ the skills to do an audit, before moving on to the more strategic implementations they’ve proven they are capable of delivering. It would be foolish to roll out any more transformation with emphasis on access and usability yet neglect security. The companies that ride this storm will be the ones that have the right technology, implementations, and skills in place. They will be the ones that deliver new operational models and innovate in ways their competition can’t.


The Secret of Simple Code

You can write code that is more reusable and less likely to break when new requirements are introduced and things change in the surrounding code. The secret to being 10x more productive is to gain a mastery of abstraction. A lot of developers treat “abstraction” like it’s a dirty word. You’ll hear (otherwise good) advice like, “don’t abstract too early” or Zen of Python’s famous “explicit is better than implicit,” implying that concrete is better than abstract. And all of that is good advice — depending on context. But modern apps use a huge amount of code. If you printed out the source code of modern top 10 applications, those stacks of paper would compete with the height of skyscrapers, and software costs a lot of money to maintain. ... Imagine being the coder who popularized the use of the map operation in programming languages like JavaScript. Map abstracts away details such as the type of data you’re mapping over, the type of data structure containing the data, and the iteration logic required to enumerate each data node in the data structure. It’s improved the efficiency of every app I’ve built in the past decade.
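
The article's example is JavaScript's map, but the same abstraction exists in Python; a minimal sketch shows what it hides from the caller:

```python
# map abstracts away the container type and the iteration logic;
# the caller supplies only the per-element transformation.
def double(x):
    return x * 2


numbers = [1, 2, 3, 4]
doubled = list(map(double, numbers))          # [2, 4, 6, 8]

# The identical call works over any iterable data structure, which is
# exactly the detail-hiding the article credits the map operation with.
doubled_tuple = list(map(double, (5, 6, 7)))  # [10, 12, 14]
```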


Attackers Target Vulnerable Exchange Servers

"As these attacks show, Exchange servers are high-value targets. These attacks also tend to be advanced threats with highly evasive, fileless techniques," Hardik Suri, a researcher with the Microsoft Defender ATP Research Team, writes in the blog. "The security update that fixes this vulnerability has been available for several months, but, notably, to this day, attackers find vulnerable servers to target." After gaining access to a vulnerable Exchange server, attackers deploy web shells - malicious code written in common programming languages - into one of the many web accessible paths on the server, Microsoft reports. This enables hackers to steal data or perform malicious actions for further compromise. Microsoft found that common access paths for web shell deployment were ClientAccess and FrontEnd directories, which provide services such as Outlook on the web, the Exchange Admin Center and AutoDiscover. A common web shell being used in the attacks is the credential-stealer China Chopper, which is hidden in the system using common file names, the blog notes.



Spring Boot 2.3.0 Focuses on the Cloud

Spring Boot has released version 2.3.0, which adds support for Docker with buildpacks, layered images, graceful shutdown, and liveness and readiness probes. Another noteworthy change is support for Java 14 while maintaining support for LTS versions 8 and 11. Buildpacks are an alternative to Dockerfiles. Buildpacks automatically detect the software needed to run the application in a Docker container. For example, a buildpack detects the version of Java used in the application and, based on that version, selects the JRE it specifies and builds a Docker image. ... Developers usually store the application artifact as a JAR file. The disadvantage is that the JAR file contains elements that often change, such as the code, alongside elements that change less frequently, such as the dependencies. Changes between versions of Docker images are stored as diffs. When whole JAR files are stored for each version of the application, the diffs are quite big and consume a lot of disk space. Buildpacks reduce the space required by splitting the application into multiple layers based on what changes more frequently.




Quote for the day:

"Remember no one can make you feel inferior without your consent." -- Eleanor Roosevelt

Daily Tech Digest - June 28, 2020

Reinventing the organization for speed in the post-COVID-19 era

Just because the times are fraught does not mean that leaders need to tighten control and micromanage execution. Rather the opposite. Because conditions are so difficult, frontline employees need to take on more responsibility for execution, action, and collaboration. But this isn’t always easy and requires that organizations focus on building execution muscle throughout the workforce. Leaders must assign responsibility to the line, and drive “closed-loop accountability.” That is, everyone working on a team must be clear about what needs to get done by whom, when, and why. Employees must also be equipped with the right skills and mindsets to solve problems, instead of waiting to be told what to do. And there must be disciplined follow-up to make sure actions were taken and the desired results achieved. CEOs who are serious about execution excellence are investing in helping their workforces up their execution game—through targeted programs, realigning incentives, and directing rewards and recognition to teams that execute with speed and excellence. Building execution excellence does not have to come at the expense of innovation. Quite the contrary: it can help discover powerful ideas and innovation from the frontline teams that are closest to the customer. And it can drive excitement and loyalty among the employee base.


Are Tech Giants With Their AIs And Algorithms Becoming Too Powerful?

This reality is why large tech companies have extraordinary power today. Current regulatory mandates were built for corporations of the past, where the market was the consideration, not forms of power. Susskind argues that we need to see technology not just as consumers, but as citizens. At the same time, social media can affect one of the most fundamental aspects of democracy, which is deliberation and the way we talk to each other. We've seen people become polarized because, through their own personal choices, algorithms are making choices for them, and they are fed information that reinforces their own world view. We've seen people become more entrenched in those views because the more time you spend around people and information that agree with you, the more deeply you come to hold those views. There's also a significant problem with the spread of fake news and misinformation. In a sense, it isn't surprising that this has happened. These social media platforms have not been developed according to the principles of the forum or of healthy public debate. If they had been, they would funnel information to you that was balanced, fair, and rigorously checked, or otherwise engineered to make you a better citizen.


Artificial Human Beings: The Amazing Examples Of Robotic Humanoids And Digital Humans

As artificial intelligence continues to mature, we are seeing a corresponding growth in sophistication for humanoid robots and the applications for digital human beings in many aspects of modern-day life. ... Even though the earliest form of humanoid was created by Leonardo da Vinci in 1495 (a mechanical armored suit that could sit, stand and walk), today's humanoid robots are powered by artificial intelligence and can listen, talk, move and respond. They use sensors and actuators (motors that control movement) and have features that are modeled after human parts. Whether they are structurally similar to a male (called an android) or a female (a gynoid), it’s a challenge to create realistic robots that replicate human capabilities. The first modern-day humanoid robots were created to learn how to make better prosthetics for humans, but now they are developed to do many things: to entertain us, to perform specific jobs such as home health worker or manufacturer, and more. Artificial intelligence makes robots human-like and helps humanoids listen, understand, and respond to their environment and interactions with humans. Here are some of the most innovative humanoid robots in development today.


Why Companies Still Struggle To Incorporate AI Into Existing Business Models

Cutting-edge companies are already finding patterns in user behaviour that can lead to exceptional products or features in existing products, which is giving them an extreme advantage over other businesses. Take computer vision (CV), for example. With computer vision, we can create a system that does a subset of the things the human visual system can do. In CV, a system can analyse a picture taken by a camera and understand what’s in the picture. For example, it can recognise objects like cars, streetlights and, of course, people. Computers can perform object recognition through networks of nodes called neural networks. An image is fed into the network, and convolution operations are performed at these nodes. This kind of technology can be used in various business scenarios and lead to incredible gains in productivity and efficiency. For example, you can leverage computer vision-based licence plate recognition to run an automated car parking business. Of course, the information from the registration, billing and computer vision-based licence plate recognition systems would have to be integrated to automate the entire process.
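
To make the inference step concrete, here is a minimal Python sketch using a pretrained torchvision classifier (the filename street.jpg is a hypothetical placeholder, and a real licence plate reader would chain a detector and OCR on top of a step like this):

```python
import torch
from PIL import Image
from torchvision import models, transforms

# Standard ImageNet preprocessing: resize, crop, and normalize.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet18(pretrained=True)  # a small pretrained CNN
model.eval()

image = preprocess(Image.open("street.jpg")).unsqueeze(0)  # add batch dim
with torch.no_grad():
    scores = model(image)  # forward pass through the convolutional layers
print("Predicted class index:", scores.argmax(dim=1).item())
```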


Why the coronavirus pandemic confuses AI algorithms

The coronavirus lockdown has broken many things, including the AI algorithms that seemed to be working so smoothly before. Warehouses that depended on machine learning to keep their stock filled at all times are no longer able to predict the right items that need to be replenished. Fraud detection systems that home in on anomalous behavior are confused by new shopping and spending habits. And shopping recommendations just aren’t as good as they used to be. To better understand why unusual events confound AI algorithms, consider an example. Suppose you’re running a bottled water factory and have several vending machines in different locations. Every day, you distribute your produced water bottles between your vending machines. Your goal is to avoid a situation where one of your machines is stocked with rows of unsold water while others are empty. ... This new AI algorithm is much more flexible and more resilient to change, and it can predict sales more accurately than the simple machine learning model that was limited to date and location. With this new model, not only are you able to efficiently distribute your produced bottles across your vending machines, but you now have enough surplus to set up a new machine at the mall and another one at the cinema.
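
The contrast between the two models can be sketched in a few lines (scikit-learn assumed; the columns and numbers are invented purely for illustration):

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Toy sales history for the vending-machine example above.
sales = pd.DataFrame({
    "day_of_week":  [0, 1, 2, 3, 4],
    "location_id":  [1, 1, 2, 2, 3],
    "temperature":  [21.0, 25.5, 19.0, 30.2, 27.1],
    "foot_traffic": [120, 340, 80, 400, 260],
    "bottles_sold": [35, 90, 20, 110, 70],
})

# The "simple" model: date and location only.
basic = RandomForestRegressor(random_state=0).fit(
    sales[["day_of_week", "location_id"]], sales["bottles_sold"])

# The richer model: extra context features give it signal to work with
# when habits shift, e.g. when a lockdown collapses foot traffic.
richer = RandomForestRegressor(random_state=0).fit(
    sales[["day_of_week", "location_id", "temperature", "foot_traffic"]],
    sales["bottles_sold"])

new_day = pd.DataFrame(
    [[0, 1, 22.0, 10]],
    columns=["day_of_week", "location_id", "temperature", "foot_traffic"])
print(richer.predict(new_day))  # low foot traffic -> low predicted demand
```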


The importance of peer feedback in the digital workplace

As the way we work shifts, employees’ prior strengths may become liabilities, so it’s important to monitor behaviors over time and under different circumstances. Someone who excelled at building relationships through watercooler chitchat will need to find new methods when the work group goes completely virtual. Likewise, the individual who was overlooked as too socially awkward may begin to shine in a remote working environment. Employees will need feedback on how effective their behavior is in this new world so they can learn which new behaviors they may need to adopt and which may now be seen as strengths. Peer reviews help people understand better how to adjust to new technologies, even as the technology itself is becoming part of the process. For example, I recently coached a business unit chief financial officer (CFO) of a Fortune 500 company in the U.S. who had been passed over for a promotion in the middle of 2019. His 360 feedback results in 2019 made it clear he was struggling with his peer relationships. He was whip-smart, and everyone knew it — but his peers felt he was too quick to show them up in meetings with the senior leadership team.


Technology and innovation: Building the superhuman agent

Proactive conversational AI platforms can resolve requests before the customer even feels the need to reach out. Modern solutions integrated with various data systems can analyze large quantities of internal and external data and identify triggers to start proactive and personalized conversations through a customer’s preferred channels. For example, a leading telco was able to eliminate 50 percent of unnecessary service calls and inbound calls related to repairs by using robotics to proactively contact customers and resolve issues as soon as remote monitoring detected a malfunction. Two-thirds of customers believe service through online channels and mobile devices should be faster, more intuitive, and better able to serve their needs. Organizations should seize the opportunity with improved front-end robotics or “virtual agents” to handle repetitive, transactional requests as well as to guide customers through a logical menu of topics and intentions to address issues. Companies that have incorporated such technologies are seeing significant returns: in fact, effectively deploying conversational AI can create a twofold improvement in customer experience; reduce cost to serve by 15 to 20 percent; improve churn, upsell, and acquisition by 10 to 15 percent; and result in a fourfold increase in employee productivity.


How to establish a threat intelligence program

One of the first steps towards establishing a threat intelligence program is to know your risk tolerance and set your priorities early, he says. While doing that, it’s important to keep in mind that it’s not possible to prevent every potential threat. “Understand what data is most important to you and prioritize your limited resources and staff to make workloads manageable and keep your company safe,” he advised. “Once you know your risk tolerance, you need to understand your environment and perform a comprehensive inventory of internal and external assets, including the threat feeds that you have access to. Generally, nobody knows your organization better than your own operators, so do not go on a shopping spree for tools/services without an inventory of what you do/don’t have. After all that’s out of the way, it’s time to automate security processes so that you can free your limited, talented cybersecurity personnel and have them focus their efforts where they will be most effective. Always be on the lookout for passionate, qualified and knowledge-thirsty internal personnel that WANT to pivot to threat intelligence, and develop them.”


Why organizations should consider HTTPS inspection to find encrypted malware

Setting up HTTPS inspection can be tricky as it does require some extra effort. And if not configured correctly, this process can actually weaken the end-to-end encryption and protection provided by security gateways and products. "Some organizations are reluctant to set up HTTPS inspection due to the extra work involved, but our threat data clearly shows that a majority of malware is delivered through encrypted connections and that letting traffic go uninspected is simply no longer an option," Corey Nachreiner, chief technology officer at WatchGuard, said in a press release. "As malware continues to become more advanced and evasive, the only reliable approach to defense is implementing a set of layered security services, including advanced threat detection methods and HTTPS inspection." A report from the US Department of Homeland Security's Cybersecurity and Infrastructure Security Agency (CISA) offers some recommendations on HTTPS inspection. "Organizations using an HTTPS inspection product should verify that their product properly validates certificate chains and passes any warnings or errors to the client," CISA said.
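
A full HTTPS inspection product does far more, but the core requirement CISA describes (properly validating the certificate chain and surfacing errors rather than swallowing them) can be illustrated with Python's standard ssl module, whose default context performs chain and hostname verification and raises an error on failure:

```python
import socket
import ssl

# create_default_context() verifies the server's certificate chain
# against trusted CAs and checks the hostname; a failure raises
# ssl.SSLCertVerificationError instead of silently passing traffic.
context = ssl.create_default_context()

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        cert = tls.getpeercert()
        print("Negotiated:", tls.version())
        print("Subject:", dict(pair[0] for pair in cert["subject"]))
```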


Ex-Windows chief: Here's why Microsoft waged war on open source

Smith, a top lawyer at Microsoft during its war on open source, admitted earlier this month that the company was wrong but said it had now changed, pointing to its acquisition of GitHub and the company's open-source activities on the code-sharing site. Now Sinofsky, who has a new book detailing Microsoft's antitrust and security problems during his years overseeing Windows and Office, has attempted to put some context around Microsoft's new attitude and its old antagonism to open source. Microsoft today has espoused open source as its focus shifts from Windows PCs to Azure and Office in the cloud. But Sinofsky outlines reasons why Microsoft's approach at the time was understandable – and how its model was upended by software-as-a-service in 1999-2000, to which Linux was better suited than Windows, and later by Google's infrastructure. Sinofsky's defense of Microsoft fleshes out Gates' explanation of the GPL in 2001, that it "makes it impossible for a commercial company to use any of that work or build on any of that work". "Microsoft was founded on the principle that software was intellectual property," Sinofsky says.



Quote for the day:

"If you focus on results, you will never change. If you focus on change, you will get results." -- Jack Dixon

Daily Tech Digest - June 27, 2020

Machine Learning Has a Huge Flaw: It’s Gullible

“Patent examiners face a time-consuming challenge of accurately determining the novelty and nonobviousness of a patent application by sifting through ever-expanding amounts of ‘prior art,’” or inventions that have come before, the researchers explain. It’s challenging work. Compounding the challenge: patent applicants are permitted by law to create hyphenated words and assign new meaning to existing words to describe their inventions. It’s an opportunity, the researchers explain, for applicants to write their applications in a strategic, ML-targeting way. The U.S. Patent and Trademark Office is generally wise to this. It has brought in ML technology that “reads” the text of applications, with the goal of spotting the most relevant prior art quicker and reaching more accurate decisions. “Although it is theoretically feasible for ML algorithms to continually learn and correct for ways that patent applicants attempt to manipulate the algorithm, the potential for patent applicants to dynamically update their writing strategies makes it practically impossible to adversarially train an ML algorithm to correct for this behavior,” the researchers write. In its study, the team conducted observational and experimental research.


Cloud Testing — The Future of Software Testing

The cloud-testing life cycle includes the following activities. A test manager, project manager, or test leader plays the role of test admin. The test admin creates the test scenarios and designs the test cases. Based on the scenarios and test cases, automated test scripts are generated, either by the test admin or by a professional tester. Once a cloud service provider is available, the test admin creates user accounts to give testers access, and the cloud service provider sets up the infrastructure. Testers use their credentials to log in to the portal and can use all the available assets; the cloud testing process starts here. Testers perform the testing, and after completion of the process, the cloud testing provider delivers the results. Testing firewalls and load balancers involves expenditure on hardware, software, and maintenance. For applications where the rate of growth in users is unpredictable, or where the deployment environment varies depending on client requirements, cloud testing is more effective. Software testing has also undergone a long-drawn evolution cycle. From ad-hoc practices within different business units, it gradually evolved into a centralized Managed Test Center approach.
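
As a small illustration of what a cloud-style load test can look like, here is a sketch for Locust, one popular open-source load-testing tool (the host and the /products endpoint are hypothetical):

```python
# Run with: locust -f loadtest.py --host https://example.com
from locust import HttpUser, between, task


class ShopUser(HttpUser):
    # Each simulated user pauses 1-3 seconds between requests.
    wait_time = between(1, 3)

    @task
    def browse_products(self):
        self.client.get("/products")

    @task
    def view_home(self):
        self.client.get("/")
```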


Docker servers infected with DDoS malware in extremely rare attacks

"XORDDoS and Kaiji have been known to leverage telnet and SSH for spreading before, so I see Docker as a new vector which increases the potential of the botnet, a green field full of fresh fruit to pick with no immediate competitors," Pascal Geenens, cybersecurity evangelist at Radware, told ZDNet via email earlier this week. "Docker containers will typically provide more resources compared to IoT devices, but they typically run in a more secured environment, and it might be hard to impossible for the container to perform DDoS attacks," Geenens added. "The unique perspective of IoT devices such as routers and IP cameras is that they have unrestricted access to the internet, but typically with less bandwidth and less horsepower compared to containers in a compromised environment," the Radware researcher told ZDNet. "Containers, on the other hand, typically have access to way more resources in terms of memory, CPU, and network, but the network resources might be limited to only one or a few protocols, resulting in a smaller arsenal of DDoS attack vectors supported by those 'super' bots."


CIOs Shift IT Budgets Amid COVID Crisis

What are CIOs thinking about as they plan for the rest of the year? How do they assess the effort to move workers from an office environment to working from home? What other plans are they laying out for the rest of 2020, a year that so far includes a pandemic, civil unrest due to systemic racism, a recession, massive unemployment, and a presidential election? ... "Every industry is grappling with this pandemic in a different way," Nix said. For instance, higher education has had to create a completely different education model for students, and now there's the question of whether they will come back to physical classrooms, stay with remote education, or do some combination of the two, he said. "COVID-19 led to an unprecedented remote work transformation with challenges in productivity and security at scale that had never been anticipated," Nix said. Most CIOs, 77%, said they are reducing their budgets due to the crisis, and 74% said they are prioritizing initiatives that drive operational efficiency. If you want to know where those priorities are right now, just look at some of the challenges that CIOs say their organizations, and the IT teams enabling work from home, have faced due to the crisis.


The ‘new normal’ – how your IT strategy can enable you to adapt and thrive in a Covid-19 world

Once the technology to improve business service has been considered, it needs to be implemented for the business to begin seeing a positive impact. The organisation may have been running an advanced digital transformation programme prior to the pandemic. However, this will now have to be re-assessed against the backdrop of the changes the core business is undergoing in terms of the products and services it provides and how those are procured and consumed by the customer going forward. The sharp switch of retail from the high street to online drives a whole wake of impact behind it in terms of web presence, advertising, inventory management, distribution, staffing, brand awareness, manufacturing and transport – and that is just one industry. This obviously puts a different strain on the IT function, as new apps and microservices have to be rushed into production and delivered on new platforms, whilst the legacy apps either get parked in a museum corner for now or resources are rapidly found and deployed to modify them. There will need to be a big focus on agility as we enter an unknown period when the lockdown begins to loosen.


Why tech didn’t save us from covid-19

There are a lot of different ideas about why the innovation slowdown happened. Perhaps the kinds of inventions that previously transformed the economy—like computers and the internet, or before that the internal-combustion engine—stopped coming along. Or perhaps we just haven’t yet learned how to use the newest technologies, like artificial intelligence, to improve productivity in many sectors. But one likely factor is that governments in many countries have significantly cut investments in technology since the 1980s. Government-funded R&D in the US, says John Van Reenen, an economist at MIT, has dropped from 1.8% of GDP in the mid-1960s, when it was at its peak, to 0.7% now. Governments tend to fund high-risk research that companies can’t afford, and it’s out of such research that radical new technologies often arise. The problem with letting private investment alone drive innovation is that the money is skewed toward the most lucrative markets. The biggest practical uses of AI have been to optimize things like web search, ad targeting, speech and face recognition, and retail sales. Pharmaceutical research has largely targeted the search for new blockbuster drugs.


Safe and smart: IoT deployments in banking

The emergence of IoT and intelligent technologies, including mobile and online banking, is critical to improving customer engagement and to making sure the everyday services clients require run smoothly. And to stay up to speed with a constantly shifting risk landscape and evolving threats, financial institutions must not only plan for today, but also look ahead to ensure they use the most innovative, yet proven, technologies and solutions. As new trends and strategies emerge and take precedence, security leaders should stay prepared and continuously work to gather as much data and intelligence as possible to modernize, simplify, and automate their business. Most financial organizations are looking to leverage technologies to achieve common goals: satisfactory customer engagement, enhanced security, and fraud reduction. Moving forward, banks need to consider how these efforts can be significantly affected by the power of IoT. ... For banks to invest in technology, solutions must allow security teams and investigators to dedicate time and effort to relevant tasks and efficient responses, while leaving certain operations, such as firmware updates and camera verification, up to automation.


Network Resilience: Preparing for Tomorrow’s Challenges and Beyond

Network resilience strategies should be dynamic, and times are changing. It’s not always possible to get onsite. New innovations like IoT are pushing edge deployments with more local processing power at satellite sites. SD-WAN, now a widespread tool, introduces more points of failure via larger software stacks susceptible to buggy firmware updates and other disruptions. Organizations will need to implement smart best practices to ensure network management rises to meet not only the challenges of today, but also those of tomorrow. But how do we get there? Here are some tips to help get this done. Automation not only eases technician workloads, it also adds security to critical network devices. Some of the ways it can bolster security include constant event logging with automated analysis and alerts, and continuous updates for items like backup images or firmware update scripts. Automation also makes new site configuration secure, remote and instantaneous, with benefits like zero-touch provisioning. Many organizations must constantly update networking functions.
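
As one sketch of the kind of scripted upkeep described above (not any specific vendor's tooling), the Python snippet below uses the Netmiko library to pull and archive a device's running configuration over SSH; the device details are placeholders:

```python
from datetime import date

from netmiko import ConnectHandler

# Hypothetical device record; in practice this would come from an
# inventory system rather than being hard-coded with credentials.
device = {
    "device_type": "cisco_ios",
    "host": "10.0.0.1",
    "username": "backup-svc",
    "password": "REDACTED",
}

conn = ConnectHandler(**device)
config = conn.send_command("show running-config")
conn.disconnect()

# Archive a dated copy so drift and bad firmware pushes can be rolled back.
with open(f"backups/10.0.0.1-{date.today()}.cfg", "w") as f:
    f.write(config)
```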


Corporate Governance in the Era of Offsite Employees

If you don't already have a formal work-from-home policy, now is the time to develop one. If you already have one, you should plan to review it. Once developed or reviewed, work-from-home policies should be disseminated to employees so they understand the conditions of working safely and securely from home. An IT work-from-home policy should, at a minimum, mandate strong password selection and prohibit the sharing of passwords. The policy should instruct employees about what they should do if their devices are lost or misplaced and inform them of the methods they should use when they need to transfer or store files. Storing files on local drives at home should be discouraged in favor of storing these assets in the cloud under company management. ... Data encryption and multi-factor authentication should be used if it is necessary to stream or transfer any company-sensitive information or intellectual property. The “catch” with this is that many employees don’t know which information they are working with is intellectual property, so they may inadvertently send information to parties who should not have it.


So You Want to be a Data Manager?

Effective Data Management can reduce errors by using the master data management (MDM) system as the accurate master copy of the organization’s most important information. This helps ensure any applications built using master data are accurate and effective. However, managing data efficiently requires more than MDM. The organization of data needs to line up with the organization’s business strategy and the data the company needs to move forward. The challenge most Data Managers face is how best to use Analytics and how to integrate Analytics with business processes. Integrating Analytics with Data Management will assure a higher degree of success in Analytics projects. When archiving data, a business should use a storage system capable of supporting data discovery, access, and distribution, and regulations and policies must be considered. Data is also subject to quality control, which might involve double-checking manually entered data through the use of quality-level flags designed to indicate potential problems and to check format consistency. Additionally, data should be documented, defining its context, content, and parameters.
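
The quality-level flags mentioned above can be sketched with pandas (column names and the enforced date format are assumptions for illustration):

```python
import pandas as pd

records = pd.DataFrame({
    "customer_id": ["C001", "C002", None, "C004"],
    "signup_date": ["2020-06-01", "01/06/2020", "2020-06-03", "not a date"],
})

# Flag missing identifiers rather than silently dropping rows.
records["flag_missing_id"] = records["customer_id"].isna()

# Enforce a single date format; anything that doesn't parse becomes
# NaT and is surfaced as a format-consistency flag.
parsed = pd.to_datetime(records["signup_date"],
                        format="%Y-%m-%d", errors="coerce")
records["flag_bad_date"] = parsed.isna()

print(records)
```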



Quote for the day:

“I’m convinced that about half of what separates successful entrepreneurs from non-successful ones is pure perseverance.” -- Steve Jobs

Daily Tech Digest - June 26, 2020

5 areas IT leaders should be followers

Top executives tend to carry and have access to higher-value data. They also tend to have the most relaxed attitudes toward mobile security, according to MobileIron. Such executives find mobile security protocols frustrating, limiting and confusing. Leadership and authority mean that the C-suite has the power to ignore security protocols -- using unsupported devices and apps and skipping multi-factor authentication, to name just a few examples. But this is a mistake, and a common one. Leadership doesn't confer expertise. It simply means that your own personal mobile security tools and practices need to be at least as strong as those of other employees, or you become the perfect target -- easier to hack and more profitable to breach. ... Of course, every organization has a different calculation to make on budgeting for cybersecurity, taking into account existing infrastructure, the number of employees, the nature of the specific industry, and the risks and business impacts of such spending and deployment. The coronavirus crisis has forced an acceleration of digital transformation, along with other trends such as remote work, and the attack surface of the average organization has suddenly increased. Both digital transformation and remote work increase cyber risk.


Developing a Cloud Migration Framework

The processes that need addressing via cloud adoption may also sit in different departments. A lack of institutional knowledge or documentation also hinders proper assessment of a legacy application, application suite, or enterprise. These are tidbits your organization may not know offhand; they are findings that become known after a cloud migration team assesses your cloud readiness. Your cloud migration team’s assessment may also find other issues, such as inadequate network bandwidth and the over-provisioning of resources. Both problems can contribute to higher costs once your organization is in the cloud. Having a joint solution-provider and internal cloud migration team lets you answer the challenging questions about your organization’s actual state of cloud readiness. Team members from your development, operations, and security teams need seats at the table. The full team’s analyses and reports should help turn up where your enterprise excels or lags in cloud support. Those answers come from meetings and interviews with your organization’s business and application owners.


Digital Transformation: What Can Banks Learn From Other Sectors?

We’ve established that banks can profit by following the example of the big tech companies when it comes to designing the technical architecture and processes around digital transformation. But technology isn’t everything. Successful digital transformation also has a strong human element. To see why this is important, let’s look at a counterexample. Another fintech company that has enjoyed rapid growth is Robinhood Markets, whose mobile app has made it easy for a new generation of investors to start trading stocks, ETFs, options and cryptocurrencies. However, in early March 2020, the Robinhood app suffered a series of systemwide outages that prevented users from opening or closing their positions. The cause of the problems was a technology failure. In a subsequent blog post, the company’s founders noted that their infrastructure couldn’t handle the combination of “highly volatile and historic market conditions; record volume; and record account sign-ups.” But the impact was human. When the app failed, there was no contact centre to act as a backup for booking trades. The result? Many of Robinhood’s small investors were helpless as the markets turned against their positions, or unable to make trades to take advantage of opportunities they spotted during a week when the coronavirus pandemic sparked a mass selloff.


How to future-proof CRM solutions for digital business

Louise Whitcombe, head of customer engagement at Ogilvy UK, explains that CRM isn’t a magic bullet; it’s an enabler. “The success of your business depends on your business strategy and being consistently relevant to your customers. It is the business model that drives the specification for the solution, not the other way around, and this is where the challenge lies. As the speed of market change increases, so too must our ability as organisations to adapt to it,” she says. Instead, Whitcombe believes that businesses need a “CRM ecosystem that has the ability to cope with the demands of business models that increasingly need to adapt to the changing consumer world around them.” Luckily, she points out that the current tech map shows the marketplace is absolutely teeming with shiny new solutions to businesses’ CRM challenges. However, she warns that this myriad of opportunity can sometimes feel like a minefield of choice, making it tricky to offset and balance CapEx and OpEx costs against the ROI and capabilities of differing solutions.


7 Tips for Effective Deception

Deception is an interesting and very old concept that has become quite popular over the past few years, says Tony Cole, CTO of Attivo Networks. "Deception can work in almost any place in an enterprise where potential compromises can take place," he says, adding that it is especially useful where endpoint protection and endpoint detection and response tools may have gaps in protection. "For instance, when an endpoint is compromised and the adversary uses it to query Active Directory, you can provide false information back to the adversary without ever impacting the production environment." Rick Moy, chief marketing officer at Acalvio, points to three main use cases for deception: to add an additional layer of protection in mission-critical environments, to shore up detection capabilities in areas with known security weaknesses, and to lure out adversaries hiding in a sea of security information and event management (SIEM) alerts. "Deploying attractive lures and decoys amid the various network segments works much like the proverbial cheese or peanut butter in a mousetrap that's strategically placed along the kitchen baseboards," Moy says. Here, according to Moy and others, are seven best practices for using deception to detect threats quickly.
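
As a rough illustration only, the detection idea behind a decoy reduces to a few lines of Python: a fake service that no legitimate user has any reason to touch, so every connection is a high-signal alert. Real deception platforms add realism, safe isolation, and alert routing on top of this.

```python
import logging
import socket

logging.basicConfig(filename="decoy.log", level=logging.INFO)

listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
listener.bind(("0.0.0.0", 2222))  # hypothetical decoy port mimicking SSH
listener.listen(5)

while True:
    conn, (addr, port) = listener.accept()
    # Any touch of the decoy warrants investigation.
    logging.info("Decoy touched from %s:%s", addr, port)
    conn.close()
```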


Robotics in business: Everything humans need to know

IDC found that spending on robotics hit $135.4 billion in 2019, up from $71 billion two years earlier. According to the report, services such as training, deployment, integration, and consulting will account for $32 billion of that, which accounts for a lot of new jobs. Even the oft-cited PWC report isn't all doom and gloom. Robots increase productivity, and productivity gains tend to generate wealth. Historically, that's led to an increase in service sector jobs, which aren't easy to automate. There are plenty of holes to poke in the methodology of all these reports. And that's the point: An accurate method for predicting how technologies will change the future is elusive -- and that's especially true when the technologies under consideration will fundamentally alter the economic paradigm. In the broad wake of that uncertainty, you have Ray Kurzweil predicting utopia and author Martin Ford predicting something much bleaker. Ultimately, the PWC report comes to what may be the most sensible, albeit frustratingly vague, conclusion. It's not really clear what's going to happen. Average pre-tax incomes should rise with increases in productivity. But the benefits won't be spread evenly across income or education groups.



The Cyberthreat You Didn't Even Know Was Out There

Cybercrime has become today's fastest-growing form of criminal activity. Cybersecurity Ventures predicted that cybercrime will become "more profitable than the global trade of all major illegal drugs combined" and "cost the world $6 trillion annually by 2021." While the majority of attacks continue to be aimed at small to mid-sized businesses with less sophisticated IT infrastructure, organizations that collect massive amounts of sensitive data will always be natural targets and "white whale" prizes for cybercriminals. Chasing such an enticing payday means hackers are willing to launch thousands, maybe even hundreds of thousands, of digital attacks. Only one of them needs to connect in order to unlock troves of lucrative personal information. That means companies that handle our most sensitive data must level up their security beyond that of other high-profile or large organizations in order to ensure this precious data is safeguarded against a constant barrage of threats. The good news is that modern IT infrastructure and security and identity tools are more powerful and sophisticated than ever to stymie malicious access attempts — but only if we are proactive about staying one step ahead of security threats.


European Bank Targeted in Massive Packet-Based DDoS Attack

In the bank incident, the attackers used a packet per second, or PPS, method instead of the more commonly used bits per second, or BPS, method. In the BPS approach, the attacker's goal is to overwhelm the inbound internet pipeline, sending more traffic to a circuit than it's designed to handle, according to the report. Akamai believes the attackers went with a PPS attack to overwhelm the target's DDoS mitigation systems via a high PPS load. A PPS attack is designed to overwhelm a network's gear and applications in the customer's data center or cloud environment, the report notes. A PPS attack exhausts the resources of the gear, rather than the capability of the circuits - as in a BPS attack. "One way to think about the difference in DDoS attack types is to imagine a grocery store checkout," Emmons explains. "A high-bandwidth attack, measured in bps, is like a thousand people showing up in line, each one with a full cart ready to check out. However, a PPS-based attack is more like a million people showing up, each to buy a pack of gum. In both cases, the final result is a service or network that cannot handle the traffic thrown at it."
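
Emmons' analogy can also be put into rough numbers. The following back-of-the-envelope sketch in Python (the 10 Gbps link and the packet sizes are illustrative assumptions, not figures from the Akamai report) shows how many packets per second it takes to fill the same circuit at two packet sizes:

    # Back-of-the-envelope sketch of why small packets stress network gear
    # differently than big ones. Figures are illustrative assumptions.
    def packets_per_second(bandwidth_bps: float, packet_bytes: int) -> float:
        """Packets per second needed to fill a link at a given packet size."""
        return bandwidth_bps / (packet_bytes * 8)

    LINK = 10e9  # a hypothetical 10 Gbps circuit

    # A bandwidth-style (BPS) flood of large, 1,500-byte packets:
    print(f"{packets_per_second(LINK, 1500):,.0f} pps")  # ~833,333 pps

    # The same circuit filled with tiny, 64-byte packets (PPS-style):
    print(f"{packets_per_second(LINK, 64):,.0f} pps")    # ~19,531,250 pps

Because routers, firewalls, and mitigation appliances pay a roughly fixed per-packet cost for lookups and connection state, the small-packet flood forces roughly 23 times more of that work for the same number of bits.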


How enterprises need to rethink business continuity planning

Corporate culture is another important but often overlooked element of business resiliency. Before COVID-19, remote work was already rising in popularity as more digitally savvy millennials and Gen-Zers entered the workforce. But there was still a deeply ingrained preference among corporate leaders to have most workers physically present in corporate facilities. Some believe employees lose creativity and productivity when working from home, analysts say. Others think it's just human nature to slack a bit when not under the watchful eye of management. Neither sentiment is necessarily validated by statistics (the opposite may actually be true). And if enterprises are going to evolve and enable more remote workers, their cultures will also need to adjust to make way for that, analysts say. "We've always had a lot of societal and cultural resistance to remote work where management just felt that if it didn't see you, it couldn't be confident you were doing your job," says Grossner. "But when COVID-19 hit, guess what? All of a sudden, everyone is working from home, and we find out the model actually can work. A big cultural barrier now seems to be permanently lifting. I don't know if we'll ever go back to that old way of thinking, and future continuity planning should not allow it."


How IT Pros Can Lead the Fight for Data Ethics

A key challenge lies in the many ways IT teams must determine and respond to data ethics within the technical specification of a given system. Examining how data is processed helps to surface the norms at risk. The decision from Amazon, IBM, and Microsoft to halt the availability of their facial recognition AI software to police departments is an example. The decision is partly a response to police brutality protests in the wake of the police killings of George Floyd, Tony McDade, Breonna Taylor, and other Black people across the country. It is also a response to questions raised about regulating surveillance tech and about facial recognition's negative bias involving people of color. So how can IT best lead the ethics fight? Establishing an observability process within given DataOps and AIOps initiatives can help. Observability is a collection of processes to monitor and analyze data within a system. The purpose of observability is to assist developers and operators in understanding issues that appear within distributed systems. Observability reveals critical paths, reducing development time to remove errors and programmatic bugs. The issues associated with those errors and bugs can lead to ethical breaches.
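
As a deliberately simplified illustration of how an observability hook could surface an ethics-relevant signal, consider the following Python sketch; the class, threshold, and groups are assumptions invented for this example rather than part of any DataOps product:

    # Minimal sketch of an observability hook for a data pipeline or model.
    # Names and thresholds are illustrative assumptions, not a known API.
    import logging
    from collections import defaultdict

    logging.basicConfig(level=logging.INFO)

    class PredictionMonitor:
        """Tracks outcome rates per group so skew is visible, not silent."""

        def __init__(self, alert_gap: float = 0.10):
            self.alert_gap = alert_gap                  # max tolerated rate gap
            self.counts = defaultdict(lambda: [0, 0])   # group -> [positives, total]

        def record(self, group: str, positive: bool) -> None:
            self.counts[group][0] += int(positive)
            self.counts[group][1] += 1

        def check(self) -> None:
            rates = {g: p / t for g, (p, t) in self.counts.items() if t}
            if rates and max(rates.values()) - min(rates.values()) > self.alert_gap:
                logging.warning("Outcome rates diverge across groups: %s", rates)

    # Usage: feed every prediction through the monitor, check periodically.
    mon = PredictionMonitor()
    for group, positive in [("a", True), ("a", True), ("b", False), ("b", False)]:
        mon.record(group, positive)
    mon.check()  # logs a warning: the rate gap here is 1.0 > 0.10

The point is not the specific metric but the habit: when outcome disparities are logged as routinely as latency and error counts, potential ethical breaches show up as operational alerts rather than headlines.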



Quote for the day:

“Let no feeling of discouragement prey upon you, and in the end, you are sure to succeed.” -- Abraham Lincoln

Daily Tech Digest - June 25, 2020

How Will 5G Networks Get Faster? Densification

The most basic form of densification involves increasing the number of cell towers. Problem is, that's not really easy, particularly because network carriers are running into challenges with getting approval from local governments and landowners for adding new transmission points. The situation has become so challenging, in fact, that the US FCC recently had to issue a ruling clarifying the rules for 5G network infrastructure deployment. The new ruling essentially limits how much local governments can slow upgrades to existing network infrastructure, such as cell towers. Additionally, most of the early concepts for 5G densification depended on building and installing a lot of small cells—essentially shipping box- or even shoebox-size devices that could be used to enhance the network. The problem is, most of the small cell efforts were targeted towards mmWave, and it's clear now that those efforts (and the technology overall) are going to take much longer to deploy widely than initially expected. Not only is it difficult to get small cells installed, but the costs for the equipment remain high—and the ROI isn't as clear for many network providers as they first thought.


Silos, Politics and Delivering Software Products

Misalignments between teams can centre on priorities, scope or direction. Imagine that a team finds itself blocked as it is unable to finish a piece of work until work is completed by another team. The other team might not consider this a high priority item for them, in which case there is a misalignment of priorities. Or the other team might consider the work outside of their scope, in which case it needs to be resolved who should deliver the work. Or, more seriously, it could be a disagreement about direction - the other team might understand the request but consider it a bad request that they do not want to see fulfilled ... A common practice is to dedicate teams to particular features within an intended system. Each team has members with different specialisations and is intended to be able to build a ‘vertical slice’ of functionality that could be delivered to users. For example, the team wouldn’t contain only frontend developers so that they would have to wait for backend developers from another team in order to progress. Teams that provide outputs for other software teams rather than for users are called component teams rather than feature teams.


Office life will never be the same again. Here's what comes next

Research suggests that, prior to the coronavirus outbreak, only about 5% of the UK's 33 million workers worked mainly from home. Despite the regulations, it is relatively easy for employers to refuse a request for home working on one of the prescribed grounds outlined in the legislation, and employees have little opportunity for recourse. As such, presenteeism ruled: workers needed to be seen in the office to ensure they were working. Now, of course, everything has changed. As Wincanton CIO Richard Gifford recognises, the lockdown-enforced shift to remote working is a total reversal of the usual approach in most big companies until now. "Our HR policy was written in a way that previously, if you wanted to work at home, you could, but you'd have to come in and give some good reasons and a decision would be made. Now we're saying, 'you will work at home and you need to give me some good reasons why you need to be in the office'. So, it's a complete turnaround," he says. Gifford has had to maintain a limited on-site presence to manage his firm's on-premise data centre during the outbreak. Yet the vast majority of the firm's 4,500 office-based staff are working at home – and the result, aided by a solid VPN and a bunch of cloud applications, is likely to be a long-term shift in the perception of remote working.


Three Painful Lessons You Can Avoid with Your APIs and Mobile Apps

If something goes wrong with a website or even an API, you can publish an updated version without the end user even being aware of it. Not so with mobile. If you release a new version, Apple and Google could take hours or days to approve and publish it. Even if you get it fast-tracked, and it is in the App Store hours later, you have no guarantee that the end user will install the updated version with the fix. That is why it is absolutely critical to have an API and mobile strategy and follow best practices when designing, developing, and publishing your mobile apps and APIs. ... When projects start going over-budget or over-time, proper testing is often one of the first things that gets cut or reduced. Your APIs and mobile apps need testing. You need a plan for this as well, because having a few people randomly using the app IS NOT TESTING! As an enterprise business, you absolutely must have thorough test plans. These need to be created by an experienced, senior QA Architect. If you are outsourcing your testing, get involved to see who is creating the plan and have a 2nd (or 3rd) set of eyes on the draft and final plans to be sure they are solid and thorough.


Goodbye Xamarin.Forms, Hello MAUI!

MAUI is essentially the next evolution of Xamarin.Forms. It is a framework that will allow us to create native user interfaces for desktop and mobile devices, and the most surprising thing about this is that it has a single code base and a single project. In other words, no more different heads for each mobile OS (iOS and Android)! Alongside MVVM, MAUI will also support the Elm Architecture, popularly known as the MVU (Model View Update) design pattern. MVU encourages a code-first development experience that rapidly updates the UI. Microsoft understands the power of the MVU pattern and has introduced a new unified way to build cross-platform native front ends from a single code base. ... With the arrival of MAUI, we will have a single project. We can also choose deployment between different devices or emulators even if we have a single project. But what about application resources like images? The tooling will manage shared resources on each platform as well as the management and creation of images adapted to each platform. ... MAUI is a renewed Xamarin.Forms with similar characteristics but greater capabilities. The structure of Xamarin.Native (Xamarin.iOS and Xamarin.Android) will not change; only the name will change in .NET 6.
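
For readers new to the pattern, here is a minimal, language-agnostic sketch of the MVU loop (Python is used only for brevity; in MAUI the code would be C#): state lives in an immutable model, update is the one place state changes, and the view is re-rendered from the model after every message.

    # Language-agnostic sketch of the MVU (Model-View-Update) loop.
    # A tiny counter "app": update() is a pure function of (model, message),
    # and view() re-renders entirely from the current model.
    from dataclasses import dataclass, replace

    @dataclass(frozen=True)
    class Model:
        count: int = 0

    def update(model: Model, msg: str) -> Model:
        """Pure function: (old model, message) -> new model."""
        if msg == "increment":
            return replace(model, count=model.count + 1)
        if msg == "decrement":
            return replace(model, count=model.count - 1)
        return model

    def view(model: Model) -> str:
        """Re-renders the whole UI from the current model."""
        return f"[ - ] count = {model.count} [ + ]"

    model = Model()
    for msg in ["increment", "increment", "decrement"]:
        model = update(model, msg)
        print(view(model))  # count = 1, then 2, then 1

Because all state changes funnel through one pure function, the UI can be redrawn deterministically after every message, which is what enables the rapid update experience the pattern is known for.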


Building the Future with 5G and Wi-Fi 6

Based on the Deloitte survey results among executive decision-makers, network metrics don't seem to be driving the shift, as a majority of decision-makers are “satisfied” or “extremely satisfied” with a range of traditional performance characteristics of their current wireless networks, including reliability and resilience, data speed, latency, coverage, location accuracy, energy efficiency, and device density. What is driving the expected shift are signs that organizations are looking beyond traditional network metrics such as reliability and coverage and are instead adopting advanced wireless technologies in hopes of unlocking competitive advantage and creating new avenues for innovation in their operations and offerings. The survey also found that current technology is often considered a barrier to the innovative use cases organizations would like to target. This strong belief in the transformative power of advanced wireless connectivity is especially impressive, considering that both 5G and Wi-Fi 6 are the latest generations of technologies that originated more than 20 years ago and have been evolving ever since.


Study illustrates huge potential of human, artificial intelligence collaboration in medicine

In an experiment created by the study authors, 302 examiners and/or doctors had to assess dermoscopic images of benign and malignant skin changes, both with and without the support of Artificial Intelligence. The AI assessment was provided in three different variants. In the first case, the AI showed the examiner the probabilities of all possible diagnoses; in the second case, the probability of a malignant change; and in the third case, a selection of similar images with known diagnoses, similar to a Google image search. As a main finding, the authors observed that only in the first case did collaboration with AI improve the examiners' diagnostic accuracy; that improvement was significant, however, with a 13% increase in correct diagnoses. "Interestingly, less experienced examiners benefit more from AI support than experienced ones. Less experienced examiners trusted AI more than did the experienced ones. The latter only accepted the AI suggestions to change their original diagnosis in cases where they themselves were unsure," the authors wrote. A second experiment showed that all examiners, even acknowledged experts, can be misled by AI if the output is changed to indicate false diagnoses.


How to Build an API Testing Program with MuleSoft Anypoint

MuleSoft's Anypoint Platform includes a native testing framework (MUnit) that allows MuleSoft experts to conduct unit and API tests on Mule apps. You can also mock APIs to run tests (shift left) before going live. However, MUnit specifically tests Mule flows. The reality is that today's average business transaction involves 35 or more API connections. While MUnit frees developers and engineers to productize APIs easily in Anypoint Studio, it does not extend testing coverage to the APIs that are outside of your Anypoint platform. If your team depends solely on MUnit for global API quality, it will not have the clarity to uphold internal and external SLAs for API uptime and performance. The ultimate goal of modern API testing is to ensure that functional, integration, performance, and data-driven tests capture the entire API consumer flow. This is primarily to catch the most common root cause of API problems: human error. ... With human error behind most of your current and future API quality headaches, you must ask whether siloed testing efforts, even if they are bridged by sending test result data to a platform like Elastic, can connect the dots to detect human error.
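
To make that distinction concrete, here is a hedged sketch of what a consumer-flow test can look like when it spans APIs rather than individual Mule flows. It is written in Python with requests and pytest purely as an illustration; the endpoint, payload, and fields are hypothetical, and this is not MUnit syntax:

    # Sketch of an end-to-end "API consumer flow" test that exercises calls
    # outside any single platform. Endpoints and fields are hypothetical.
    import requests

    BASE = "https://api.example.com"  # hypothetical gateway URL

    def test_order_flow():
        # Step 1: create a resource (functional test).
        created = requests.post(f"{BASE}/orders",
                                json={"sku": "ABC", "qty": 2}, timeout=5)
        assert created.status_code == 201
        order_id = created.json()["id"]

        # Step 2: read it back through a second API (integration test).
        fetched = requests.get(f"{BASE}/orders/{order_id}", timeout=5)
        assert fetched.status_code == 200
        assert fetched.json()["qty"] == 2          # data-driven check

        # Step 3: enforce the latency SLA (performance check).
        assert fetched.elapsed.total_seconds() < 0.5

Run under pytest, a single test like this follows the create-then-read journey end to end, folding functional, integration, data-driven, and performance checks into one consumer flow instead of leaving them in separate silos.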


Effective Governance of Data Requires Understanding of Risk Relevance

Effective governance of data and information risk management require discipline in specification: identifying and organizing the different types of data vulnerabilities, determining the threats that can exploit those weaknesses, understanding the scope of the consequences, and assessing the probability that a threat will take place and, if it does, assessing the probability that there will be consequences. You might think that the best approach would be enumerating the different vulnerabilities and then working from there to consider how those vulnerabilities can be exploited. And, in fact, there are some published guidelines and practices that suggest surveying your organization to assess, describe, and categorize risks as a prelude to developing controls and monitoring for information risk events. Yet, that approach may not be the most practical to take if your information risk management framework is to be aligned with resource allocation and preventative controls. One immediate challenge is that the domain of potential risks is expansive, and one can survey an entire enterprise of data assets and consider the risks before any substantive vulnerabilities are revealed.
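
Read as a formula, that specification discipline pairs each vulnerability with a threat and multiplies the two probabilities by the impact to get an expected loss that can drive resource allocation. A minimal Python sketch, with invented names and figures:

    # Hedged sketch of the specification discipline described above: each
    # risk pairs a vulnerability with a threat, and expected loss is the
    # product of the two probabilities and the impact. Figures are invented.
    from dataclasses import dataclass

    @dataclass
    class DataRisk:
        vulnerability: str
        threat: str
        p_threat: float        # probability the threat occurs
        p_consequence: float   # probability of consequences, given the threat
        impact: float          # cost of those consequences, in dollars

        def expected_loss(self) -> float:
            return self.p_threat * self.p_consequence * self.impact

    risks = [
        DataRisk("unencrypted backups", "media theft", 0.05, 0.60, 2_000_000),
        DataRisk("stale access grants", "insider misuse", 0.20, 0.30, 500_000),
    ]

    # Rank risks so controls and budget go to the largest expected losses.
    for r in sorted(risks, key=DataRisk.expected_loss, reverse=True):
        print(f"{r.vulnerability:>22}: ${r.expected_loss():,.0f}")

Ranking by expected loss ($60,000 versus $30,000 in this toy example) is one way to keep the framework aligned with resource allocation rather than with an open-ended survey of every conceivable vulnerability.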


Culture of Innovation: Data Management on the IoT Edge

Building these more complicated architectures creates a whole host of challenges. I think it's definitely one of the top challenges we face. It depends on how distributed this architecture is, for any given application. But if you're ever depending on these edge devices, which may have keys that allow them to access your corporate network, because they have to be able to send data back to your centralized system, you know security is a huge risk there. These are devices that are out in the field and have less physical security. A use case for one of our customers is that they have computers running our software on every train locomotive in North America. At every switch and every train crossing are these shacks at the side of the railroad tracks, hundreds of miles from civilization. Maintenance is an issue, and security is an issue, because potentially someone could walk up and tamper with these systems. So you need to make sure that the platform is secure, you know, from the CPU up, to avoid any sort of potential security risk.



Quote for the day:

"Managers help people see themselves as they are; Leaders help people to see themselves better than they are." -- Jim Rohn