Daily Tech Digest - April 13, 2020

Banks should be cautious with use of AI in cybersecurity
Financial institutions must nevertheless be prepared for cybercriminals to counter new defences with continually evolving methods of their own. Instead of executing cyberattacks with the intention of stealing money or making fraudulent payments, cybercriminals may target the machine learning processes themselves, embedding fraudulent mechanics into the way the AI engines work. “One of the big concerns, especially at the regulatory level for the future, is ultimately the underlying data integrity,” Holt says. “So, if the attackers don’t do big enormous payouts immediately but attempt to alter the underlying data, how would that be spotted?” Therein lies the danger for financial services companies that are overly optimistic about the potential of AI in cybersecurity. Dries Watteyne, head of SWIFT’s cyber fusion centre, urges caution in this area. “When talking about the potential of machine learning, I think we shouldn’t forget everything we achieved to date without it.”


Windows Subsystem for Linux 2 Moving into General Availability

As discussed previously, WSL 2 is a change in architecture from WSL 1. Where WSL 1 required a translation layer between Linux system calls and the Windows NT kernel, WSL 2 ships with a lightweight VM running a full Linux kernel. This VM runs directly on the Windows hypervisor layer. The kernel provides full system call compatibility and allows apps like Docker and FUSE to run natively on Linux. With this new implementation, the Linux kernel has full access to the Windows file system. The new release brings large performance improvements, especially for interactions that require accessing the file system. According to Craig Loewen, Program Manager at Microsoft, this could be a 3 to 6 times performance improvement depending on how file-intensive the application is. He further mentions that unzipping tarballs could see a 20 times performance increase. With the upcoming new version of Windows 10, currently known as version 2004, Microsoft has indicated that the installation and updating process of WSL 2 will be streamlined.


Zoom vs. Microsoft Teams: Video chat apps for working from home, compared


Teams has a similar feel to Slack -- you can talk to team members privately or in specific channels, and you can call attention to the whole group or just an individual with the mention feature.  You can video chat with up to 250 people at once with Teams, or present live to up to 10,000 people. Share meeting agendas prior to a conference, invite external guests to join a meeting, and access past meeting recordings and notes. Meetings can be scheduled in the Teams app or through Outlook. ... The Zoom video conference app works for Android, iOS, PC and Mac. The app offers a basic free plan that hosts up to 100 participants. There are also options for small and medium business teams ($15-$20 a month per host) and large enterprises for $20 a month per host with a 50-host minimum. You can adjust meeting times, and select multiple hosts. Up to 1,000 users can participate in a single Zoom video call, and 49 videos can appear on the screen at once. The app has HD video and audio capabilities, collaboration tools like simultaneous screen-sharing and co-annotation, and the ability to record meetings and generate transcripts.



Creating a Text-to-speech engine with Google Tesseract and Arm NN on Raspberry Pi

The network’s architecture can be divided into three significant steps. The first takes the input image and extracts features using several convolutional layers. These layers partition the input image horizontally, and for each partition they determine a set of image-column features. The sequence of column features feeds the second step, the recurrent layers. These recurrent neural networks (RNNs) are typically composed of long short-term memory (LSTM) layers. LSTMs revolutionized many AI applications, including speech recognition, image captioning, and time-series analysis. OCR models use RNNs to create a so-called character-probability matrix. This matrix specifies the confidence that a given character is found in a specific partition of the input image. The last step uses this matrix to decode the text from the image, usually with the connectionist temporal classification (CTC) algorithm. CTC aims to convert the matrix into the word, or sequence of words, that makes sense.
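The final decoding step lends itself to a short sketch. Below is a minimal, illustrative greedy (best-path) CTC decoder in Python; the alphabet and probability matrix are invented for the example, and production systems typically use beam-search CTC decoding rather than a plain argmax.

```python
# Toy greedy (best-path) CTC decoding. Alphabet and matrix are invented.
BLANK = "-"  # CTC's special "no character" symbol
ALPHABET = [BLANK, "c", "a", "t"]

# Rows = image partitions (time steps), columns = per-character confidences.
prob_matrix = [
    [0.10, 0.80, 0.05, 0.05],  # most likely "c"
    [0.70, 0.10, 0.10, 0.10],  # most likely blank
    [0.10, 0.10, 0.70, 0.10],  # most likely "a"
    [0.10, 0.10, 0.60, 0.20],  # "a" again; the repeat collapses
    [0.10, 0.05, 0.05, 0.80],  # most likely "t"
]

def greedy_ctc_decode(matrix, alphabet):
    """Pick the top character per step, collapse repeats, drop blanks."""
    best_path = [alphabet[row.index(max(row))] for row in matrix]
    decoded, prev = [], None
    for ch in best_path:
        if ch != prev and ch != BLANK:
            decoded.append(ch)
        prev = ch
    return "".join(decoded)

print(greedy_ctc_decode(prob_matrix, ALPHABET))  # prints "cat"
```

Collapsing repeats before dropping blanks is what lets CTC tell a character that spans two adjacent partitions apart from a genuine double letter, which would be separated by a blank.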


Collaboration answers the call

Senior Reporter Matthew Finnegan, who covers collaboration for Computerworld, addresses the question in the back of everyone's mind: "Remote working, now and forevermore?" Surveys show that the majority of people prefer to work from home — and in organizations that have had mature work-from-home policies for a while, many employees have settled into their new reality as if it's no big deal. The office won't go away overnight. But as long as productivity endures, and as collaboration tools inevitably improve, why not allow people to work wherever they like? Matthew and IDG TechTalk's Juliet Beauchamp discuss these and other possibilities on a special episode of Today in Tech. One thing's for sure: Videoconferencing is proving itself the lifeblood of remote work. But can networks handle it? By all accounts, the public internet and even cloud services have held up remarkably well. Yet as analyst Zeus Kerravala observes in "Videoconferencing quick fixes need a rethink when the pandemic is over," written by Network World contributor Sharon Gaudin, those who return to the office and want to continue Zooming or Webexing could face obstacles.


Why Industries Should Prepare For Mass Blockchain Adoption

First and foremost, the token market is likely to be significantly reduced this year, and only the most highly demanded and well-developed projects will remain as the digital assets traded on exchanges that are increasingly being forced to comply with legal requirements. Another change this year will be a gradual transition to turnkey solutions. The idea of blockchain turnkey solutions was first presented by Bitwings, an official blockchain-based solution of the leading Spanish mobile operator Wings Mobile. Its goal was to create the most secure standards for e-devices without compromising the operating system and its performance. To integrate turnkey solutions, companies need to conduct internal research: analysis of the current market and existing problems, and the potential of the blockchain in different sectors. It’s also worth studying the existing centralized and decentralized solutions and deciding how to integrate the solution into production processes without disrupting their performance. The latter is the most important point; it is one that all executive officers should pay attention to. They must consider the most efficient options for integrating blockchain into their working processes.


DDR5 memory promises a significant speed boost

"What's important about DDR5 is the high level of integration it provides," says Jim Handy, principal analyst with Objective Analysis, an analyst firm specializing in the memory market. "The people who defined this spec took advantage of the fact that Moore's Law not only reduces DRAM's price per bit, but it also makes it cheaper to add increasing amounts of powerful logic to the chip. They have artfully used this to improve the CPU-DRAM bandwidth, to move the Memory Wall a little farther out." The Same Bank Refresh is a good example, Handy says. "For DRAM's entire history a chip couldn't provide data while it was being refreshed. Now Same Bank Refresh allows data to be accessed in banks that aren't undergoing refresh. This does a lot to improve data communication." So when will this start to show up? Last year an Intel roadmap was leaked to the hobbyist press that showed Intel was planning to move to DDR5 and PCI Express 5 (completely skipping PCIe v4) in 2021. Micron has begun sampling DDR5, Hynix said it plans to begin volume production at the end of this year, and Samsung plans to start DDR5 production next year.


Don’t Leave “Ethical Tech” Out of Your Digital Transformation Plan

Few organizations and their leaders develop an overall approach to the ethical impacts of technology use—at least not at the start of a digital transformation. In a recent study, just 35 percent of respondents said their organization’s leaders spend enough time thinking about and communicating the impact of digital initiatives on society. But in order to be truly savvy in the age of advanced, connected, and autonomous technologies, leaders must think beyond designing and implementing technologically driven capabilities. They should consider how to do so responsibly from the start. At Deloitte, we see a relationship between a company’s digital and technological progress—in other words, its tech savviness—and its focus on various ethical issues related to technology. Our research found that 57 percent of respondents from organizations considered to be “digitally maturing” say their organization’s leaders spend adequate time thinking about and communicating digital initiatives’ societal impact, compared with only 16 percent of respondents from companies in the early stages of their digital transformation.


Duplication, fragmentation hamper interoperability efforts, impact patient safety


Duplicate records might also contain incomplete or outdated information and can affect the quality of care by forcing clinicians to make care decisions without important information such as recent lab results, allergies and current medications. Back in 2019, Verato and AdVault partnered on a cloud-based patient matching platform which aims to expand secure identity matching so care teams have seamless access to medical records. Patient matching specialist Verato, which has also partnered with healthcare IT security specialist Imprivata, is of the belief that alignment of disparate patient record platforms will help eliminate duplicate records, establish more accurate care histories and improve patient safety. In a 2016 Ponemon Institute survey, 86% of respondents said they witnessed a medical error as a direct result of misidentification, and indicated that 35% of all denied claims are due to misidentification, which can cost hospitals up to $1.2 million a year. "Many systems still do not communicate and store data in disjointed architectures and an upsurge of identifiers continues to be created," Doug Brown, managing partner of Black Book, said in a statement.


COBOL, COVID-19, and Coping with Legacy Tech Debt

With a history that stretches back three generations, COBOL was developed for a different breed of compute, Edenfield says. “These were massive machines that did certain things like number crunching,” he says. “It wasn’t fancy.” COBOL was designed to move across multiple machines and frankly to be readable, Edenfield says. “People could learn it quickly and it was easier than an assembly language where you are programming in very cryptic commands.” As new compute demands emerged, programming languages evolved, Edenfield says. Agile development and other modern processes can be more efficient and fundamentally different from how COBOL and other early programming languages were handled. Despite such advances, it is a challenge to escape those legacy roots. “Because COBOL was so prevalent, they can’t get out of it,” he says. “There’s so much of it. It’s running all the backroom, payment processing for all your major financial institutions; all your big companies have it.” It was common for organizations to constantly build up COBOL-based systems for decades, Edenfield says, with the programmers retiring or moving on. “Pretty soon, the people who wrote the systems aren’t there anymore,” he says.



Quote for the day:


"Many people go fishing all of their lives without knowing it is not fish they are after." -- Henry David Thoreau


Daily Tech Digest - April 12, 2020

AI (Artificial Intelligence) Projects: Where To Start?

You don’t want to spend time and money on a project and then realize there are legal or compliance restrictions. This could easily mean having to abandon the effort. “First, customer data should not be used without permission,” said Debu Chatterjee, who is the senior director of platform AI engineering at ServiceNow. “Secondly, bias from data should be mitigated. Any model which is a black box and cannot be tested through APIs for bias should be avoided. The risk of bias is present in nearly any AI model, even in an algorithmic decision, regardless of whether the algorithm was learned from data or written by humans.” In the early phases of an AI project, there should be lots of brainstorming. This should also involve a cross-section of people in the organization, which will help with buy-in. The goal is to identify a business problem to be solved. “For many companies, the problem is that they start with a need for technology, and not with an actual business need,” said Colin Priest, who is the VP of AI Strategy at DataRobot. “It reminds me of this famous quote from Steve Jobs, ‘You’ve got to start with the customer experience and work backwards to the technology. You can’t start with the technology and try to figure out where you’re going to sell it.’”


How to Reduce Remote Work Security Risks

Employees should remain cautious of downloading random applications or software to avoid malware, viruses, or insecure protocols. If they’re unsure, they should check with IT support or their security team. Also, remind remote workers to be careful when sharing confidential data. They should use company-issued apps for file sharing, storage of confidential documents, and communication. Let them know this is for their own safety, too: the company has protective measures around these apps and can monitor for suspicious behavior. Consistently communicate with your employees. Ultimately, keeping everyone informed on how to secure their home technologies and practice security in their everyday lives trumps technology alone. Use a variety of communication channels to keep them up to date on the latest security threats and how to reduce the risk to their personal and company information. Make sure your security and IT experts are household names, available for questions and for sharing red flags.


Automated Machine Learning Is The Future Of Data Science

The objective of autoML is to shorten the cycle of trial and error and experimentation. It burns through an enormous number of models, and the hyperparameters used to configure those models, to determine the best model available for the data presented. This is a dull and time-consuming activity for any human data scientist, no matter how skilled. AutoML platforms can perform this repetitive task more quickly and thoroughly, arriving at a solution faster and more effectively. The ultimate value of autoML tools is not to replace data scientists but to offload their routine work and streamline their process, freeing them and their teams to concentrate their energy and attention on the parts of the process that require a higher level of reasoning and creativity. As their roles change, it is important for data scientists to understand the full life cycle so they can shift their energy to higher-value tasks and sharpen their skills to further elevate their value to their companies.
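As a rough illustration of the loop autoML automates (not any particular vendor's API), here is a toy random search over models and hyperparameters; the search space and scoring function are invented stand-ins for real training and cross-validation.

```python
# Toy random search standing in for autoML's automated trial and error.
import random

random.seed(0)  # deterministic for the example

SEARCH_SPACE = {
    "model": ["tree", "linear", "knn"],
    "depth": [2, 4, 8, 16],
    "lr": [0.001, 0.01, 0.1],
}

def toy_validation_score(config):
    """Stand-in for actually training and cross-validating a model."""
    base = {"tree": 0.80, "linear": 0.75, "knn": 0.72}[config["model"]]
    return base + 0.01 * (config["depth"] % 5) - abs(config["lr"] - 0.01)

def random_search(space, n_trials=50):
    """Sample configurations at random and keep the best scorer."""
    best_config, best_score = None, float("-inf")
    for _ in range(n_trials):
        config = {key: random.choice(values) for key, values in space.items()}
        score = toy_validation_score(config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

best_config, best_score = random_search(SEARCH_SPACE)
print(best_config, round(best_score, 3))
```

Real autoML platforms replace the random sampling with smarter strategies (Bayesian optimization, successive halving) and the toy scorer with real training runs, but the loop structure is the same.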



How Hyperscale Storage Is Becoming More Accessible

It is a scale-out solution that enables you to scale compute and storage independently, through software-defined storage. So you can pick any client, any server, any network; we can run on a Quanta server, HP, or Dell; we can run with an Intel CPU, on AMD, or even on Arm. There are two main components that I want to touch on. The first one is NVMe over TCP. This is a standard that we invented together with Facebook, Dell, Intel, and a few others, and today the standard is already fully ratified. What we have here is a highly optimized user-space TCP stack that, combined with the NVMe stack, gives us the ability to support, in a very large data center, thousands of connections and thousands of containers in bare-metal or virtual environments. The second very important layer is the global FTL. FTL is a flash translation layer, the layer you can find in every SSD. At a high level, it performs the translation between a logical transaction to the storage system and a physical transaction to the flash.
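As a hypothetical, heavily simplified mental model of what a flash translation layer does (ignoring erase blocks, wear leveling, and garbage collection), consider a logical-to-physical page map that redirects every write to a fresh page:

```python
# Hypothetical, heavily simplified FTL: logical blocks map to physical
# pages, and every write goes to a fresh page because flash pages cannot
# be overwritten in place.
class ToyFTL:
    def __init__(self, n_physical_pages):
        self.free_pages = list(range(n_physical_pages))
        self.l2p = {}    # logical block address -> physical page
        self.flash = {}  # physical page -> stored data

    def write(self, lba, data):
        page = self.free_pages.pop(0)  # always pick a fresh page
        old_page = self.l2p.get(lba)
        if old_page is not None:
            del self.flash[old_page]
            self.free_pages.append(old_page)  # reclaimable (after erase, in reality)
        self.l2p[lba] = page
        self.flash[page] = data

    def read(self, lba):
        return self.flash[self.l2p[lba]]

ftl = ToyFTL(4)
ftl.write(0, "v1")
ftl.write(0, "v2")  # same logical block lands on a different physical page
print(ftl.read(0))  # prints "v2"
```

A "global" FTL, as described in the talk, lifts this mapping out of individual drives and maintains it across the storage system as a whole.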


COVID-19 is accelerating CI/CD adoption

As it turns out, the stakes are much higher given the now pervasive work-from-home arrangements most organizations now embrace. Talking with Rose in a phone interview, he stressed that even after years of DevOps discussion, “You still have a lot of companies that are doing most of their software testing on-prem and behind the firewall. The big installed base remains Jenkins in a proprietary data center.” This wasn’t ideal but it was workable when developers and operations professionals worked in an office environment, within the firewall. In a remote-only situation, getting access to the application development workflow is “tricky,” he stresses, because, in part, there’s no guarantee that you’ll be able to VPN in. And so companies are moving much faster than planned from private data centers to public clouds, in an effort to move workloads to a place where modern CI/CD can happen. “All the timelines have shrunk,” Rose says. Over the last two years companies have realized they need to move faster, but perhaps still struggled to start moving. “Now every company is trying to get apps to be cloud-enabled or cloud-native,” he stresses.


Zoom Promises Geo-Fencing, Encryption Overhaul for Meetings


In response to Citizen Lab's report, Zoom immediately promised to implement geo-fencing to ensure that no keys get routed via China, except for China-based users. Yuan attributed the routing of keys via China to a development error as the company attempted to rapidly scale up to meet a surge of demand, starting in China, where the COVID-19 outbreak began, leading the company to allow much greater, free access to its tool, in part, to support medical professionals. (Free versions typically otherwise have a 40-minute time limit for meetings.) "In February, Zoom rapidly added capacity to our Chinese region to handle a massive increase in demand," Yuan says. "In our haste, we mistakenly added our two Chinese data centers to a lengthy whitelist of backup bridges, potentially enabling non-Chinese clients to - under extremely limited circumstances - connect to them (namely when the primary non-Chinese servers were unavailable). This configuration change was made in February." He says Zoom fixed this problem immediately after learning of it via Citizen Lab. "We have also been working on improving our encryption and will be working with experts to ensure we are following best practices," Yuan says.


DevOps proponent lays it on the line: stop the madness and start automating


The final three steps are where many development teams tend to stumble, Davis says. "The most blissful thing about writing code or doing a complex admin task and so forth is when you get everything in your head, and you can see how everything fits together, and the world disappears, and you know exactly how your org works, and anybody could ask for any change and you can fix things. Developers live for that blissful feeling -- to know everything and fix anything." The catch is, a particular project ends, distractions distract, new projects begin, and time passes, Davis continues. "That disappears out of your working memory, right? There may be a day, or a week, or a month delay before you know that you broke something. By the time three weeks has elapsed, you forgot that you even built that thing. And if you remember that you built it, you forget how you built it, you forget exactly why you built it. You can make another change of course, but then it might take you another three weeks until you can get that back to your users." Multiply this by hundreds or even thousands of change requests within a large organization, and it's easy to see how things can go awry. DevOps brings order and flow to this potential madness, and Davis boils it down to a three-step process: development, innovation delivery, and operations.


New machine learning method could supercharge battery development for electric vehicles

"Computers are far better than us at figuring out when to explore—try new and different approaches—and when to exploit, or zero in, on the most promising ones." The team used this power to their advantage in two key ways. First, they used it to reduce the time per cycling experiment. In a previous study, the researchers found that instead of charging and recharging every battery until it failed—the usual way of testing a battery's lifetime—they could predict how long a battery would last after only its first 100 charging cycles. This is because the machine learning system, after being trained on a few batteries cycled to failure, could find patterns in the early data that presaged how long a battery would last. Second, machine learning reduced the number of methods they had to test. Instead of testing every possible charging method equally, or relying on intuition, the computer learned from its experiences to quickly find the best protocols to test. By testing fewer methods for fewer cycles, the study's authors quickly found an optimal ultra-fast-charging protocol for their battery.
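The explore/exploit idea can be sketched with a simple epsilon-greedy bandit. Note this is only a stand-in for the study's more sophisticated closed-loop optimization, and the protocols and lifetimes below are invented for illustration.

```python
# Invented protocols and lifetimes; epsilon-greedy stands in for the
# study's more sophisticated closed-loop optimization.
import random

random.seed(42)

TRUE_MEAN_CYCLES = {"protocol_A": 800, "protocol_B": 1100, "protocol_C": 950}

def run_experiment(protocol):
    """Stand-in for one (noisy, early-predicted) cycling experiment."""
    return random.gauss(TRUE_MEAN_CYCLES[protocol], 100)

def epsilon_greedy(protocols, n_rounds=200, epsilon=0.1):
    totals = {p: 0.0 for p in protocols}
    counts = {p: 0 for p in protocols}
    for _ in range(n_rounds):
        if random.random() < epsilon or not all(counts.values()):
            choice = random.choice(protocols)  # explore: try something at random
        else:
            # exploit: re-test the protocol with the best average so far
            choice = max(protocols, key=lambda p: totals[p] / counts[p])
        totals[choice] += run_experiment(choice)
        counts[choice] += 1
    return max(protocols, key=lambda p: totals[p] / max(counts[p], 1))

best = epsilon_greedy(list(TRUE_MEAN_CYCLES))
print(best)
```

Because exploitation concentrates trials on the front-runners, most of the experimental budget is spent refining the estimates that matter, which is exactly the saving the researchers describe.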


How Big Data and IoT Are Connected


Sensors upon sensors will crop up in all sorts of technologies, if they haven't already. Gigabytes and terabytes of information will whizz between devices at a frightening speed, and big data technologies will work even harder to store, process and take value from the collected yet often unstructured sensory information. End points in numerous locations will unlock an almost unlimited amount of data, and what happens to that data will be decided by those who work in the IoT and big data industries. This interaction will create two likely winners: first, the businesses that can profit from the information provided, and second, the end users who have better information to act on. Ultimately, businesses seeking to implement IoT into their products are also seeking greater profits, more productivity, higher efficiency and reduced costs. The development of big data technologies works in favor of IoT companies, with both seeking to strategize the ways in which we see and utilize data sets. As for the customer or end user, they will (if they aren't already) benefit from more useful information, as well as improved customer service and experiences.


In a related twist, customers will, with no surprise, first call their ISPs whenever there is any connectivity problem. In order to provide service, that means a larger call staff. However, what if the problem is a specific device? Even more complex, what if it's a specific application being run on the phone? An ISP which can quickly identify the root cause of the issue can either fix its own issues or point the customer towards the appropriate firm to provide service. Doing that efficiently will save enormous amounts of money. Identifying technical issues is a clear use case for AI. The question that needs to be answered is how close to the devices an AI system can run. On the ISP's servers, there's a distance that can obscure some issues. It would be much better to run AI on an individual home's modem or, even better, a router. The question becomes the footprint. Even runtime AI has not been known for highly efficient resource usage, and many companies have been working to address that for many IoT applications. One such company addressing the issue for the connected home is Veego. They claim to have AI inference that runs on home routers and modems in order to identify performance issues.



Quote for the day:


"As a leader, you set the tone for your entire team. If you have a positive attitude, your team will achieve much more." -- Colin Powell


Daily Tech Digest - April 11, 2020

Expressing The BIAN® Reference Model For The Banking Industry In The Archimate® Modeling Language


The expression of the BIAN model in ArchiMate has been a joint effort by BIAN and The Open Group, the stewards of the ArchiMate standard. The full details of this mapping can be found in the document “ArchiMate® Modeling Notation for the Financial Industry Reference Model: Banking Industry Architecture Network (BIAN)” published by The Open Group. To explain the use of BIAN in the ArchiMate language, The Open Group has published a case study whitepaper co-authored by one of us (Patrick), which uses the fictitious but realistic Archi Banking Group as an example. In this blog, we want to give you an impression of what this is about, picking and choosing some of the juiciest bits. For the full case study, please refer to the whitepaper. Archi Banking Group is the result of the acquisition of several banks in different countries, as most international banks are nowadays. This has come with the typical challenges of integration and cost control. In particular its fragmented information is becoming a compliance risk and the challenges of ‘open banking’ (e.g. PSD2) are difficult to meet. 



Development Versus QA: Ending the Battle Once and for All


The reason why minimizing blame is the number one priority for QA engineers is that in the QA realm, there is a general acceptance that bugs are always going to make it to production, no matter what. This is something we expect because a 100% guaranteed bug-free product would take years to ship rather than weeks, and would therefore be economically unviable. Since they know there will be problems to deal with no matter what they do, they want to show that they did everything in their power to prevent those problems. Naturally, they want to write as many tests as possible to minimize the risk of bugs that they should have caught. But since it’s impossible to write an infinite amount of tests, they have to prioritize what to test for. A QA team is given no data by which to prioritize what to test, so this prioritization is essentially a guessing game. It may be an educated guessing game based on experience and expertise, but it’s still predicting what users are most likely to do on an application without objective data as to what they really care about and how they really will use the application.



Microsoft Teams Promises Great Video Calls: No More Typing Or Dog Noises

As reported by VentureBeat, Microsoft has promised AI-enhanced innovations which will be able to suppress background noise – in real time – so your call can continue smoothly. Instead of merely reducing the impact that an air-conditioning unit has on the call, Teams will aim to suppress other noises not normally covered, such as doors slamming, over-excited typing on a computer keyboard or my beloved pooch having an inconvenient moment. The keyboard is a case in point: if you’re taking notes during an interview, you ideally don’t want that clickety-clack noise to intrude on the conversation. It’s those noises which aren’t “stationary”, as Microsoft says, that are hard to suppress without AI. It takes hundreds of hours of data to work out what’s desirable and what’s not, using audio books to represent voices and then other sources to create those pesky noises. All of which leads to the creation of a neural network that sets the AI working on the data to sort out what should be heard and what shouldn’t. The power of the cloud can be leveraged to help, providing fast, real-time analysis of what’s going on and deciding what should be heard by the person at the other end of the call and what shouldn’t.
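That data-creation recipe (clean voices from audio books, noise layered on top) can be sketched at toy scale. The "waveforms" below are invented lists of floats rather than real audio, and the mixing function is a deliberately minimal stand-in.

```python
# Invented toy "waveforms": mix clean speech with noise to build
# (noisy, clean) training pairs for a denoising model.
import random

random.seed(1)

def mix(clean, noise, noise_level=0.3):
    """Overlay scaled noise on a clean signal, sample by sample."""
    return [c + noise_level * n for c, n in zip(clean, noise)]

clean_speech = [0.0, 0.5, 1.0, 0.5, 0.0]  # stand-in voice waveform
keyboard_clack = [random.uniform(-1.0, 1.0) for _ in clean_speech]

noisy_input = mix(clean_speech, keyboard_clack)
training_pair = (noisy_input, clean_speech)  # model learns noisy -> clean
print(training_pair)
```

The trick is that the clean target is known exactly, because the noise was added synthetically; the network is then trained to recover the clean signal from the noisy one.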



Scientists develop AI that can turn brain activity into text

The system was not perfect. Among its mistakes, “Those musicians harmonise marvellously” was decoded as “The spinach was a famous singer”, and “A roll of wire lay near the wall” became “Will robin wear a yellow lily”. However, the team found the accuracy of the new system was far higher than previous approaches. While accuracy varied from person to person, for one participant just 3% of each sentence on average needed correcting – better than the 5% word error rate of professional human transcribers. But, the team stress, unlike the latter, the algorithm only handles a small number of sentences. “If you try to go outside the [50 sentences used] the decoding gets much worse,” said Makin, adding that the system is likely relying on a combination of learning particular sentences, identifying words from brain activity, and recognising general patterns in English. The team also found that training the algorithm on one participant’s data meant less training data was needed from the final user – something that could make training less onerous for patients.
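The word error rate quoted above is a standard metric: the word-level edit distance between the decoded sentence and the reference, divided by the reference length. A minimal sketch:

```python
# Word error rate: word-level edit distance / reference length.
def word_error_rate(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,                # deletion
                          d[i][j - 1] + 1,                # insertion
                          d[i - 1][j - 1] + substitution)
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("those musicians harmonise marvellously",
                      "the spinach was a famous singer"))
```

A WER of 0.05 means roughly one word in twenty must be corrected, which is the human-transcriber baseline the article compares against.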


IBM, Open Mainframe Project launch initiative to help train COBOL coders


Despite its age, COBOL is reliable and is still widely used -- there's an estimated 220 billion lines of COBOL still in use today. IBM, one of the founding organizations behind COBOL, continues to offer mainframes compatible with the language. The issue with COBOL now is that there are few programmers left with the skills to maintain legacy COBOL applications. Specifically, state agencies are struggling to find actively working COBOL engineers who can update their unemployment benefit systems to factor in new parameters for unemployment eligibility. To address this skills gap, IBM and the Linux Foundation's Open Mainframe Project have launched a new program to help connect states with programmers who have the COBOL language skills that are proving key in the push to manage the surging number of unemployment claims nationwide. ... "We've seen customers need to scale their systems to handle the increase in demand and IBM has been actively working with clients to manage those applications," said Meredith Stowell, VP of IBM Z Ecosystem. "There are also some states that are in need of additional programming skills to make changes to COBOL."


World Economic Forum explores blockchain interoperability

Blockchain interoperability is often viewed as a technical challenge, but there’s a lot more to it than that. The WEF divides it into three layers: the Business, the Platform, and the Infrastructure. The business aspect encompasses the governance of the blockchain and trust between the two networks, as well as data standardization. To share data, it has to be standardized. But often this homogeneity is focused within a single network as opposed to across networks. Other business aspects include incentives and the legal framework, which can be a bigger challenge across jurisdictions. The platform refers to the blockchain protocol, consensus mechanism, smart contract languages, and how users are authorized and permissioned. And the infrastructure looks at the hosting of servers in hybrid clouds, managed blockchains, and whether there are potentially proprietary components that might hinder interoperability. The report explores different projects that implement interoperability, mostly for public blockchains, including the well-known Cosmos and Polkadot. For enterprise blockchain, the WEF referred to Hyperledger Quilt, the open source implementation of Ripple’s Interledger, as well as the Corda Settler.


Cybersecurity officials say state-backed hackers taking advantage of pandemic

“Bad actors are using these difficult times to exploit and take advantage of the public and business,” Bryan Ware, CISA’s assistant director for cybersecurity, said in a statement. The agencies warned that hackers were also exploiting growing demand for work-from-home solutions by passing off their malicious tools as remote collaboration software produced by Zoom and Microsoft. Hackers are also targeting the virtual private networks that are allowing an increasing number of employees to connect to their offices, the agencies said. ... “Crowdsourced security platforms are built to simultaneously enable a remote workforce and help organizations maximize their security resources while benefiting from the intelligence and insights of a ‘crowd’ of security researchers,” Bugcrowd CEO Ashish Gupta told VentureBeat. “In the current environment, a lot of companies don’t have the required resources to secure and test remote environments where the majority of business is now taking place.”


AIoT and Intelligence on the Edge


Edge intelligence allows a high level of data to be processed and analyzed, and decisions to be made, locally, without being sent to the cloud. Take, for example, a self-navigating drone: instead of relying on a service hosted in the cloud to tell it where to go next, the drone itself is now able to decide its own path in the field, even when connections to cloud-hosted services are not reliable. ... For architects and program leads working on such initiatives within the company, it’s mainly a mindset change with regard to how the solution is designed, including the capabilities of the devices on the edge and where the decision-making step in a process happens. Feasibility for scenarios such as the drone automatically calculating its own path instead of relying on a cloud-hosted service is now better than before, and a few demos or proof-of-concept attempts could now move many of these stories off the backlog and bring implementation dates forward. While AIoT in its re-imagined, converged form may be new, the two original fields (AI and IoT) that merged to create AIoT are both mature and well into mainstream adoption.


What do CISOs want from cybersecurity vendors right now?

To companies providing cybersecurity solutions, the polled executives advised avoiding sales pitches that involve fear-mongering, dialing down cold calls and emails, and concentrating on nurturing existing relationships. “Messaging ought to be geared towards impacting an enterprise’s bottom line or community, rather than attempting to fearmonger or stoke panic over a situation already causing CISOs enough anxiety,” YL Ventures explained. “Cybersecurity executives feel quite unanimously about the marketing frenzy and, according to our sources, are compiling a ‘black list’ of vendors guilty of using this tactic.” Companies should concentrate on discovering what they can do to help their existing customers and discussing their customers’ experiences. Not only will this improve customer relations, but it will also provide helpful information that can inform the vendor’s future plans. Last but not least, vendors should consider making goodwill gestures. “Profiteering off of a world-wide tragedy will do vendors little service in the eyes of prospective customers. 41% of the CISOs we consulted with praised technology companies using their services to help other businesses and advised entrepreneurs to follow in their lead instead,” YL Ventures noted.


Why architecting an enterprise should not be IT-centric


The first and most important reason that architecture should not be IT-centric is the same reason why more and more IT functions are merged with ‘business functions’. A popular metaphor was (is?) that information should be like water coming out of a faucet. In that metaphor, the IT department is responsible for developing IT to deliver the information needed by the ‘business’. The business asks for ‘information provisioning’; the IT department delivers. This ‘what — how’ division has been the reason for non-functioning business/IT cooperation in lots of organisations over the past decades. An enterprise in general does not need ‘information’ as such; it needs resources and technology to execute business processes. The type of technology is not very important from a business perspective. It could be humans doing the job, mechanical or digital technology, and mostly it will be a mesh of all these types of technology. As a side remark: yes, data as a source for doing data intelligence could be seen as a product delivered by an organisational department, but that is only a small part of the totality of digital technology.



Quote for the day:


"Conviction is worthless unless it is converted into conduct." -- Thomas Carlyle


Daily Tech Digest - April 10, 2020

WiFi for Enterprise IoT: Why You Shouldn’t Use It


It’s the job of the local IT team to make sure their enterprise’s IT infrastructure is secure and reliable. Connecting dozens, hundreds, or even thousands of devices to that infrastructure poses a high risk to both security and reliability while offering little upside to the IT team. It may be true that your IoT solution will generate immense value for the enterprise to which you’re deploying, but this value often does not accrue to the IT team directly. The local IT team will have other internal requests on their plate, and supporting your IoT deployment will likely be low on their list of priorities. This means that the stakeholders you need most, due to their understanding of and control over the local WiFi setup, are the least incentivized to help you. Let me be clear: I’m not attacking IT teams generally, but pointing out the inherent misalignment of incentives even with the most capable and well-meaning IT teams. ... The lack of end-to-end control means that the success or failure of your IoT solution doesn’t rest solely within your hands. Customers don’t care why their shiny new IoT solution isn’t working or that it’s not your fault; they just care that it isn’t working.



10 Ways to Spot a Security Fraud

The Latin phrase "caveat emptor" has become an English proverb, and for good reason. "Let the buyer beware" is an axiom that nearly all of us are familiar with. Most of us know the phrase in the context of retail purchases. We were taught, or have learned over time, to never take sellers at their word. We must always perform the appropriate research before making a purchase. In security, unfortunately, we must practice a different type of caveat emptor. In recent years, security has become a hot field. And sadly, where there is budget and focus, there are also frauds and deceivers. There is no shortage of people presenting themselves as security experts. Some of them truly are. The rest of them, however, are keen to take advantage of security professionals who haven't yet learned to filter the real security experts from the fakes. ... Honest, hard-working security professionals have no problem emailing or otherwise putting agreements into writing. It's very common for a meeting to result in a follow-on email with minutes and action items.


The CSI Effect Comes to Cybersecurity


The problem is that forensic science is often portrayed as providing definite and irrefutable proof when the truth is that, outside of DNA analysis, forensic science should only be used as supplementary weight to support an allegation. In reality, forensic science is used relatively sparingly, especially when eye-witness, circumstantial and alibi evidence is available. It’s comparatively expensive, time-consuming and rarely the definitive evidence that TV suggests. When it comes to cybersecurity investigations, instead of swabs, fingerprints and fibers, a key source of evidence is system logs. Everything from applications to devices is capable of generating an audit trail, ‘logging’ activities and events. At its simplest, if we have a record of logons to a system, and we know when our breach happened, we have a cyber ‘smoking gun’. If we can use log data for a reconstruction post-attack, why can’t log events be used to pre-empt a breach, providing an early warning that suspicious activity is taking place? This is the promise of contemporary SIEM technology: an automated system to capture sufficient evidence to not just understand the timeline of a breach, but to detect the warning signs of an attack before it happens.
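The ‘smoking gun’ idea above can be sketched in a few lines. Here is a minimal, hypothetical example (the log-line format, field names and sample users are invented for illustration, not taken from any particular SIEM): given timestamped logon records and a known breach window, pull out the logons that fall inside it.

```python
from datetime import datetime

def logons_in_window(log_lines, start, end):
    """Return (timestamp, user) pairs for logons inside [start, end].

    Assumes a simple 'ISO-timestamp EVENT username' line format.
    """
    hits = []
    for line in log_lines:
        parts = line.split()
        if len(parts) != 3 or parts[1] != "LOGON":
            continue  # skip malformed lines and non-logon events
        ts = datetime.fromisoformat(parts[0])
        if start <= ts <= end:
            hits.append((ts, parts[2]))
    return hits

logs = [
    "2020-04-09T08:55:00 LOGON alice",
    "2020-04-09T23:42:10 LOGON svc_backup",
    "2020-04-10T09:01:00 LOGOFF alice",
]
# Breach known to have occurred overnight: who logged on then?
breach = logons_in_window(
    logs,
    datetime(2020, 4, 9, 23, 0),
    datetime(2020, 4, 10, 1, 0),
)
```

A real SIEM does the same correlation continuously and across many sources, which is what turns post-attack reconstruction into pre-emptive alerting.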


Security-by-Design Principles Are Vital in Crisis Mode

As organizations move to expand remote working and automation capabilities during the crisis, they are more likely to make mistakes. “You can’t let either the technology or the new business processes outpace the security behind it. You need to ensure that your internal security team is a part of every decision you make regarding new technology, processes or ways of working.” Experts recommend making security a consideration at the earliest possible stage when planning on technology deployments. “Make sure you bring in the stakeholders, the business as well as the operators into security discussions,” recommended Bob Martin, co-chair of the Software Trustworthiness Task Group at Industrial Internet Consortium. “You need to consider [security] as one of the primary aspects of any solution and, like the foundations of a house, everything else is built on top of that,” said Andrew Jamieson, director, security and technology at UL. Organizations that neglect to build a correct foundation risk rebuilding it or “at least spend a great deal of time and effort fixing something that could have been much more easily remedied earlier on,” Jamieson said.


CD Foundation Serves Up Tekton Pipelines Beta

The beta release of Tekton Pipelines is significant because it signals that the project is now stable enough to be incorporated into DevOps platforms, and from here on it will follow the same deprecation policies as Kubernetes in terms of supporting previous releases. However, Wilson noted that Tekton Triggers, Tekton Dashboard, the Tekton Pipelines CLI and other components are still alpha and as such may evolve from release to release in a way that is not necessarily backward-compatible just yet. In the meantime, the Tekton Pipelines team is encouraging all Tekton projects and users to migrate their integrations to the latest version of the Custom Resource Definitions (CRDs) that make up the supplied application programming interface (API), and is making a migration guide available. The Tekton Pipelines project is one of several initiatives being advanced under the guidance of the CD Foundation, an arm of The Linux Foundation. Other projects include Jenkins and Jenkins X, a pair of open source CI/CD projects originally developed by CloudBees, and Spinnaker, a CD platform originally created by Netflix.


ARming a new industry: Manufacturing can fully realise the potential of AR


AR is a frontrunner to help minimise machine downtime and streamline the supply chain process. For instance, when engineers need to communicate with off-site experts to maintain machinery, on-screen 3D annotations can be used to direct less experienced technicians. This is a crucial aspect of AR as it can help to address any skill gap deficits being experienced. Being able to access the knowledge of an expert technician to support in-house or field technicians decreases the amount of time needed to repair machines and get them back up and running. The technology is also being used as an invaluable training tool, allowing manufacturers to assess and maintain more stringent levels of quality control, as well as developing talented engineers. Furthermore, AR can help in more recent developments such as the proactive maintenance process. Using advanced analytics, manufacturers can identify potential errors and use remote experts and AR annotated displays to guide on-the-ground workers to fix problems before they become a major threat to the manufacturing line.


Zoom, Netflix discuss remote network management challenges


Application performance problems are typically not network problems and have more to do with the user experience. As more employees work from home, IT teams may assume UX issues stem from the organization's network rather than from the user's application performance. These issues may also cause network engineers to doubt their skill sets in this unfamiliar territory, Viavi said. However, if a business aims to operate as usual -- even in an unusual time -- then network engineers should likewise approach network issues and remote network management as usual. This means conducting packet analysis and other standard troubleshooting techniques to determine whether an issue stems from the business network or from a user's application or network connection. Netflix's Temkin said his team faced occasional strain in last-mile connections, as did Dzmitry Markovich, senior director of engineering at Dropbox.
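As a small illustration of the "troubleshoot as usual" advice, the sketch below summarizes round-trip-time samples (for instance from ping) so an engineer can distinguish steady last-mile latency from intermittent spikes. The function name and sample values are assumptions for illustration, not anything from the article or a specific tool.

```python
def rtt_summary(samples_ms):
    """Summarize round-trip-time samples (milliseconds).

    Returns min, median, p95 and max so one-off spikes stand out
    from the baseline latency.
    """
    ordered = sorted(samples_ms)
    n = len(ordered)

    def pct(p):
        # simple nearest-rank percentile, clamped to the last sample
        return ordered[min(n - 1, int(p / 100 * n))]

    return {
        "min": ordered[0],
        "median": pct(50),
        "p95": pct(95),
        "max": ordered[-1],
    }

# A stable ~20 ms link with one 180 ms spike: the median stays low,
# while p95/max reveal the outlier worth investigating.
stats = rtt_summary([20, 21, 19, 22, 20, 180, 21, 20, 19, 22])
```

A low median with a high p95 points at intermittent congestion (often the last mile) rather than a fault in the application itself.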


What is artificial narrow intelligence (ANI)?

Narrow AI systems are good at performing a single task, or a limited range of tasks. In many cases, they even outperform humans in their specific domains. But as soon as they are presented with a situation that falls outside their problem space, they fail. They also can’t transfer their knowledge from one field to another. For instance, a bot developed by the Google-owned AI research lab DeepMind can play the popular real-time strategy game StarCraft 2 at championship level. But the same AI will not be able to play another RTS game such as Warcraft or Command & Conquer. While narrow AI fails at tasks that require human-level intelligence, it has proven its usefulness and found its way into many applications. Your Google Search queries are answered by narrow AI algorithms. A narrow AI system makes your video recommendations on YouTube and Netflix, and curates your Discover Weekly playlist on Spotify. Alexa and Siri, which have become a staple of many people’s lives, are powered by narrow AI. In fact, in most cases when you hear about a company that “uses AI to solve problem X” or read about AI in the news, it’s about artificial narrow intelligence.


Identity as the New Perimeter


“The question becomes, what happens after the employee connects to your network? Do you have a way to trace the access that that employee is obtaining? Do you have a way to validate if those are legitimate access requests or if something malicious is taking off? “What we see today is that many organizations rely only on perimeter security. What Silverfort does is enable you to extend your multi-factor authentication beyond the perimeter to any access, whether it’s on-premises or in the cloud, no matter the application, whether it is a homegrown application or an IoT device.” So, why are too many sensitive systems still not using MFA? Traditional MFA solutions are difficult to deploy. They require software agents or proxies. They often require a custom integration with legacy systems. Our work environments and IT infrastructures have evolved. Our world is changing at breakneck speed. New ways of looking at security are needed.
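It is worth noting that the MFA factor itself is simple at the protocol level; the deployment friction described above comes from integration, not cryptography. As a minimal sketch (of standard RFC 6238 time-based one-time passwords, not of Silverfort's product), a TOTP code can be computed with nothing but the Python standard library:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, for_time=None, step=30, digits=6):
    """RFC 6238 time-based one-time password (SHA-1 variant)."""
    if for_time is None:
        for_time = int(time.time())
    counter = int(for_time) // step  # number of 30-second steps since the epoch
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F       # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: this ASCII secret at Unix time 59 yields 94287082.
code = totp(b"12345678901234567890", for_time=59, digits=8)
```

The hard part MFA vendors solve is not this computation but wiring the challenge into every legacy protocol and homegrown application, which is exactly the gap the excerpt describes.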


What Is The Hiring Process Of Data Scientists At IBM?

The technical skills that IBM looks for in data science candidates encompasses ML Ops, which includes some of the newer skills, like debiasing and machine learning model runtime management.  “In addition to that, they need to possess adequate skills in the areas of Data ops, data wrangling and domain knowledge, which is essentially a cross section between industry knowledge and applicability of machine learning in those industries,” says Chahal. Although the company does not overemphasize candidates’ educational background, they need to have a good grasp of the relevant competencies mentioned above. With several platforms abound with machine learning certifications, Chahal feels that that may be a good approach for data science aspirants to upskill themselves. “These certifications can verify their awareness about various platforms, tools, libraries and packages that are being used across enterprises today, as well as the familiarity or the ability to work with open source or enterprise/vendor-specific tools.”



Quote for the day:


"Leadership is absolutely about inspiring action, but it is also about guarding against mis-action." -- Simon Sinek


Daily Tech Digest - Apr 09, 2020

Let’s make testing Agile, they said. Uh, what did they mean by that?

Automated software testing is a fundamental part of Agile software development, even though it is not included in the manifesto. Automated testing helps in many ways, says Okken. But in general, a robust test suite helps ensure working software, increases a team’s ability to refactor and extend a software system, and respects individuals by automating the generally boring task of manual regression testing. “Automated tests also speed up development, further respecting the time of software developers, and allowing faster and more frequent deliveries to end users,” Okken says. “The development of automated tests during production code development helps developers understand the problem domain, the API, the problem at hand better, and help them in turn develop better software. Why would anyone want that learning to go to a separate team and not to the development team?” In adopting DevOps, you are discarding the traditional method of development, commonly called “waterfall,” for the more iterative process of building a small amount and testing rigorously that we know as Agile.
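As a short sketch of the kind of automated regression test Okken describes (the function and file names here are invented for illustration), a pytest-style test is just a plain function whose assertions are run on every change:

```python
# discount.py -- hypothetical production code
def apply_discount(total, percent):
    """Return total reduced by percent, rounded to cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(total * (1 - percent / 100), 2)

# test_discount.py -- discovered and run automatically by `pytest`
def test_ten_percent_off():
    assert apply_discount(100.0, 10) == 90.0

def test_rejects_bad_percent():
    try:
        apply_discount(50.0, 150)
    except ValueError:
        pass  # expected: out-of-range discount is refused
    else:
        raise AssertionError("expected ValueError")
```

Because the whole suite reruns in seconds, developers can refactor freely, which is the "respects individuals" and "faster deliveries" benefit the excerpt points to.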


Project Orleans and the distributed database future with Dr. Philip Bernstein

The set of mechanisms that we use to solve database problems, they don’t change very fast. Back in the early days, we were learning about certain base technologies for the first time, but now, there’s this repertoire of ingredients that you put into solving a database problem. I’m very sympathetic to graduate students who are trying to learn this stuff because, you know, I learned it slowly over a period of many years as it was unfolding, but people getting into the field, they learn it in a very compressed amount of time and they don’t necessarily have a deep understanding of why things are the way they are and so when they encounter a problem, they’re trying to solve it just based on an understanding of the problem and then trip over some approach that they think, oh, I’ll bet that would be helpful, but then they don’t realize this is actually a variation on something that has been applied in several other contexts before.


New botnet attack "puts other IoT botnets to shame"

A destructive new botnet that compromises vulnerable Internet of Things (IoT) devices and hijacks their resources to carry out devastating Distributed Denial of Service (DDoS) attacks is being reported by security research firm Bitdefender. The IoT botnet, which the company named "dark_nexus," has recently been found in the wild and is taking innovative and dangerous new approaches to successfully attacking IT infrastructure. "Our analysis has determined that, although dark_nexus reuses some Qbot and Mirai code, its core modules are mostly original," Bitdefender said in a 22-page white paper released April 8 about the attacks, "New dark_nexus IoT Botnet Puts Others to Shame." While some of its features may be shared with previously known IoT botnets, the way some of its modules have been developed makes dark_nexus significantly more potent and robust, the report said. ... "The victims won't even be aware that their devices are used as weapons against innocuous targets on the internet, even if the results might be catastrophic for victims or for the proper functioning of the internet," Botezatu said.


How Will The Cloud Impact Data Warehousing Technologies?


As data volumes continued to grow at rapid speeds, traditional relational databases and data warehouses were unable to handle the onslaught of this data. To circumvent this issue and enable more efficient big data analytics systems, engineers from companies like Yahoo created Hadoop in 2006 as an Apache open source project, with a distributed processing framework that made it possible to run big data applications even on clustered platforms. Given the volume of data generated in modern times and the advanced infrastructure required to handle it, decision support databases are facing considerable pressure to evolve, both technologically and architecturally. Alongside several new data warehousing architecture approaches, numerous technologies have also emerged as key contributors to modern business intelligence solutions, ranging from cloud services to data virtualization to automation and machine learning, among others. Cloud-based solutions are the future of the data warehousing market. With numerous enterprises turning to the cloud to power and store their data warehousing solutions, internet companies like Amazon and Google are working tirelessly to develop and host innovative cloud-based data warehouses.
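The distributed processing model Hadoop popularized, MapReduce, can be sketched in miniature (a single-process toy for illustration, not Hadoop itself): map each record to key-value pairs, then reduce all values that share a key.

```python
from collections import defaultdict

def map_phase(record):
    # Emit (word, 1) for each word; Hadoop runs this on each node
    # against its local slice of the data.
    for word in record.lower().split():
        yield word, 1

def reduce_phase(pairs):
    # Group by key and sum; Hadoop shuffles pairs across the
    # cluster first so each key lands on a single reducer.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

records = ["big data big clusters", "data warehouses"]
counts = reduce_phase(pair for r in records for pair in map_phase(r))
```

The same split-then-merge pattern is what lets a cluster of commodity machines process data volumes no single relational server could handle.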


‘Unbreakable’ Smart Lock Draws FTC Ire for Deceptive Security Claims

“This vulnerability allowed the researchers to sniff data packets for the information necessary to authenticate their access to the lock,” the FTC explained. “With that information, researchers were able to continue accessing the lock even after their access had been revoked.” Adding insult to injury, the complaint also noted that it’s possible to unlock the smart locks by simply unscrewing the back panel. In June 2018, YouTuber JerryRigEverything posted a video demonstrating how the lock could come apart using a screwdriver to loosen and pop off the back of the lock, and then open the shackle. The upshot of all of this, according to the FTC, is that Tapplock “did not take reasonable measures to secure its locks, or take reasonable precautions or follow industry best practices for protecting consumers’ personal information,” despite advertising that it did. “[Tapplock] advertised its smart locks to consumers as ‘Bold. Sturdy. Secure.,'” according to the complaint. “[Its] advertisements touted that its ‘secure’ smart locks were also…designed to be ‘unbreakable.'” The complaint added, “in fact, [Tapplock] did not have a security program prior to the discovery of the vulnerabilities.”


Keeping Vigilant for BEC Amid COVID-19 Chaos

In fact, FBI IC3 recently noted in its 2019 Internet Crime Report that BEC scams accounted for 40% of the losses from cybercrime last year. That number is likely to spike even further as criminals see BEC during the pandemic as low-hanging fruit. The rapid distribution of employees to makeshift work-from-home situations, the use of unfamiliar devices, and the distractions and anxiety created by illness and business disruption have all combined to create an ideal BEC hunting ground for the bad guys. "Employees working from home are likely to be even more distracted than usual, with children, household chores, and coronavirus anxieties all competing for their attention," explains Seth Blank, vice president of standards and new technologies at Valimail. "That will make them even less attentive to the subtle clues that an email is a phishing attack. And, when working from home, they're also more likely to be using a small screen or even their cellphones to manage email, which can make some of these phish attempts — which used bogus sender identities — nearly impossible to detect."


APT groups
The APT groups examined in this report are likely composed of civilian contractors working in the interest of the Chinese government who readily share tools, techniques, infrastructure, and targeting information with one another and their government counterparts. The APT groups have traditionally pursued different objectives and focused on a wide array of targets; however, it was observed that there is a significant degree of coordination between these groups, particularly where targeting of Linux platforms is concerned. The research identifies two new examples of Android malware, continuing a trend seen in a previous report which examined how APT groups have been leveraging mobile malware in combination with traditional desktop malware in ongoing cross-platform surveillance and espionage campaigns. One of the Android malware samples very closely resembles the code in a commercially available penetration testing tool, yet the malware is shown to have been created nearly two years before the commercial tool was first made available for purchase.


Wanted urgently: People who know a half century-old computer language so states can process unemployment claims

On top of ventilators, face masks and health care workers, you can now add COBOL programmers to the list of what several states urgently need as they battle the coronavirus pandemic. In New Jersey, Gov. Phil Murphy has put out a call for volunteers who know how to code in the decades-old programming language called COBOL, because many of the state's systems still run on older mainframes. In Kansas, Gov. Laura Kelly said the state's Department of Labor was in the process of migrating off COBOL, but then the virus interfered. "So they're operating on really old stuff," she said. Connecticut has also admitted that it's struggling to process the large volume of unemployment claims with its "40-year-old system comprised of a COBOL mainframe and four other separate systems." The state is working to develop a new benefits system with Maine, Rhode Island, Mississippi and Oklahoma, but the system won't be finished before next year. "Literally, we have systems that are 40-plus-years-old," New Jersey Gov. Murphy said over the weekend.


virtual data center servers
“VMware’s goal is to make NSX invaluable to the VMware installed base as those customers modernize their on-premises data-center network infrastructure and similarly seek to provide consistent network and security policies for modern applications running in public clouds," Casemore said. "As the data center becomes distributed in a multicloud world, the data-center network must become a multicloud data-center network. On the VeloCloud [VMware’s SD-WAN offering] side, the focus is on modernizing the WAN to accommodate delivery of these applications to the branch.” One new feature of NSX is the ability to control and synchronize multiple virtual networks as a single entity. Called NSX Federation, the feature lets customers set network configuration, management and policy across large environments. NSX Federation would let customers create “fault tolerant zones” where they could contain network problems within a single zone, minimizing problems and preventing them from spreading, VMware stated.


Hearing test showing ear of young woman with sound waves simulation technology
The hard of hearing community has been contributing to the success of business globally in all kinds of industries. They’ve navigated the challenges of building connections even when dealing with the issues of fast-paced conversations and multiple speakers in meeting settings. They’ve adapted by learning to read lips, pick up on speech patterns and build support networks with peers to help them keep pace with their fully hearing counterparts. Some of us may feel like this really has nothing to do with our own work experience. But based on the following items of note from Disabled World and the World Health Organization, you are bound to know, work with or even become someone who is hard of hearing. Approximately 432 million adults worldwide have a disabling hearing loss. It is estimated that by 2050, more than 900 million people will have a disabling hearing loss. There is a progressive loss of the ability to hear high frequencies with increasing age, known as presbycusis.



Quote for the day:


"Great leaders go forward without stopping, remain firm without tiring and remain enthusiastic while growing" -- Reed Markham