Daily Tech Digest - August 19, 2019

Center for Data Innovation: U.S. leads AI race


“AI is the next wave of innovation, and overlooking this opportunity will pose a threat to a region’s economic and national security,” said Center for Data Innovation director Daniel Castro in a statement. “The EU has a strong talent pool and an active research community, but unless it supercharges its current AI initiatives, it will not keep pace with China and the United States.” The center chose to focus on six categories: talent, research, development, adoption, data, and hardware. Based on a 100-point scale, researchers found that the U.S. led overall with 44.2 points, China was second at 32.3, and the European Union placed third with 23.5. The study found that the U.S. shows clear leadership in four of the six categories: talent, research, development, and hardware. China leads in adoption and data. The findings would appear worrisome for the EU, which has placed great emphasis on its AI efforts in recent years. But the region does place second in four categories: talent, research, development, and adoption. Of those, it is particularly strong in the research category.



Modern Technology, Modern Mistakes

"It is common for attackers to find common utilities such as FTP clients or video conversion software, package or wrap malicious code into the installer, and then upload their packed installer to a free software download site, knowing that users may find their malicious version of the software installer before they find the legitimate original," Murphy says. Additionally, employees are increasingly feeling entitled to work from anywhere and have access to anything at any time, according to Nick Bennett, director of professional services at FireEye Mandiant. "Employees [also] feel entitled to use work assets for non-work activities, and they are bypassing protections that are in place, making themselves more susceptible to phishing attacks," Bennett says. The issue is twofold. Employees are using corporate-issued workstations for personal use, even if they are at home. When they bring that workstation back to the enterprise, they are also putting the business at risk, Bennett explains. In addition, "employees are also using non-corporate assets to access the corporate network on a device that is unmanaged by enterprise," Bennett says.


The Danger of Over-Valuing Machine Learning

While machine learning is a powerful tool, it is not a magical box. Bias in the data or the model can train machine learning systems in inappropriate ways. Strange butterflies may be lurking within, bringing up artifacts that may be due to factors that have nothing to do with the data. The lack of transparency in the process can make it hard to determine what exactly a model returns, because emergent behaviors may be lurking in the background that are often difficult to ferret out. Finally, the deeper the learning, the more energy is required to maintain multiple levels of abstraction, and that energy can often be significant enough to make using such systems uneconomical. This is not to say that such tools are useless - most of the evolution of machine learning systems in the last decade has proven highly useful and effective for a wide variety of applications, and forms an integral part of the artificial intelligence toolkit. The danger comes in thinking that such systems are truly intelligent, rather than simply the clever application of high-speed, and occasionally non-linear, solutions.


How Can The Insurance Industry Face Its Challenges?

Technology – as in all industries – has an important role to play in the transformation of insurance. New opportunities are being created by the likes of the IoT, telematics, advanced data analytics, and technologies which support consumption-based offerings and insurance for the sharing economy. These technological advances are particularly welcome in a sector which has faced increasing regulation. As Paton notes, many modern regulatory challenges began with the EU’s Solvency II Directive in 2009, and other decrees have followed. “The big impact of Solvency II from an innovation perspective was that it diverted the insurer’s ability to invest in new areas,” he says. “It was followed by the Insurance Distribution Directive and a regulatory scrutiny on conduct in areas such as personal lines. So now with things like point of renewal, it’s necessary to make people aware of how much their insurance premium has gone up.” “There will be more initiatives of this type,” Paton adds, “right through to changes in expectations around operational resilience, cybersecurity and third party supply management.”


Humans are the weak link: Security awareness & education still a challenge for UK companies

Although moving away from blame culture and the idea that people are the problem should be a goal of today’s security teams, most organizations still see employees as a chink in company defenses. A massive 98 percent of respondents agreed with the statement that ‘the human employee is the weakest link when it comes to cybersecurity’, and over two-thirds agreed strongly. Social engineering, phishing, and business email compromise – all attacks which rely on people falling prey to manipulation and trickery – were listed amongst the top threats organizations are most concerned about. This dim view of the role people play in security likely feeds into why only 13 percent of organizations would rate internal cybersecurity awareness as very good. Also, 40 percent of respondents said awareness was merely adequate, suggesting there is still much work to be done around improving education, raising awareness and reducing people-based risks as a result. While organizations may still view humans as the problem, security teams also recognize that people-based problems require people-based solutions: 85 percent of the companies surveyed stated they were utilizing awareness training to reduce human error.


How to Use Data to Improve Your Sprint Retrospectives

Most agile teams do sprint retrospectives at least once a month, to iterate and improve on their software development process and workflow. However, a lot of those same teams rely only on their feelings to “know” if they have actually improved. But you need an unbiased reference system if you want to compare how two sprints went. ... Commit frequency and active days serve the same purposes. An active day is a day in which an engineer contributed code to the project, which includes specific tasks such as writing and reviewing code. Those two alternative metrics are interesting if you want to introduce a best practice to commit every day. It’s also a great way to see the hidden costs of interruptions. Non-coding tasks such as planning, meetings, and chasing down specs are inevitable. Teams often lose at least one day each week to these activities. Monitoring the commit frequency enables you to see which meetings have an impact on your team’s ability to push code. It’s important to keep in mind that pushing code is actually the primary way your team provides value to your company.
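The two metrics described above can be sketched in a few lines, assuming a commit log of author/date pairs pulled from `git log` (the names, dates, and ten-day sprint length are illustrative):

```python
from datetime import date

# Hypothetical commit log: (author, date) pairs extracted from `git log`.
commits = [
    ("alice", date(2019, 8, 12)),
    ("alice", date(2019, 8, 12)),
    ("alice", date(2019, 8, 14)),
    ("bob",   date(2019, 8, 13)),
]

def active_days(log, author):
    """Number of distinct days on which the author contributed code."""
    return len({d for a, d in log if a == author})

def commit_frequency(log, author, sprint_days=10):
    """Average commits per working day over the sprint."""
    return sum(1 for a, d in log if a == author) / sprint_days

print(active_days(commits, "alice"))       # 2 distinct active days
print(commit_frequency(commits, "alice"))  # 0.3 commits per day
```

Tracked sprint over sprint, a dip in either number can be matched against the calendar to see which meetings or interruptions cut into coding time.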


Blockchain: a friend to digital continuity and lightweight workflow tool

While even the experts can’t predict when widespread adoption of the technology will take place, they can suggest what it will take before widespread adoption is possible. The most important lesson is understanding: understanding that blockchain technology should work as part of an ecosystem of technologies. No matter what industry you’re in, it starts with customer experience. And the CX depends on a choreography of technology. In terms of display, there has to be a web and touch experience, or an AI component with chatbots, for example. Similarly, with blockchain, it’s not just one thing that’s going to impact the use case. “One obvious pitfall of blockchain is that people look at it as the only solution for realising anything and everything. But this is wrong. You have to really understand the end-to-end experience and see where blockchain technology fits in,” said Jitendra Thethi, AVP, Altran.


The Digital Leader's Guide to Lean and Agile

This focus on needs and outcomes turns out to be a great way to integrate the key frameworks and models of the Lean-Agile landscape. This is a worthwhile goal; it’s worth noting that Agile is often introduced not right-to-left, but left-to-right – not needs-first, but backlog-first or solution-first. When the focus is on ploughing through backlogs of requirements, the likely result is mediocrity (or worse), hardly a great advertisement for Agile. And the dissonance of imposing solutions on teams rather than seeking to meet their and the wider organisation’s needs is potentially fatal to Agile adoptions. Right to Left should not be understood as an attack on branded process frameworks; neither does it elevate any one framework over the others. However, as well as calling into question how they are often rolled out, I do voice the regret that they are so often described in left-to-right terms, leading me to wonder how Agile is then supposed to be understood as a departure from 20th century ways of thinking and working. I demonstrate that Scrum and even SAFe are readily described in right-to-left terms – "iterated self organisation around goals" is the five-word summary.


Privacy beyond HIPAA in voice technology


“When it comes to healthcare and voice design, we have several challenges we face every day,” Freddie Feldman, voice design director at Wolters Kluwer Health, said at The Voice of Healthcare Summit at Harvard Medical School last week. “HIPAA is a big topic on everyone’s mind nowadays, and it is one we take seriously. The first thing most people think about when they hear HIPAA is securing server platforms, but there is more to it. We have to consider things like the unintended audience for a call.” He said that due to the nature of voice, even leaks not expressly prohibited by HIPAA can be inappropriate. For example, if the voice technology is intended for home use and gives a message from the radiology department to the house, then it’s giving away too much information, he said. Much of it comes down to appropriate use. For example, putting the speakers into a hospital room setting poses a different set of challenges. “I think as far as smart speakers and virtual assistants [go], Amazon right now only has HIPAA-eligible environments, so basically turning on and off HIPAA for specific skills, enabling HIPAA for a particular voice app or voice skill.”


Artificial Intelligence Needs a Strong Data Foundation

The largest and most basic need in the data science hierarchy is the need for data collection. While every bank and credit union collects data daily on transactions, product use, customer demographics, and even external insights from social media and other sources, an organization needs to determine what specific insight may be needed to get a complete picture. Are you collecting insight on channel use, geolocational data and consumer beliefs and behaviors? While you can build a plan for future collection, the success of any machine learning or AI initiative hinges on the scope and quality of data collected. As important as collecting the right data is, Rogati stresses that it is equally important to have an ongoing flow of real-time data that is easy to access, store and analyze. This can be a major challenge for financial services organizations that are notorious for having data silos. Beyond internal data flows, it is important that any external or unstructured data can also be collected, stored and analyzed. While once a major problem, cloud technology has simplified some of the storage challenges.



Quote for the day:


"To be a good leader, you don't have to know what you're doing; you just have to act like you know what you're doing." -- Jordan Carl Curtis


Daily Tech Digest - August 18, 2019

Realities and myths for 5G’s impact on logistics

The disruptive potential of 5G in logistics is all about the Internet of Things. We already see 4G and Wi-Fi networks as the ‘connective tissue’ between every device we connect to the internet, including computers, phones, wearables, home appliances and major business infrastructure. Every business relies on data to function, and logistics companies handle even more data than most. The sheer confluence of various employee functions, delivery vehicles, material handling equipment and facility control systems has always required lightning-fast connections with low latency and high uptime. 5G can deliver on that promise once it’s up and running. Individual devices will be able to achieve their own internet connections, provided they bring their own power or have access to it. Because of the far lower latency than 4G — up to 10 times lower — companies will be able to distribute and exchange far larger quantities of data than ever.



Cloud security is too important to leave to cloud providers

The need to take control of security and not turn ultimate responsibility over to cloud providers is taking hold among many enterprises, an industry survey suggests. The Cloud Security Alliance, which released its survey of 241 industry experts, identified an "Egregious 11" list of cloud security issues. The survey's authors point out that many of this year's most pressing issues put the onus of security on end user companies, versus relying on service providers. "We noticed a drop in ranking of traditional cloud security issues under the responsibility of cloud service providers. Concerns such as denial of service, shared technology vulnerabilities, and CSP data loss and system vulnerabilities -- which all featured in the previous 'Treacherous 12' -- were now rated so low they have been excluded in this report. These omissions suggest that traditional security issues under the responsibility of the CSP seem to be less of a concern. Instead, we're seeing more of a need to address security issues that are situated higher up the technology stack that are the result of senior management decisions."



Asbeck estimates that it will take only a few years before labor industries adopt their use, and it won’t be long before passive suits become affordable and commonplace among the able-bodied. Exoskeletons could give the elderly years more of an active lifestyle, like hiking, by providing greater endurance. ... “A lot of us wear a Bluetooth device in our ear now, and that was tech you saw the military wearing in a movie fifteen years ago,” Haas said. “It looks crazy and futuristic, but now we all see that type of technology.” If wearable robotics make you think less of Iron Man and more of Wall-E — where futuristic humans rely on robotic lounge chairs for mobility, rendering them too bloated to walk — David Perry, an engineer at Harvard Biodesign Lab, isn’t concerned. He says these suits will maintain the health of people who do incredible physical feats. His leg exoskeleton, which soldiers tested, won't alleviate all the stress on the wearer’s muscles. But it does make walking easier by about 15%, even while carrying a heavy load.



Managing compliance costs with quality data

Quality data should allow compliance teams to screen against a range of datasets, including sanctions, politically exposed persons (PEPs), law enforcement lists, regulatory lists, and adverse media records geared towards sanctions compliance, AML, countering the financing of terrorism, and countering proliferation finance. Data should also be comprehensive, de-duplicated, consistent, accurate, configurable, and up-to-date. Given this extensive list of requirements, when choosing a data provider, it is important to choose one that has global capabilities and is able to deliver reliable and trusted data. The dataset being considered should have a highly analytical set of inclusion criteria and be de-duplicated. A successful provider should allow extensive ‘slicing and dicing’ and configuration and, perhaps most importantly of all, must take the control of operational costs seriously. It is worth mentioning that overscreening – in other words screening beyond regulatory requirements – can significantly contribute to excessive operational costs.
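The de-duplication requirement above can be illustrated with a toy normalise-and-key pass. The record layout, field names, and normalisation rules here are illustrative assumptions; real screening datasets use far more sophisticated fuzzy matching:

```python
import unicodedata

# Hypothetical watchlist records with near-duplicate names.
records = [
    {"name": "José García",  "list": "PEP"},
    {"name": "JOSE  GARCIA", "list": "PEP"},
    {"name": "Ann Lee",      "list": "Sanctions"},
]

def normalise(name):
    # Strip accents, collapse whitespace, lowercase - a crude canonical form.
    ascii_name = unicodedata.normalize("NFKD", name).encode("ascii", "ignore").decode()
    return " ".join(ascii_name.lower().split())

# Keep one record per (canonical name, source list) pair.
deduped = {(normalise(r["name"]), r["list"]): r for r in records}
print(len(deduped))  # 2 - the two "José García" variants collapse to one entry
```

Even this crude pass shows why de-duplication matters operationally: every duplicate kept in the dataset is a duplicate alert an analyst has to clear.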


Can private online communities drive corporate cultural change?

By creating a dedicated interactive workspace for your customers, partners and company staff, you create a collaborative space built on trust. Potential customers look for – and many times rely on – current customer interaction for recommendations and support. Through customer engagement you can educate your customers on new product enhancements and product rollouts. Public online communities are typically full of distracting noise, or communication is one-way, isolated between the customer and the brand through reviews, FAQs or chatbots. Although subcultures formed by joining groups can limit noise, building brand loyalty amid the distracting noise of public online communities is still not as safe or effective, and the data can’t be gathered to increase a company’s value and effectiveness or enhance product development. In a dedicated, private online community, interactive real-time networking and tribal problem-solving help to create long-term partnerships and friendships.


Major breach found in biometrics system used by banks, UK police and defence firms

The researchers said the sheer scale of the breach was alarming because the service is in 1.5m locations across the world and because, unlike a leaked password, a leaked fingerprint can never be changed. “Instead of saving a hash of the fingerprint (that can’t be reverse-engineered) they are saving people’s actual fingerprints that can be copied for malicious purposes,” the researchers said in the paper. The researchers made multiple attempts to contact Suprema before taking the paper to the Guardian late last week. Early Wednesday morning (Australian time) the vulnerability was closed, but they still have not heard back from the security firm. Suprema’s head of marketing, Andy Ahn, told the Guardian the company had taken an “in-depth evaluation” of the information provided by vpnmentor and would inform customers if there was a threat. “If there has been any definite threat on our products and/or services, we will take immediate actions and make appropriate announcements to protect our customers’ valuable businesses and assets,” Ahn said.
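The researchers' point that a stored hash "can't be reverse-engineered" can be sketched with a salted one-way hash. This is a simplification of their argument, not a biometric implementation: fingerprint scans never match byte-for-byte, so real systems need dedicated template-protection schemes rather than plain hashing:

```python
import hashlib
import os

def store(secret: bytes):
    """Store a random salt and a one-way digest instead of the secret itself."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret, salt, 100_000)
    return salt, digest

def verify(secret: bytes, salt: bytes, digest: bytes) -> bool:
    """Re-derive the digest; matching proves knowledge without storing the secret."""
    return hashlib.pbkdf2_hmac("sha256", secret, salt, 100_000) == digest

salt, digest = store(b"template-bytes")   # placeholder for sensitive data
print(verify(b"template-bytes", salt, digest))  # True
print(verify(b"wrong-bytes", salt, digest))     # False
```

A database dump of `salt` and `digest` pairs reveals nothing usable, which is exactly the property the leaked system lacked by storing raw fingerprints.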


Here’s How Artificial Intelligence Is Fueling Climate Change

At the moment, data centers—the enormous rooms full of stacks and stacks of servers that juggle dank memes, fire tweets, your vitally important Google docs and all the other data that is stored somewhere other than on your phone and in your home computer—use about 2% of the world’s electricity. ... According to The MIT Technology Review, Dickerson recently told a conference audience in San Francisco that—unless super-efficient semiconductors are innovated in the next five years—data centers handling AI demands could account for 10% of the world’s electricity use by 2025, a hundred-fold increase in a half-decade. Dickerson’s forecast is a worst-case scenario. Other tech execs have given estimates that vary wildly. Some think data centers, period, will suck 10% of the global electricity load. Yet others think that usage will remain relatively flat, in part because of large companies’ abilities to handle vast amounts of data in more efficient ways. Google, for example, is using AI technology to cool its data centers, reducing demand for power by 40%.


On Stocks And Machine Learning

Undoubtedly, the “cognitive biases” described by Kahneman and Tversky act on and affect the decisions of even the most experienced and famed stock analysts and portfolio managers in the world. Specifically, the “Confirmation Bias” may lead analysts to purchase stocks that are well-known, popular and “juicy”. Analysts are usually “swamped” with information and data on the companies they follow, which might raise their confidence level in their analysis of these companies’ stocks. The “Anchoring Bias” will make it difficult for the analyst to sell a stock that he purchased even if he discovers that he had erred in his original analysis of this stock’s performance. The “Representativeness Bias” may also lead analysts to wrong investments. The problem related to this bias stems from the tendency of the analyst, when investigating the history and profile of the company, to assume that these parameters will repeat themselves in the future. This assumption ignores the “reversion to the mean” phenomenon, which is typical of financial markets and the economy in general.
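Reversion to the mean is easy to demonstrate with a small simulation. All numbers here are illustrative: every "company" has the same true expected return, and each period's observed return just adds noise. Selecting last period's winners and re-measuring the same firms shows their apparent outperformance melting away:

```python
import random

random.seed(42)

# 1000 firms, identical true mean return; observed returns are true mean + noise.
true_mean = 0.05
period1 = [true_mean + random.gauss(0, 0.10) for _ in range(1000)]
period2 = [true_mean + random.gauss(0, 0.10) for _ in range(1000)]

# Pick the top 100 performers of period 1, then look at the SAME firms in period 2.
winners = sorted(range(1000), key=lambda i: period1[i], reverse=True)[:100]
avg_p1 = sum(period1[i] for i in winners) / 100
avg_p2 = sum(period2[i] for i in winners) / 100

print(f"winners' period-1 return: {avg_p1:.3f}")  # well above the true mean
print(f"same firms, period 2:     {avg_p2:.3f}")  # back near the true mean
```

The winners looked exceptional only because luck is part of any single period's result; extrapolating a company's past profile forward makes exactly this mistake.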


Data management roles: Data architect vs. data engineer, others


How do these data management roles compare? Data architects design and help implement database systems and other repositories for corporate data, Bowers said. They're also responsible for ensuring that organizations comply with internal and external regulations on data, and for evaluating and recommending new technologies, he added. Bowers described a data architect as a "know-it-all" who has to be familiar with different databases and other data management tools, as well as use cases, technology costs and limitations, and industry trends. "I had to master a ton of technologies to become a data architect," he said. A data modeler identifies business rules and entities in data sets and designs data models for databases and other systems to help reduce data redundancy and improve data integration, according to Bowers. Data modelers make less money on average than many other IT workers, but you get what you pay for, he cautioned.


Data Management No Longer an IT Issue

The next-generation data management platform needs to treat data differently. It needs to see data as a liquid core asset – not a static one – that can be quickly ingested, stored in the most appropriate data formats and locations, and easily accessed by any analytical processing engine. The data architecture should be flexible, scalable, high-performance, integrated, and secure. But this does not mean you need to create an entirely new enterprise data platform, according to Han. "The core components are still the same – applications, middleware, database, analytics, and systems. However, when we build the new data architecture on top of the existing framework, we must be aware that there are new access points like mobile and IoT for collecting data today, which did not exist 15 years ago. There is also a huge abundance of data that comes in a variety of formats today. So, the question is, how can we integrate them all?" Oracle’s Big Data SQL, an end-to-end big data and AI platform, looks at all data in unison and integrates them to maximize its value.



Quote for the day:


"Leadership is, among other things, the ability to inflict pain and get away with it - short-term pain for long-term gain." -- George Will


Daily Tech Digest - August 17, 2019

Security warning for software developers: You are now prime targets for phishing attacks


According to the Glasswall report, software developer is the role most targeted by hackers going after the technology sector. A key reason for this is that devs do the groundwork on building software and will often have administrator privileges across various systems. That's something attackers can exploit to move laterally around networks and gain access to their end goal. "As an attacker, if you can land on an administrator machine, they have privileged access and that's what the attackers are after. Software developers do have that privileged access to IP and that makes them interesting," Lewis Henderson, VP at Glasswall, told ZDNet. With software developers being technically savvy people, some might argue that they shouldn't easily fall victim to phishing campaigns. But attackers can use specially-crafted messages to target one individual in the organisation they want to gain access to. With software developers often staying in jobs for relatively short periods of time, it's common for those in the profession to build a profile on professional social networks such as LinkedIn. Attackers can exploit that to find out the specific skills and interests of their would-be victim and tailor a spear-phishing email towards them.



Deploying Natural Language Processing for Product Reviews

We have data all around us, and it comes in two forms: tabular and text. With good statistical tools, tabular data has a lot to convey. But it is really hard to get something out of text, especially natural spoken language. So what is natural language? We humans have a very complex language, and natural language is its true form - spoken or written candidly, often without regard to grammatical rules. One of the best examples of where you can find this language is in reviews. You write a review mainly for two reasons: either you are very happy with the product or very disappointed with it. With your reviews and a machine learning algorithm, entities like Amazon can figure out whether a product they are selling is good or bad. Depending on the results of the analysis of the reviews, they can make further decisions to improve that product.
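As a toy stand-in for the machine learning algorithms the article alludes to, a minimal lexicon-based sentiment scorer shows the idea. The word lists here are illustrative assumptions; a real system would learn its signal from thousands of labelled reviews rather than a hand-picked vocabulary:

```python
# Tiny hand-picked sentiment lexicons (illustrative, not exhaustive).
POSITIVE = {"love", "great", "happy", "perfect", "excellent", "good"}
NEGATIVE = {"terrible", "broke", "disappointed", "bad", "awful", "poor"}

def review_sentiment(text):
    """Score a review by counting positive vs negative words."""
    words = text.lower().split()
    score = sum(w.strip(".,!?") in POSITIVE for w in words) \
          - sum(w.strip(".,!?") in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(review_sentiment("Absolutely love this product, works great!"))  # positive
print(review_sentiment("Terrible quality, broke after two days"))      # negative
```

The appeal of learned models over a fixed lexicon is exactly the gap this toy exposes: sarcasm, negation ("not good"), and misspellings all defeat simple word counting.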


Scrum is not magic and will not solve this problem. If you do not have enough skills to do the work or do a great job in the work, then it will not magically create those skills. What it will do is make that problem very evident in the Increment (the stuff that is delivered), the Sprint Review, the Retrospective, Sprint Planning and the Daily Scrum - in other words, in all of the Scrum events. Scrum might not be magic, but it does make problems very evident, encouraging the team to solve them. Skills are one set of challenges that teams face, and Scrum will make them, or the lack of them, very apparent to everyone. This will, however, mean choices need to be made by the team and the management of the environment the team works within. There is no blaming the system with Scrum. Many teams doing Scrum describe the sensation of being on a Scrum Team as like being in a startup. It is rare that a startup has all the right skills to deliver the best product, but they have enough to do something and will beg, borrow and steal the knowledge and experience to fill in the gaps.



Fintech - Regtech - How About Sales?

The good news is that compelling events such as a growing demand for regulatory compliance and digitalization are triggering and driving many new procurement initiatives within the financial institutions. The bad news is that purchasers, influencers and decision makers get overloaded with requests for meetings and presentations by numerous candidate suppliers. The apparent conflict between the interests of young technology companies and the overloaded and stressed end-user prospects and clients resulted in the emergence of a new type of business: the technology brokerage – in other words, companies providing shared expert sales and account management services on an international scale. With this new model, working with the rare species of expert financial technology sales becomes affordable for the technology company. At the same time the end-users can interact with a trusted but independent account manager that interfaces with different technology providers.


The history of AR and VR: from gimmick to business problem solver

The history of AR and VR goes back further than anyone would expect. When Charles Wheatstone invented the stereoscope in 1838, he didn’t know it, but his 3D image creation would spark the augmented reality and virtual reality boom that is predicted to infiltrate business and society in the next 10-15 years. While the first VR head-mounted display (HMD) was created in 1968 by computer scientist Ivan Sutherland, “there was no name for AR when we started in 2011,” says Beck Besecker, CEO, Marxent. “We called it hologram technology at the time.” ... Both technologies were viewed as quite gimmicky add-ons, until opportunities emerged to apply them to tangible use cases, such as in the home vertical. But what changed? Did the technologies advance enough to add value? Or did awareness around the benefits of the technologies improve? There are a bunch of reasons, and one of the main ones is getting over the hype – the stumbling block for many emerging technologies.


Get ready for the convergence of IT and OT networking and security

Traditionally, IT and OT have had very separate roles in an organization. IT is typically tasked with moving data between computers and humans, whereas OT is tasked with moving data between “things,” such as sensors, actuators, smart machines, and other devices to enhance manufacturing and industrial processes. Not only were the roles for IT and OT completely separate, but their technologies and networks were, too. That’s changing, however, as companies want to collect telemetry data from the OT side to drive analytics and business processes on the IT side. The lines between the two sides are blurring, and this has big implications for IT networking and security teams. “This convergence of IT and OT systems is absolutely on the increase, and it's especially affecting the industries that are in the business of producing things, whatever those things happen to be,” according to Jeff Hussey, CEO of Tempered Networks, which is working to help bridge the gap between the two. “There are devices on the OT side that are increasingly networked but without any security to those networks. Their operators historically relied on an air gap between the networks of devices, but those gaps no longer exist. ..."



The true value of diversity in risk management


Looking beyond gender diversity, Molyneux, Omero, Reis, A. Merzouk, and Lani Bannach, Director of Essenta and Well U Trading, advocate for diverse teams but in a multidisciplinary way. Molyneux believes that “diversity, in all forms, is incredibly important for every business or sector. When I say “all forms”, I would even include things like cultural diversity, diversity in the level of experience, and even diversity in operating styles.” “There are several studies where a diverse workforce is proven to enrich the working environment by providing different solutions to the same problem and by opening up constructive debate, ultimately resulting in a better outcome. Companies that do not diversify lose out on competitiveness and talent”, Omero explained. “If the sector doesn’t value and embrace diversity appropriately it will lose a powerful taskforce and source of knowledge and creativity”, Reis added. “The sector is always open to new ideas and innovative solutions for old and new issues. The more diverse an environment is, the more creative and revolutionary will the business solutions be.”


Testing Microservices: Overview of 12 Useful Techniques - Part 1

Choose your testing techniques with a perspective on time to market, cost, and risk. When testing monoliths with techniques like service virtualization, you do not have to test everything together. You can instead divide and conquer, and test individual modules or coherent groups of components. You create safe and isolated environments for developers to test their work. ... When working with microservices, you have more options because microservices are deployed typically in environments that use containers like Docker. In microservice architectures, your teams are likely to use a wider variety of testing techniques. Also, since microservices communicate more over the wire, you need to test the impact of network connections more thoroughly. Using tools and techniques that better fit the new architecture can allow for faster time to market, less cost, and less risk.  Many IT departments work with or maintain systems developed and deployed in a monolithic architecture.
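Service virtualization, mentioned above, can be sketched by standing up a stub that answers for an over-the-wire dependency, letting a single service be exercised in isolation. The endpoint path and JSON payload here are illustrative assumptions, not from the article:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class StubInventory(BaseHTTPRequestHandler):
    """A 'virtualized' downstream inventory service: fixed canned responses."""
    def do_GET(self):
        body = json.dumps({"sku": "A-1", "in_stock": True}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep test output quiet
        pass

# Bind to an ephemeral port and serve in the background, as a test fixture would.
server = HTTPServer(("127.0.0.1", 0), StubInventory)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The service under test would hit this URL instead of the real dependency.
url = f"http://127.0.0.1:{server.server_port}/inventory/A-1"
data = json.loads(urlopen(url).read())
print(data["in_stock"])  # True
server.shutdown()
```

Because the stub lives on a real socket, it exercises the network path the article highlights - serialization, HTTP semantics, timeouts - without requiring the real downstream service to be deployed.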


Flip the ratio: Taking IT from bottleneck to battle ready


One of the main reasons back-end systems demand so many resources is that they do not take advantage of agile ways of working that have become second nature to most software developers. Either back-end teams confuse "doing" agile with actually "being" agile (running waterfall projects using the scrum method rather than working in small teams that rapidly iterate on small chunks of code), or agile doesn't even make it to the back-end teams at all. Even application maintenance and IT infrastructure can benefit from agile principles, which is significant, since these areas often make up 40 to 60 percent of the IT organization. By introducing true agile methods—small, cross-functional teams or squads working in rapid iterations—to relevant enterprise IT work, companies can radically reduce the resources needed to support those systems while substantially improving service quality and the potential for automation. ... By better understanding business needs, teams eliminated some demand by providing self-service options. Cross-functional teams had the people needed to not only identify the root cause of incidents but correct them immediately.


IoT Devices — Why Risk Assessment is Critical to Cybersecurity

IoT Devices cybersecurity risk assessment
Managing risk of any kind, and IoT risk in particular, is never a one-and-done exercise. After first determining the risk category for new IoT devices or services, it is crucial to revisit this exercise on a regular basis. Changes to the IoT devices, the local area networks and the applications with which the devices interact create an ever-changing attack surface that requires constant monitoring to help maintain a strong forward-leaning security posture. Organizations should take a disciplined approach to risk categorization and mitigation across the entire IoT ecosystem. Tripwire can help you identify IoT risks by providing rigorous security assessments. Tripwire’s device testing approach includes identifying security risks and vulnerabilities that may exist in the physical construction of the device and its network interfaces. Our goal is to identify potential control exposures through security configuration analysis and vulnerability testing of the platform and the operating environment.



Quote for the day:


"There is no "one" way to be a perfect leader, but there are a million ways to be a good one." -- Mark W. Boyer


Daily Tech Digest - August 13, 2019

What is instant recovery? A way to quickly restore lost files and test backup systems

CSO > Microsoft Azure backups / cloud computing / binary code / data transfer
The first challenge is that the hypervisor is not really reading a VMDK image; it is reading a virtual image being presented to it by the backup product. Depending on which product you're using and which version of the backup you chose, the backup system may have to do quite a bit of work to present this virtual image. This is why most backup systems recommend limiting the number of instant-booted images at a time if performance is important. The second reason instant recovery is not typically high-performance is that the VMDK is on secondary storage. In a world where many primary systems have gone to all-flash arrays, today's backup systems still use SATA, which is much slower. The final enemy of high performance in an instant-recovery system is that many backups are stored in a deduplicated format. Presenting the deduplicated files as a full image takes quite a bit of processing power and again takes away from the performance of the system. Some deduplication systems can store the most recent copy in an un-deduplicated fashion, making them much faster for an instant-recovery setup.
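The rehydration cost described above can be made concrete with a toy deduplicating store. This is purely illustrative and not how any particular backup product works: the block size and image are invented, and real systems operate on far larger blocks and far more data.

```python
# Toy content-addressed dedup store: split an image into fixed-size
# blocks, keep one copy of each unique block, and record a "recipe"
# of block digests needed to rebuild the full image.
import hashlib

BLOCK = 4  # unrealistically small block size, for demonstration only

def dedup_store(image: bytes):
    """Store an image, keeping only one copy of each unique block."""
    chunks, recipe = {}, []
    for i in range(0, len(image), BLOCK):
        block = image[i:i + BLOCK]
        digest = hashlib.sha256(block).hexdigest()
        chunks.setdefault(digest, block)   # duplicate blocks stored once
        recipe.append(digest)
    return chunks, recipe

def rehydrate(chunks, recipe):
    """Reassemble the full image: the work an instant boot must do."""
    return b"".join(chunks[d] for d in recipe)

image = b"ABCDABCDABCDEFGH"              # "ABCD" repeats three times
chunks, recipe = dedup_store(image)
assert len(chunks) == 2                  # only 2 unique blocks kept
assert rehydrate(chunks, recipe) == image
```

Every read of the "full image" goes through `rehydrate`, which is the lookup-and-reassemble overhead that makes instant recovery from deduplicated storage slower than reading a plain copy.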



Pair Programming

Pair Programming (PP) is an extreme programming practice for producing better software, in which two people work together at one computer and work is reviewed as it is done. The driver operates the keyboard while the navigator watches, asks questions, guides, reviews, learns, and makes suggestions. Find more about PP at Wikipedia. We often hear that Pair Programming is a “waste of time”, “doesn’t really work”, “suppresses creativity”, “kills privacy”, is “stressful”, etc. These are all genuine concerns any team may have based on their circumstances and experience. ... PP helps in transferring knowledge and works great when you have new members on the team. The navigator plays a contributor role while the driver is the receiver. This approach indirectly reduces the training cost of new members. Team members with deep knowledge of the project tend to become dependencies, as they are knowledge towers. It is always a good idea to spread that knowledge to others to reduce the dependency on those people. When these heavy lifters pair with others, it helps spread the knowledge easily.


8 features all enterprise SaaS applications must have


Reliability and security are two of the most important qualities for SaaS tools. Companies that run their software on premises are able to store corporate information in their own infrastructures, which helps them keep that sensitive data secure. However, when it comes to SaaS, the software providers are responsible for keeping user data safe. Consequently, it makes sense that security and data privacy are key capabilities in enterprise SaaS applications. Providers should also include features in their enterprise SaaS offerings that solve business issues and provide the availability and efficiency that are necessary in an increasingly challenging enterprise environment. There is little doubt that companies are looking into SaaS -- usually in a multi-tenant model in which users from different organizations share the same instance of an application. SaaS is arguably the purest form of the cloud and the largest segment of the cloud market, with revenue expected to grow 22.2% to reach $73.6 billion this year, according to Gartner. In addition, SaaS is expected to reach 45% of total application software spending by 2021.


What Microsoft's upcoming 'outsourcing' licensing changes could mean

Microsoft's upcoming licensing change is going to be "massive" for customers who've been using AWS and Google Cloud dedicated hosts to run Windows Server and Windows client, says Directions on Microsoft's Miller. "Why? Those products never offered -- and still don't offer -- License Mobility through Software Assurance," he said.  Microsoft officials note that beginning October 1 "on-premises licenses purchased without Software Assurance and mobility rights cannot be deployed with dedicated hosted cloud services offered by the following public cloud providers: Microsoft, Alibaba, Amazon, and Google. They will be referred to as 'Listed Providers.'" On October 1, customers who already are running Microsoft on-premises software offerings from these listed providers will be able to continue to deploy and use Microsoft enterprise software under their existing licenses. But they won't be able to add workloads or upgrade to a new product version released after October 1 under their existing licenses.


How to implement edge computing

edge-computing.jpg
"Networking skills are important at the edge because you need highly skilled people who can make the decisions, such as whether they want to deploy one large network or a series of smaller, specialized networks," said Coufal. "These same network architects need to make decisions about which of their different networks under management should be federated with each other for information exchange and which they want to keep separate. In many cases, business security and information exchange requirements will dictate this." Coufal recommends that organizations take a measured approach when it comes to deploying computing at the edges of their enterprises. "This means pushing out portions of applications to the edges of your company, but not necessarily everything," he said. "You can always plan to scale out later." It's also important to place an emphasis on the security that will be needed at the edge, given that end user personnel, not necessarily IT, will be running and maintaining much of this edge computing. Finally, bandwidth is an issue. If you can place subsets of your data and your applications at the edge, the processing of data, as well as the data that is transmitted from point to point, will be faster.


A New Credential for Healthcare Security Leaders

The Certified Healthcare Information Security Leader - or CHISL - credential was created by the Association of Executives in Healthcare Information Security, a subgroup of the College of Healthcare Information Management Executives. "There are a number of security certification programs, but they are not tailored to the healthcare environment," Marsh says in an interview with Information Security Media Group. The new certification is "sculpted" for healthcare security leaders, he says. In its statement about the new credential, CHIME notes that it's modeled after the organization's Certified Healthcare CIO, or CHCIO, certification program, which is exclusively for healthcare CIOs. To earn the CHISL designation, a security executive will need to pass an exam that tests knowledge of seven domains: organizational vision and strategy; technology proficiency; change management; value assessment and management; service management; talent management; and management of security relationships.


7 trends impacting commercial and industrial IoT data

IoT-and-Computer-Networking.png
According to Gartner, within the next four years, 75% of enterprise-generated data will be processed at the edge (versus the cloud), up from <10% today. The move to the edge will be driven not only by the vast increase in data, but also the need for higher fidelity analysis, lower latency requirements, security issues and huge cost advantages. While the cloud is a good place to store data and train machine learning models, it cannot deliver high fidelity real-time streaming data analysis. In contrast, edge technology can analyze all raw data and deliver the highest-fidelity analytics, and increase the likelihood of detecting anomalies, enabling immediate reaction. A test of success will be the amount of “power” or compute capability that can be achieved in the smallest footprint possible. ... The CEP function should enable real-time, actionable analytics onsite at the industrial edge, with a user experience optimized for fast remediation by operational technology (OT) personnel. It also prepares the data for optimal ML/AI performance, generating the highest quality predictive insights to drive asset performance and process improvements.
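The kind of on-site anomaly detection described above can be sketched with a rolling-window detector: readings are compared against recent local statistics, so spikes are flagged at the edge without shipping raw data to the cloud. The window size, threshold, and sensor values below are illustrative assumptions, not a reference CEP implementation.

```python
# Minimal edge-style stream analysis: flag readings that deviate far
# from the mean of a recent rolling window of observations.
from collections import deque
from statistics import mean, pstdev

class EdgeAnomalyDetector:
    def __init__(self, window=20, threshold=3.0):
        self.readings = deque(maxlen=window)  # recent local history only
        self.threshold = threshold            # flag at N standard deviations

    def observe(self, value):
        """Return True if `value` is anomalous vs. the recent window."""
        anomalous = False
        if len(self.readings) >= 5:           # wait for a minimal baseline
            mu, sigma = mean(self.readings), pstdev(self.readings)
            if sigma > 0 and abs(value - mu) > self.threshold * sigma:
                anomalous = True
        self.readings.append(value)
        return anomalous

detector = EdgeAnomalyDetector()
stream = [10.0, 10.2, 9.9, 10.1, 10.0, 10.1, 9.8, 10.0, 55.0, 10.1]
flags = [detector.observe(v) for v in stream]
assert flags[8] is True        # the 55.0 spike is caught on-site
assert not any(flags[:8])      # normal readings pass through
```

Because only the flag (not the raw stream) needs to leave the device, this illustrates the latency and bandwidth advantage the excerpt attributes to edge processing.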


3gpp-network-slicing-architecture-image03.jpg
"Think of 5G and network slicing. That's a can of worms!" remarked Dr. Gerhard P. Fettweis, coordinator of Germany's 5G Lab and a professor at Technische Universität Dresden. "How are you going to handle all this from an integrity, privacy, security [standpoint], knowing that your hardware is not going to be fail-proof -- because two years from now, we're going to have four major updates of the system, because we found out somebody could've been malfunctioning the system?" It isn't that AT&T, Verizon, and the successor company to the T-Mobile and Sprint merger have some suppressed, nascent desire to go into competition against Amazon, Microsoft Azure, and Google Cloud. But they may be reselling cloud capacity to companies large and small that could certainly disrupt the cloud providers' best-laid plans. These would include many of the cloud providers' largest enterprise customers, who may be willing to spend premiums on operating their own global, fiber optic cable-linked networks as though they were their own data centers.


Psychometric tests are a key weapon in battle against cyber security breaches

Cyberchology: psychometric tests are a key weapon in battle against cyber security breaches image
Phishing attacks are less likely to be effective if they are targeted at people with a preference for Sensing. On the other hand, people with these personalities are more likely to take cyber security risks. There is a nuance here. It turns out that the cyber security risk takers are more likely to be people in this group who have a “preference for Perceiving and/or Extraversion”. As for people who have a preference for Feeling or Judging, they “are more likely to fall victim to social engineering attacks than those with a preference for Thinking”. But they also tend to be more cautious and therefore more rigorous when following cyber security policies. However, the ‘Thinking’ group can over-estimate their own competence, leading to mistakes. The ESET and The Myers-Briggs Company Cyberchology report suggests that psychometric tests can be used to build self-awareness, thereby reducing vulnerability to potential cyber security breaches.


Empathy is a Technical Skill

Archeology and anthropology can give us good metaphors for what it’s like to work with software that we didn’t write ourselves. If you’re attempting to reconstruct someone else’s viewpoint, but you don’t have direct access to them, you’ll need to rely on two critical components: artifacts and context. The same applies to software. In a legacy system, we often don’t have access to the developers who initially wrote the code. So instead, we need to look at what they’ve left behind — their artifacts. Just like how pottery, skeletons, coins, foundations of buildings, and writing can help us figure out what someone’s life was like in the distant past, we can use those principles in software, too. The question to ask as you’re going about your daily work is, "Am I leaving durable evidence of my thinking that will help someone in the future?" That might be someone else after you’ve left for another role, or it could be your future self six months from now after you’ve forgotten the details of what you were working on.



Quote for the day:


"A simple but powerful rule: always give people more than what they expect to get." -- Nelson Boswel


Daily Tech Digest - August 12, 2019

Can an AI system invent? Does the tech have the intellectual right?

Can an AI system invent? image
There is presently a consensus inherent in patent law globally that the owner of a patent is the inventor unless the rights have been assigned to another person, entity, or their employer. However, the law also requires that the inventor must be a person who has contributed in some material way to the invention’s conception. Therefore, under current law, only a human is capable of being named as inventor, and the AI system is a tool they have utilised to facilitate their innovation. The academics and inventors involved in the Artificial Inventor Project believe that this stance is outdated, and that such AI systems should be named as inventors with the owner of the machine being named as the owner of the patent. If, indeed, AI systems such as The Creativity Machine are capable of ‘inventing’ without any form of human intervention, this could lead to patents without ‘inventors’. Some innovators may be concerned that the current lack of clarity regarding the patentability of AI-based inventions could become a barrier to progress.


For Invisible Border Control, Start with Old-School Security Protocols

To minimize the risk of data breaches, the application layer is the only layer of technology within a computer that should be permitted to encrypt and decrypt sensitive information. So a second main point for implementers of border control security is that they should encrypt sensitive data within the application to ensure confidentiality. The encryption should be supplemented by secure key-management techniques using dedicated cryptographic hardware such as the Trusted Platform Module – a low-cost, high-security chip designed over a decade ago. Lack of such basic security controls led to breaches at thousands of companies over the last 15 years, including the U.S. Office of Personnel Management, Uber and Marriott. It would also be wise to add integrity controls to transactions through the use of digital signatures, given that completely new systems are being created to support invisible boundaries. Not only are such transactions independently verifiable without the use of blockchain, but subtle yet sophisticated attacks are possible when such security is not in place.
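The integrity-control idea can be sketched as follows. Python's standard library has no asymmetric signing, so an HMAC stands in here for a real digital signature; a production system would use an asymmetric scheme (for example Ed25519) with keys held in a TPM or HSM, as the excerpt suggests. The key, field names, and record contents are all invented for illustration.

```python
# Simplified integrity control on a transaction record: sign on
# creation, verify before trusting. HMAC is a stand-in for a true
# digital signature (which would use an asymmetric key pair).
import hashlib
import hmac
import json

SECRET = b"demo-key-would-live-in-secure-hardware"  # illustrative only

def sign_transaction(record: dict) -> dict:
    """Attach an integrity tag computed over a canonical encoding."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return {"record": record, "sig": tag}

def verify_transaction(envelope: dict) -> bool:
    """Recompute the tag and compare in constant time."""
    payload = json.dumps(envelope["record"], sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, envelope["sig"])

env = sign_transaction({"traveller": "X123", "crossing": "2019-08-12"})
assert verify_transaction(env)
env["record"]["crossing"] = "2019-08-13"   # tamper with the record
assert not verify_transaction(env)         # verification now fails
```

The point of the sketch is the property the excerpt names: any later modification of the transaction is detectable by an independent verifier, without needing a blockchain.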


Democratic Presidential nominees are ignoring the issue of our cybersecurity infrastructure

securityhall
What is, in effect, another sort of breach is the collection, aggregation and manipulation of our private data by digital aggregators such as Google and Facebook, data which is then further manipulated and stolen by criminals. How do we solve these problems? Blatantly dictating solutions would inevitably fail. What we can do successfully is set standards of performance and responsibility, coupled with timelines and severe penalties for failure to perform. There must be accountability – something that sometimes exists in industry (albeit at inadequate levels), but that is wholly missing in government at all levels. While I care deeply about cybersecurity, I am not naïve about the extreme pressure confronting politicians to score well in polls – a requirement to have a shot at winning their party’s presidential nomination. Arguably, cybersecurity awareness may not fit this bill. If enhanced cybersecurity is to be injected into the Democratic election agenda, the public must actively promulgate such a step. Supporting an outcry is the irrefutable fact that the signs of risk are flagrant.


Modern-Day SOCs: People, Process & Technology

Part of building a SOC also requires organizations to decide whether it will be internal, external, or hybrid. Each has its pros and cons. The upsides to an internal SOC include the assurance that comes with it being staffed by employees who are familiar with the organization's infrastructure and understand its security posture. That said, making an internal SOC successful comes at a cost. A more cost-friendly route could be contracting an external party to deliver SOC services, according to Durbin. "An external SOC has the advantage of minimal initial outlay costs and reduced running costs due to the economies of scale associated with outsourcing," he says. "However, it is also important for organizations to recognize that they retain responsibility for the SOC and therefore need to keep SOC governance in-house." Members of ISF have expressed to Durbin that a hybrid SOC offers "the best of both worlds" by addressing some of the limitations that can encumber the performance of an internal or external SOC, he says.


Ransomware attacks are getting more ambitious as crooks target shared files


Despite a rise in ransomware attacks against cloud and network services – which in some cases see attackers make off with hundreds of thousands of dollars – organizations can avoid becoming the next victim. "It is hard to stop, but it can be defeated. There are many precursor signs to a ransomware attack that can be detected and responded to, before a ransomware attack succeeds," said Morales. "Continuous monitoring for network behaviors to proactively detect and respond to attacks does give an organization an opportunity to save themselves from the loss of data," he added. Organizations can also go a long way to avoid falling victim to a ransomware attack by ensuring that systems that don't need to be facing the open internet aren't remotely accessible, and by applying security updates to prevent malware taking advantage of vulnerabilities. Businesses should also keep regularly updated offline backups of their data, so if the worst does happen, the systems can be restored without giving in to the demands of cyber criminals.


The Intel Assembly Manual

Reading this through will enable you to understand how operating systems work, how memory is allocated and addressed, and perhaps how to make your own OS-level drivers and applications. To help you understand what's happening, the GitHub project includes many aspects of the article (and I'm still adding stuff). It's a ready-to-run tool which includes a Bochs binary, VMWare and VirtualBox configurations, and a Visual Studio solution. The entire project is built in assembly using Flat Assembler. Assemblers like TASM or MASM will not work, for they only support specific architectures. Bochs is the best environment to experiment in, because it includes a hardware GUI debugger which can help you understand the internals. Debugging without Bochs is impossible, because other debuggers are either real-mode only (like MSDOS Debug) and assume you will always have some sort of control, or are able to run only in an existing environment.


Researchers find security flaws in 40 kernel drivers from 20 vendors

kernel socket driver
The common design flaw is that low-privileged applications can use legitimate driver functions to execute malicious actions in the most sensitive areas of the Windows operating system, such as the Windows kernel. "There are a number of hardware resources that are normally only accessible by privileged software such as the Windows kernel and need to be protected from malicious read/write from userspace applications," Mickey Shkatov, Principal Researcher at Eclypsium, told ZDNet in an email earlier this week. "The design flaw surfaces when signed drivers provide functionality which can be misused by userspace applications to perform arbitrary read/write of these sensitive resources without any restriction or checks from Microsoft," he added. Shkatov blames the issues he discovered on bad coding practices, which don't take security into account. "This is a common software design anti-pattern where, rather than making the driver only perform specific tasks, it's written in a flexible way to just perform arbitrary actions on behalf of userspace," he told ZDNet.
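The anti-pattern Shkatov describes is about API shape rather than any one bug, and can be caricatured in a few lines. Everything below is a hypothetical toy (the resource names and values are invented); it only contrasts an interface that performs arbitrary actions for callers with one restricted to a specific task.

```python
# Toy contrast: a privileged component that does whatever the caller
# names vs. one that exposes only the task it exists to perform.
SENSITIVE = {"msr_0x1a0": 0xDEADBEEF, "fan_rpm": 1200}  # pretend hardware

class FlexibleDriver:
    """Anti-pattern: arbitrary privileged reads on behalf of callers."""
    def read(self, resource):
        return SENSITIVE[resource]       # no restriction or checks

class TaskDriver:
    """Better: only the specific, bounded task is reachable."""
    def fan_speed(self):
        return SENSITIVE["fan_rpm"]

assert FlexibleDriver().read("msr_0x1a0") == 0xDEADBEEF  # abusable
assert TaskDriver().fan_speed() == 1200                  # narrow surface
```

A low-privileged caller of `FlexibleDriver` inherits the driver's reach over every sensitive resource, which is exactly why signed-but-flexible drivers become privilege-escalation tools.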


A billionaire software mogul doesn't want his company to grow up

While SAP may be Plattner’s primary obsession, the software mogul has used his considerable wealth (he is the fifth-richest German with a net worth of about $15 billion) to finance his educational, philanthropic and sporting ventures. Plattner built a museum in Potsdam on the outskirts of Berlin to house his art collection, and financed the Hasso Plattner Institute in the same city, a vast IT campus that churns out software engineers. Investors have criticized SAP for being too slow to rejuvenate its executive suite, and for relying too heavily on Plattner to drive innovation. (Plattner, because he’s limited in what he’s allowed to do as chairman, also advises SAP on technology issues). In response, the company can point to some recent high-profile promotions of younger talent. One is Plattner’s protege Juergen Mueller, SAP’s 37-year-old chief technology officer. Mueller, a graduate of Plattner’s HPI, has been pushing artificial intelligence at SAP.


At A Glance – Doxxing


Doxxing is one of many threats businesses face; however, it isn't always carried out with malicious intent. Doxxers can aid the police and emergency services by uncovering the identity of criminals, reveal the true personas behind abusive or harmful content, and discourage people from engaging in illegal or socially taboo online forums. In one well-known example, a Reddit user called ‘violentacrez’ fell foul of doxxing carried out by an American journalist. Worried that their true identity would be revealed, violentacrez deleted their account. It was too late. Violentacrez, the online identity used by Michael Brutsch, has been at the centre of a controversial debate over misogyny and unsavoury internet use for over 10 years. Organisations may even use doxxing for business research and analysis, but this is not generally seen as an advisable or legitimate use. Doxxing does have serious implications for business as part of an ever-growing cyber threat. Organisations should make it a priority to educate stakeholders and safeguard against such attacks.


6 Security Considerations for Wrangling IoT

The sheer increase in the volume of consumer IoT fostered by retail and tech giants has created a massive attack surface. Consumers may have dozens of IoT devices in their homes. And with all of their variations in software, suppliers, and connection points, the possibilities for things to go wrong seem endless. For instance, the simple tasks of turning on your home security system (an IoT device that communicates with a server), driving your car (your phone or car could also be an IoT device), and using a streaming camera at home seem innocuous on their own, but the data may be tracked by various parties, and combining them creates alarming possibilities for malicious activity. To better ensure safety and security, education is needed across the entire IoT ecosystem — from consumers to device manufacturers, service providers, third parties, and developers. Findings show the top sources of IoT security vulnerabilities include weak passwords, insecure web APIs, cloud and mobile interfaces, insecure third parties, network services, and data transfer, to name a few.



Quote for the day:


"Remember: Rewards come in action, not in discussion." -- Tony Robbins