Daily Tech Digest - August 21, 2019

Handing Over the (Digital) Keys: Should You Trust a Smart Lock?


Inherent security flaws that lead to hacks aren’t the only avenue third parties can use to eye your data. Sometimes, it hits a little closer to home. If you have access to the app that controls a smart lock, you can probably see when someone leaves and enters for the day, which can be beneficial in knowing your significant other made it home safely. But it could also inform someone of your whereabouts. Technically, if you don’t own the lock, the owner might be able to see your information, too. “If a lock is connected to the internet, then there is always the danger that it could be hacked,” Ray Walsh, digital privacy expert for ProPrivacy.com, said in an email to Reviews.com. “Of course, an internet-connected smart lock may be able to feed its owner additional information – such as an alert when someone unlocks it. This data certainly has its merits, but may only be so useful in the end,” Walsh said. For example, although the privacy policy has since changed, Gizmodo found in an archived link from May 8th that smart lock company Latch stated GPS information could be stored and shared with owners and any subsequent owners.


Don’t get woken up for something a computer can do for you; computers will do it better anyway. The best thing to come our way in terms of automation is all the cloud tooling and approaches we now have. Whether you love serverless or containers, both give you a scale of automation that we previously would have had to hand-roll. Kubernetes monitors the health checks of your services and restarts them on demand; it will also move your services when "compute" becomes unavailable. Serverless will retry requests and hook in seamlessly to your cloud provider’s alerting system. These platforms have come a long way, but they are still only as good as the applications we write. We need to code with an understanding of how they will be run, and how they can be automatically recovered. ... There are also techniques for dealing with situations when an outage is greater than one service, or when the scale of the outage is not yet known. One such technique is to have your platform running in more than one region, so if you see issues in one region, you can fail over to another.
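
To make the retry-and-failover idea concrete, here is a minimal Python sketch of a client that retries transient failures in one region before failing over to the next; the region URLs, retry counts and backoff values are invented for illustration, not taken from the article or any particular platform:

import time
import urllib.request

# Hypothetical region base URLs; substitute your own service endpoints.
REGION_ENDPOINTS = [
    "https://eu-west.example.com",
    "https://us-east.example.com",
]

def call_with_failover(path, retries_per_region=3, backoff_seconds=1.0):
    """Retry transient failures within a region, then fail over to the next region."""
    last_error = None
    for base in REGION_ENDPOINTS:
        for attempt in range(retries_per_region):
            try:
                with urllib.request.urlopen(base + path, timeout=5) as resp:
                    return resp.read()
            except OSError as exc:  # connection errors and timeouts
                last_error = exc
                time.sleep(backoff_seconds * (2 ** attempt))  # exponential backoff
        # All retries in this region failed; move on to the next region.
    raise RuntimeError(f"all regions failed: {last_error}")

# Usage (assuming the hypothetical endpoints existed): data = call_with_failover("/api/orders")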


In July, Reuters reported that as part of an effort to combat money laundering, Japan’s government is “leading a global push” to set up a system for cryptocurrency exchanges modelled on SWIFT, the international messaging protocol that banks use for bank-to-bank payments. Last week, a report from Nikkei suggested that 15 governments are planning to create a system for collecting and sharing personal data on cryptocurrency users. But several people familiar with the FATF-led international discussions around cryptocurrency regulation told MIT Technology Review that these reports don’t have it quite right. There doesn’t appear to be a government-led global cryptocurrency surveillance system in the works—at least not yet. And it’s likely that whatever does eventually emerge won’t look much like SWIFT. Exchanges are still early in the process of figuring out what systems and technologies to use to securely handle sensitive data, Spiro says, and how to do it in a way that complies with a range of local privacy rules. “There are a lot of balls in the air,” he says.


Security concerns blocking UK digital transformation


“Protection and prevention are still paramount yet, to stay ahead of these evolving trends, organisations need to start thinking differently about cyber security. Business leaders need to make the leap from seeing cyber security as only a protective measure, to it also being a strategic value driver,” he said. The report also shows that across many organisations, chief information officers (CIOs) and wider board member views around cyber security are not yet aligned. Business leaders such as the CEO, CFO and COO tend to be less confident about their organisation’s cyber security than those with direct responsibility for IT and technology, such as the CIO and chief information security officer (CISO). In addition, technology leaders are more likely than business leaders to believe that a cyber-secure brand is important for competitive advantage (82% versus 68%).


Use of Facial Recognition Stirs Controversy

Over the past several years, the use of facial recognition - along with other technologies such as machine learning, artificial intelligence and big data - has stoked global invasion of privacy fears. In the U.S., the American Civil Liberties Union has taken aim at Amazon's Rekognition product, which uses a number of technologies to enable its users to rapidly run searches against facial databases. The ACLU's Nicole Ozer last year called for guarding against supercharged surveillance before it's used to track protesters, target immigrants and spy on entire neighborhoods. More recently, city officials in San Francisco and Oakland have banned police from using facial recognition technology. The debate over facial recognition technology has also been addressed by several U.S. presidential candidates. On Monday, Democratic hopeful Bernie Sanders became the first presidential candidate to call for a ban on the use of facial recognition by law enforcement. This is one part of a larger criminal justice reform package that the Vermont senator's campaign calls "Justice and Safety for All."


Extreme Programming in Agile – A Practical Guide for Project Managers

The XP lifecycle can be explained in terms of the Weekly Cycle and the Quarterly Cycle. To begin with, the customer defines the set of stories. The team estimates the size of each story, which, along with the relative benefit as estimated by the customer, indicates the relative value used to prioritize the stories. If some stories cannot be estimated by the team due to unclear technical considerations, they can introduce a Spike. Spikes are short time frames for research and may occur before regular iterations start or alongside ongoing iterations. Next comes the release plan: the release plan covers the stories that will be delivered in a particular quarter or release. At this point, the weekly cycles begin. The start of each weekly cycle involves the team and the customer meeting to decide the set of stories to be realized that week. Those stories are then broken into tasks to be completed within that week. The week ends with a review of the progress to date between the team and the customer. This leads to the decision on whether the project should continue or whether sufficient value has already been delivered.
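
As a rough sketch of how relative value might fall out of those two estimates, the Python snippet below ranks an invented backlog by customer benefit per unit of team-estimated size and flags unestimated stories for a Spike; real teams weigh priorities in richer ways than a single ratio:

# Hypothetical backlog: sizes come from the team, benefits from the customer.
stories = [
    {"name": "Guest checkout", "size": 3, "benefit": 8},
    {"name": "Order history export", "size": 5, "benefit": 5},
    {"name": "Real-time stock sync", "size": None, "benefit": 9},  # unclear, needs a Spike
]

spikes = [s for s in stories if s["size"] is None]
estimated = [s for s in stories if s["size"] is not None]

# Relative value: customer benefit weighed against team-estimated size.
for s in sorted(estimated, key=lambda s: s["benefit"] / s["size"], reverse=True):
    print(f'{s["name"]}: value ratio {s["benefit"] / s["size"]:.2f}')

for s in spikes:
    print(f'{s["name"]}: schedule a research Spike before estimating')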


Breakthroughs bring a quantum Internet closer

The TUM quantum-electronics breakthrough is just one of several announced in the last few weeks. Scientists at Osaka University say they’ve figured out a way to get information that’s encoded in a laser beam to translate to a spin state of an electron in a quantum dot. They explain, in their release, that they solve an issue where entangled states can be extremely fragile, in other words, petering out and not lasting for the required length of transmission. Roughly, they explain that their invention allows electron spins in distant, terminus computers to interact better with the quantum-data-carrying light signals. “The achievement represents a major step towards a ‘quantum internet,’” the university says. “There are those who think all computers, and other electronics, will eventually be run on light and forms of photons, and that we will see a shift to all-light,” I wrote earlier this year. That movement is not slowing. Unrelated to the aforementioned quantum-based light developments, we’re also seeing a light-based thrust that can be used in regular electronics too. Engineers may soon be designing with small photon diodes that would allow light to flow in one direction only, says Stanford University in a press release.


Automated machine learning or AutoML explained

Automated machine learning, or AutoML, aims to reduce or eliminate the need for skilled data scientists to build machine learning and deep learning models. Instead, an AutoML system allows you to provide the labeled training data as input and receive an optimized model as output. There are several ways of going about this. One approach is for the software to simply train every kind of model on the data and pick the one that works best. A refinement of this would be for it to build one or more ensemble models that combine the other models, which sometimes (but not always) gives better results. A second technique is to optimize the hyperparameters of the best model or models to train an even better model. Feature engineering is a valuable addition to any model training. One way of de-skilling deep learning is to use transfer learning, essentially customizing a well-trained general model for specific data. Transfer learning is sometimes called custom machine learning, and sometimes called AutoML (mostly by Google). Rather than starting from scratch when training models from your data, Google Cloud AutoML implements automatic deep transfer learning and neural architecture search for language pair translation, natural language classification, and image classification.
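
The first two techniques (train several candidates and pick the winner, then tune its hyperparameters) can be sketched by hand in a few lines of scikit-learn; this is an illustrative toy, not how Google Cloud AutoML or any commercial AutoML product is implemented:

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score, train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

candidates = {
    "logreg": LogisticRegression(max_iter=1000),
    "forest": RandomForestClassifier(random_state=0),
    "svm": SVC(),
}

# Technique 1: train every candidate and keep the one with the best cross-validated score.
scores = {name: cross_val_score(model, X_train, y_train, cv=5).mean()
          for name, model in candidates.items()}
best_name = max(scores, key=scores.get)
print("best candidate:", best_name, scores)

# Technique 2: tune the winner's hyperparameters (a tiny grid, shown here for the forest).
if best_name == "forest":
    search = GridSearchCV(candidates[best_name],
                          {"n_estimators": [50, 100, 200], "max_depth": [None, 5, 10]},
                          cv=5)
    search.fit(X_train, y_train)
    print("tuned test accuracy:", search.score(X_test, y_test))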


Considerations for choosing enterprise mobility tools


One option is to use an open source enterprise mobility management (EMM) platform. If the organization is willing to invest in the resources, open source EMM offers the flexibility to customize and extend the source code to match specific needs. IT pros should be aware of challenges that can come with maintaining their own open source EMM, such as hidden costs of deployment and lack of support. A few options for open source EMM include WSO2 Enterprise Mobility Manager or Teclib's Flyve MDM. WSO2's offering includes enterprise mobility tools such as mobile application management and mobile identity management. It also includes open source support for IoT devices, such as enrollment and application management, through IoT Server. Organizations looking for more established enterprise mobility tools can look to UEM platforms including Citrix Workspace, VMware Workspace One, IBM MaaS360, BlackBerry Unified Endpoint Manager, MobileIron UEM or Microsoft Enterprise Mobility + Security, which includes Intune.


The Future Enterprise Architect


Archie II understands the needs of decision makers throughout the organization, including the need to provide timely, if not on-demand, decision support based on solid information and analysis. Archie II also understands that he must not only support the decision-making processes in the organization, but also enable those decisions by providing guidance. Archie II is proactive and is often ready with answers before questions arrive. Archie II uses or adapts existing architectures, and/or creates new architectural patterns and models, to support the analysis he performs in order to make the recommendations needed as value chains or value streams progress. Archie II collects just enough information, resulting in just enough architecture, to support the decisions at hand and match the cadence of the business. Yet Archie II is continuously listening, evolving and analyzing his models of the enterprise as new information becomes available. He proactively connects with those necessary, when necessary. His calls are always returned, as he has the reputation of “when Archie II speaks, we need to listen!”



Quote for the day:


"Your greatest area of leadership often comes out of your greatest area of pain and weakness." -- Wayde Goodall


Daily Tech Digest - August 20, 2019

Blockchain is not a magic bullet for security. Can it be trusted?

Bitcoin's vulnerabilities have already been successfully exploited in significant hacks.
As with any technology, security issues arise when developers program requirements into products and services. The lines of code, consensus mechanisms, communication protocols, etc., all have the potential to host vulnerabilities that can be exploited for malicious use. But blockchain at the moment remains a divergent technology: multiple protocols and programming languages are being developed in parallel. As a result, it is difficult for developers to acquire the experience needed to secure their code, while most are under stringent time pressure to deliver. Because blockchain relies heavily on cryptography, the practice of secure communication, it gives many the impression that it’s a self-secured technology. This could not be further from the truth, as blockchains are built on top of communication networks and equipment that need to be secured. Traditional information security challenges apply to blockchain, too. Furthermore, cryptography is, like any other security discipline, a changing field: quantum computers are already expected to break a number of cryptographic algorithms.



How to unlock the true value of data


Central to a hub architecture will be the technologies used to get data flowing into it from applications and other data sources, and then provisioning outward to consumers – internal just as much as external. These might include extract, transform and load (ETL) tools that support bulk or batch movement of data, data replication and data virtualisation. They can also include app-integration middleware, such as the enterprise service bus, and message-oriented technologies that move data around in the form of message constructs. Whatever tools are used, on-premise and cloud service versions are available to tap, and there are still other elements to consider, such as governance tools to help with data compliance and metadata management tools to tag and manage data flows better. One of the big headaches for those tasked with developing a business’s data architecture is control.
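
For readers who have not worked with ETL tooling, the three stages it automates look roughly like this tiny Python sketch (an in-memory toy with made-up data, standing in for the bulk and batch movement real tools handle):

import csv
import io
import sqlite3

# Extract: read raw records from a source system (here, an in-memory CSV).
raw = io.StringIO("id,amount\n1,10.5\n2,20.0\n")
source_rows = list(csv.DictReader(raw))

# Transform: apply business rules, e.g. convert types and add tax.
transformed = [(int(r["id"]), round(float(r["amount"]) * 1.2, 2)) for r in source_rows]

# Load: write the result into the target store (here, an in-memory SQLite hub).
hub = sqlite3.connect(":memory:")
hub.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
hub.executemany("INSERT INTO orders VALUES (?, ?)", transformed)
print(hub.execute("SELECT * FROM orders").fetchall())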


Texas Pummeled by Coordinated Ransomware Attack

In an updated statement released Saturday, DIR said the total victim count stood at 23 organizations. The Texas Military Department as well as Texas A&M University System's Cyberresponse and Security Operations Center teams "are deploying resources to the most critically impacted jurisdictions," it added. The U.S. Department of Homeland Security as well as FBI's cyber division, among others, have also been assisting with the response. "At this time, the evidence gathered indicates the attacks came from one single threat actor. Investigations into the origin of this attack are ongoing; however, response and recovery are the priority at this time," DIR said. "It appears all entities that were actually or potentially impacted have been identified and notified." Systems and networks run by the state of Texas have not been disrupted, DIR says. Officials in Austin said their systems were unaffected by the attack. "We are monitoring the situation," Bryce Bencivengo, a spokesman for Austin's Office of Homeland Security and Emergency Management, told local NPR member station KUT.


Value Engineering: The Secret Sauce for Data Science Success

If what your organization seeks is to exploit the potential of data science to power your business models, then the Data Science Value Engineering Framework provides the “How” for doing it. The Value Engineering Framework starts with the identification of a key business initiative that not only determines the sources of value, but also provides the framework for a laser focus on delivering business value and relevance. A diverse set of stakeholders is beneficial because it provides more perspectives on the key decisions upon which the data science effort needs to focus. The heart of the Data Science Value Engineering Framework is collaboration with the different stakeholders to identify, validate, value and prioritize the key decisions (use cases) that they need to make in support of the targeted business initiative. After gaining a thorough understanding of the top-priority decisions (use cases), the analytics, data, architecture and technology conversations now have a frame within which to work (by understanding what’s important AND what’s not important).


People will usually follow those who have the most positional authority and concomitant control of resources, but also will follow those with other forms of power, such as eloquence, passion, sincerity, commitment, and charisma. In the teams I work with, people tend to pay the most attention to and be most influenced by those with both types of power. But sometimes people will even choose to follow less senior individuals if they have inspiring ideas and energy. No matter how powerful they are, though, when people show themselves to be untrustworthy, through something they do inside or outside the team, their influence vanishes. Others might still pay attention to them, but now only for transactional purposes. Those “leaders” are no longer really leading. If you want to be a real leader, one with voluntary followers, remember that you must earn and keep your people’s trust. They will carefully assess your attitude and actions, in particular whether you look out for others in addition to yourself. If their assessment is that you are trustworthy, they’ll stick with you.


These robot snakes designed by AI could be the next big thing in the operating theatre


In order to create snakebots that work in the confines of each individual's anatomy, the QUT team generate tens of virtual versions of the snakebot and set evolutionary algorithms to work on them, in a survival-of-the-fittest contest designed to create the best available bot. First, the patient's knee is scanned by CT or MRI and a model of its internal anatomy is built. Alongside the surgeon, the QUT system then delineates which parts of the knee the surgeon will need to reach during the operation, as well as the parts they need to avoid. ... Afterwards, they're ranked according to their performance. Then the evolutionary algorithm refines the better-performing designs according to the results of the simulation, running the simulation over and over again, tweaking the winning bots and rejecting those that aren't up to scratch. "This is copying what's been observed in nature and the process of evolution, but you do it inside the computer... we kill off the ones that didn't do very well and we mate the ones that do well. Mating, in this case, means you combine half the characteristics of one and half of the other. You do random mutations on some of them - change a few little bits during the mating."
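
The loop described in that quote (rank, cull the weak, mate the strong, mutate a little) is the core of any genetic algorithm, and a minimal version fits in a short Python sketch; the fitness function below is a toy stand-in for the QUT team's anatomical simulation, not their actual code:

import random

def fitness(design):
    """Toy stand-in for the simulation score; higher means a better-performing design."""
    target = [0.5] * len(design)
    return -sum((gene - t) ** 2 for gene, t in zip(design, target))

def mate(parent_a, parent_b, mutation_rate=0.1):
    """Combine half of one parent's characteristics with half of the other's, then mutate a few."""
    mid = len(parent_a) // 2
    child = parent_a[:mid] + parent_b[mid:]
    return [random.random() if random.random() < mutation_rate else gene for gene in child]

# Start with a random population of candidate designs (each an 8-parameter vector).
population = [[random.random() for _ in range(8)] for _ in range(20)]

for generation in range(50):
    ranked = sorted(population, key=fitness, reverse=True)
    survivors = ranked[: len(ranked) // 2]  # "kill off" the weaker half
    children = [mate(random.choice(survivors), random.choice(survivors))
                for _ in range(len(population) - len(survivors))]
    population = survivors + children

print("best fitness after evolution:", fitness(max(population, key=fitness)))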


What Cybersecurity Trends Should We Expect From The Rest Of 2019?

Currently, the industry standard for security relies on two-factor authentication when users choose to log into the software. While many email services and social media sites only ask for one form of authentication, two-factor authentication is the future. However, by the time companies adopt this, multifactor will have taken off. Most data breaches are caused by leveraging bad passwords. Weak, stolen or default passwords are usually the biggest culprits for a data leak. Single authentication allows this to happen since passwords can be limited to just something you know. By giving out a dongle or integrating an app with temporary passwords that expire, you can ensure that only verified users get access. Since more people than ever are worried about stolen identities, we should see this kind of authentication process take off in coming years. ... Companies are even finding ways to deceive potential hackers. By imitating your company's more critical data and assets, this bait can act as a trap for anyone trying to get ahold of your data.
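
As a hedged illustration of the "temporary passwords that expire" idea, the snippet below uses the open-source pyotp library to generate and verify time-based one-time passwords (TOTP), the mechanism most authenticator apps and some hardware dongles rely on; real deployments combine this second factor with a password and keep the shared secret server-side:

import pyotp  # pip install pyotp

secret = pyotp.random_base32()   # provisioned once per user and stored securely server-side
totp = pyotp.TOTP(secret)        # 30-second, six-digit time-based codes by default

code = totp.now()                # what the user's authenticator app or dongle would display
print("one-time code:", code)
print("verified" if totp.verify(code) else "rejected")  # valid only within the time window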


Case Study: Improving ID and Access Management

A few years ago, Molina Healthcare was using a homegrown solution to onboard and offboard users daily in batches from the company's HR system into Active Directory, she says. But the company was growing quickly, so the mostly manual process of provisioning and de-provisioning access to Molina's systems was time-consuming, Sankepally says in an interview with Information Security Media Group. "With the increasing demands, we couldn't complete all the business processes involved, and there was a lack of standards," she says. "Our onboarding process was taking 10 to 20 days." As a result, the company made a move to standardize and automate its ID and access management platform, choosing to implement technology from SailPoint Technologies, she says. "Today we have more than 15,000 active identities supporting 15 different states with different lines of business ... including caregivers on the ground." For onboarding users, the company now has a "near real-time integration" with its cloud-based HR system that has automated the onboarding and offboarding process, she says.


GDPR faces growing pains across Europe


European countries have clearly demonstrated different strategies on penalties. Also, they have set up different structures for implementing the regulations. In Germany, for example, DPAs are organised on a German state level – but there is also a separate DPA at federal level, with jurisdiction over telecom and postal service companies. The result is that Germany has 17 data protection authorities, instead of just one. Another area where European countries disagree is in their interpretations of some of the finer points of GDPR. For example, Austria’s DPA ruled that all a data controller has to do in response to a request for data deletion is to remove individual references to that data. Nations have also demonstrated differences of opinion on how to calculate fines. For example, some local legal authorities in Germany have argued that the GDPR fines imposed in that country should be calculated according to German law, which would result in much lower fines than those imposed at the European level.


Visa Adds New Fraud Disruption Measures

Visa now is adding fraud disruption to supplement its transaction fraud detection and remediation efforts. The company today at the Visa US Security Summit 2019 in San Francisco outlined five new capabilities it now uses to prevent fraudulent transactions. "We're looking to identify and disrupt fraud before it happens," says David Capezza, senior director of payment fraud disruption at Visa. "We want to take a more proactive approach and identify these attacks and shut them down before they occur." Rivka Gewirtz Little, research director for global payment strategies at IDC, says Visa's new approach blends both its cyber and fraud units. "Typically, organizations are focused on the transaction," Gewirtz Little says. "What's interesting here is that Visa is creating a true cyber fraud system where the cyber team and fraud teams are integrated: the cyber team focuses on the attack against the enterprise and the fraud team looks at ways of preventing the attack. It's not always the same set of tools, the same team and objectives."



Quote for the day:


"Leadership offers an opportunity to make a difference in someone's life, no matter what the project." -- Bill Owens


Daily Tech Digest - August 19, 2019

Center for Data Innovation: U.S. leads AI race


“AI is the next wave of innovation, and overlooking this opportunity will pose a threat to a region’s economic and national security,” said Center for Data Innovation director Daniel Castro in a statement. “The EU has a strong talent pool and an active research community, but unless it supercharges its current AI initiatives, it will not keep pace with China and the United States.” The center chose to focus on six categories: talent, research, development, adoption, data, and hardware. Based on a 100-point scale, researchers found that the U.S. led overall with 44.2 points, China was second at 32.3, and the European Union placed third with 23.5. The study found that the U.S. shows clear leadership in four of the six categories: talent, research, development, and hardware. China leads in adoption and data. The findings would appear worrisome for the EU, which has placed great emphasis on its AI efforts in recent years. But the region does place second in four categories: talent, research, development, and adoption. Of those, it is particularly strong in the research category.



Modern Technology, Modern Mistakes

"It is common for attackers to find common utilities such as FTP clients or video conversion software, package or wrap malicious code into the installer, and then upload their packed installer to a free software download site, knowing that users may find their malicious version of the software installer before they find the legitimate original," Murphy says. Additionally, employees are increasingly feeling entitled to work from anywhere and have access to anything at any time, according to Nick Bennett, director of professional services at FireEye Mandiant. "Employees [also] feel entitled to use work assets for non-work activities, and they are bypassing protections that are in place, making themselves more susceptible to phishing attacks," Bennett says. The issue is twofold. Employees are using corporate-issued workstations for personal use, even if they are at home. When they bring that workstation back to the enterprise, they are also putting the business at risk, Bennett explains. In addition, "employees are also using non-corporate assets to access the corporate network on a device that is unmanaged by enterprise," Bennett says.


The Danger of Over-Valuing Machine Learning

While machine learning is a powerful tool, it is not a magical box. Bias in the data or the model can train machine learning systems in inappropriate ways. Strange butterflies may be lurking within, bringing up artifacts that may be due to factors that have nothing to do with the data. The lack of transparency in the process can make determining what exactly a model returns problematic, because emergent behaviors may be lurking in the background that are often difficult to ferret out. Finally, the deeper the learning, the more energy is required to maintain multiple levels of abstraction, and that energy can often be significant enough to make using such systems uneconomical. This is not to say that such tools are useless - most of the evolution of machine learning systems in the last decade has proven highly useful and effective for a wide variety of applications, and forms an integral part of the artificial intelligence toolkit. The danger comes in thinking that such systems are truly intelligent, rather than simply the clever application of high-speed and occasionally non-linear solutions.


How Can The Insurance Industry Face Its Challenges?

Technology – as in all industries – has an important role to play in the transformation of insurance. New opportunities are being created by the likes of the IoT, telematics, advanced data analytics, and technologies which support consumption-based offerings and insurance for the sharing economy. These technological advances are particularly welcome in a sector which has faced increasing regulation. As Paton notes, many modern regulatory challenges began with the EU’s Solvency II Directive in 2009, and other decrees have followed. “The big impact of Solvency II from an innovation perspective was that it diverted the insurer’s ability to invest in new areas,” he says. “It was followed by the Insurance Distribution Directive and a regulatory scrutiny on conduct in areas such as personal lines. So now with things like point of renewal, it’s necessary to make people aware of how much their insurance premium has gone up.” “There will be more initiatives of this type,” Paton adds, “right through to changes in expectations around operational resilience, cybersecurity and third party supply management.”


Humans are the weak link: Security awareness & education still a challenge for UK companies

Although moving away from blame culture and the idea that people are the problem should be a goal of today’s security teams, most organizations still see employees as a chink in company defenses. A massive 98 percent of respondents agreed with the statement that: ‘The human employee is the weakest link when it comes to cybersecurity’. Over two-thirds agreed with this idea strongly. Social engineering, phishing, and business email compromise – all attacks which rely on people falling prey to manipulation and trickery – were listed amongst the top threats organizations are most concerned about. This dim view of the role people play in security likely feeds into why only 13 percent of organizations would rate internal cybersecurity awareness as very good. Also, 40 percent of respondents said awareness was merely adequate, suggesting there is still much work to be done around improving education, raising awareness and reducing people-based risks as a result. While organizations may still view humans as the problem, security teams also recognize that people-based problems require people-based solutions. 85 percent of the companies surveyed stated they were utilizing awareness training to reduce human error.


How to Use Data to Improve Your Sprint Retrospectives

Most agile teams do sprint retrospectives at least once a month, to iterate and improve on their software development process and workflow. However, a lot of those same teams rely only on their feelings to “know” if they have actually improved. But you need an unbiased reference system if you want to compare how two sprints went. ... Commit frequency and active days serve the same purposes. An active day is a day in which an engineer contributed code to the project, which includes specific tasks such as writing and reviewing code. Those two alternative metrics are interesting if you want to introduce a best practice to commit every day. It’s also a great way to see the hidden costs of interruptions. Non-coding tasks such as planning, meetings, and chasing down specs are inevitable. Teams often lose at least one day each week to these activities. Monitoring the commit frequency enables you to see which meetings have an impact on your team’s ability to push code. It’s important to keep in mind that pushing code is actually the primary way your team provides value to your company.
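
As a simple illustration of the two metrics, the sketch below computes commit frequency and active days per engineer from an invented commit log; in practice a team would pull these figures from its git history or from a tool that tracks them:

from datetime import date

# Hypothetical commit log: (author, commit date) pairs pulled from version control.
commits = [
    ("ana", date(2019, 8, 12)), ("ana", date(2019, 8, 12)),
    ("ana", date(2019, 8, 14)), ("ben", date(2019, 8, 13)),
    ("ben", date(2019, 8, 15)), ("ben", date(2019, 8, 16)),
]

sprint_days = 10  # working days in the sprint under review

for author in sorted({a for a, _ in commits}):
    their_commits = [d for a, d in commits if a == author]
    active_days = len(set(their_commits))
    print(f"{author}: {len(their_commits)} commits, "
          f"{active_days}/{sprint_days} active days, "
          f"{len(their_commits) / sprint_days:.1f} commits per sprint day")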


Blockchain: a friend to digital continuity and lightweight workflow tool

While even the experts can’t predict when widespread adoption of the technology will take place, they can suggest what it will take before widespread adoption is possible. The most important lesson is understanding: understanding that blockchain technology should work as part of an ecosystem of technologies. No matter what industry you’re in, it starts with customer experience. And the CX depends on a choreography of technology. In terms of display, there has to be a web and touch experience, or an AI component with chatbots, for example. Similarly, with blockchain, it’s not just one thing that’s going to impact the use case. “One obvious pitfall of blockchain is that people look at it as the only solution for realising anything and everything. But this is wrong. You have to really understand the end-to-end experience and see where blockchain technology fits in,” said Jitendra Thethi, AVP, Altran.


The Digital Leader's Guide to Lean and Agile

This focus on needs and outcomes turns out to be a great way to integrate the key frameworks and models of the Lean-Agile landscape. This is a worthwhile goal; it’s worth noting that Agile is often introduced not right-to-left, but left-to-right – not needs-first, but backlog-first or solution-first. When the focus is on ploughing through backlogs of requirements, the likely result is mediocrity (or worse), hardly a great advertisement for Agile. And the dissonance of imposing solutions on teams rather than seeking to meet their and the wider organisation’s needs is potentially fatal to Agile adoptions. Right to Left should not be understood as an attack on branded process frameworks; neither does it elevate any one framework over the others. However, as well as calling into question how they are often rolled out, I do voice the regret that they are so often described in left-to-right terms, leading me to wonder how Agile is then supposed to be understood as a departure from 20th century ways of thinking and working. I demonstrate that Scrum and even SAFe are readily described in right-to-left terms – "iterated self organisation around goals" is the five-word summary.


Privacy beyond HIPAA in voice technology


“When it comes to healthcare and voice design, we have several challenges we face every day,” Freddie Feldman, voice design director at Wolters Kluwer Health, said at The Voice of Healthcare Summit at Harvard Medical School last week. “HIPAA is a big topic on everyone’s mind nowadays, and it is one we take seriously. The first thing most people think about when they hear HIPAA is securing server platforms, but there is more to it. We have to consider things like the unintended audience for a call.” He said that due to the nature of voice, even leaks not expressly prohibited by HIPAA can be inappropriate. For example, if the voice technology is intended for home use and gives a message from the radiology department to the house, then it’s giving away too much information, he said. Much of it comes down to appropriate use. For example, putting the speakers into a hospital room setting poses a different set of challenges. “I think as far as smart speakers and virtual assistants [go], Amazon right now only has HIPAA-eligible environments, so basically turning on and off HIPAA for specific skills, enabling HIPAA for a particular voice app or voice skill,”


Artificial Intelligence Needs a Strong Data Foundation

The largest and most basic need in the data science hierarchy is the need for data collection. While every bank and credit union collects data daily on transactions, product use, customer demographics, and even external insights from social media and other sources, an organization needs to determine what specific insight may be needed to get a complete picture. Are you collecting insight on channel use, geolocational data and consumer beliefs and behaviors? While you can build a plan for future collection, the success of any machine learning or AI initiative hinges on the scope and quality of data collected. As important as collecting the right data is, Rogati stresses that it is equally important to have an ongoing flow of real-time data that is easy to access, store and analyze. This can be a major challenge for financial services organizations that are notorious for having data silos. Beyond internal data flows, it is important that any external or unstructured data can also be collected, stored and analyzed. While storage was once a major problem, cloud technology has simplified some of those challenges.



Quote for the day:


"To be a good leader, you don't have to know what you're doing; you just have to act like you know what you're doing." -- Jordan Carl Curtis


Daily Tech Digest - August 18, 2019

Realities and myths for 5G’s impact on logistics

The disruptive potential of 5G in logistics is all about the Internet of Things. We already see 4G and Wi-Fi networks as the ‘connective tissue’ between every device we connect to the internet, including computers, phones, wearables, home appliances and major business infrastructure. Every business relies on data to function, and logistics companies handle even more data than most. The sheer confluence of various employee functions, delivery vehicles, material handling equipment and facility control systems has always required lightning-fast connections with low latency and high uptime. 5G can deliver on that promise once it’s up and running. Individual devices will be able to achieve their own internet connections, provided they bring their own power or have access to it. Because 5G’s latency is far lower than 4G’s — up to 10 times lower — companies will be able to distribute and exchange far larger quantities of data than ever.



Cloud security is too important to leave to cloud providers

The need to take control of security and not turn ultimate responsibility over to cloud providers is taking hold among many enterprises, an industry survey suggests. The Cloud Security Alliance, which released its survey of 241 industry experts, identified an "Egregious 11" list of cloud security issues. The survey's authors point out that many of this year's most pressing issues put the onus of security on end user companies, versus relying on service providers. "We noticed a drop in ranking of traditional cloud security issues under the responsibility of cloud service providers. Concerns such as denial of service, shared technology vulnerabilities, and CSP data loss and system vulnerabilities -- which all featured in the previous 'Treacherous 12' -- were now rated so low they have been excluded in this report. These omissions suggest that traditional security issues under the responsibility of the CSP seem to be less of a concern. Instead, we're seeing more of a need to address security issues that are situated higher up the technology stack that are the result of senior management decisions."



Asbeck estimates that it will only take a few years before labor industries adopt their use, and it won’t be long before passive suits become affordable and commonplace for the able-bodied. Exoskeletons could add years of enjoying an active lifestyle, like hiking, for the elderly by providing greater endurance. ... “A lot of us wear a Bluetooth device in our ear now, and that was tech you saw the military wearing in a movie fifteen years ago,” Haas said. “It looks crazy and futuristic but now we all see that type of technology.” If wearable robotics make you think less of Iron Man and more of Wall-E — where futuristic humans rely on robotic lounge chairs for mobility, rendering them too bloated to walk — David Perry, an engineer at Harvard Biodesign Lab, doesn’t show concern. He says these suits will maintain the health of people who do incredible physical feats. His leg exoskeleton, which soldiers tested, won't alleviate all the stress on the wearer’s muscles. But it does make it easier to walk by about 15%, even while carrying a heavy load.



Managing compliance costs with quality data

Quality data should allow compliance teams to screen against a range of datasets, including sanctions, politically exposed persons (PEPs), law enforcement lists, regulatory lists, and adverse media records geared towards sanctions compliance, AML, countering the financing of terrorism, and countering proliferation finance. Data should also be comprehensive; de-duplicated; consistent; accurate; configurable; and up-to-date. Given this extensive list of requirements, when choosing a data provider, it is important to choose one that has global capabilities and is able to deliver reliable and trusted data. The dataset being considered should have a highly analytical set of inclusion criteria and be de-duplicated. A successful provider should allow extensive ‘slicing and dicing’ and configuration and, perhaps most importantly of all, must take the control of operational costs seriously. It is worth mentioning that overscreening – in other words, screening beyond regulatory requirements – can significantly contribute to excessive operational costs.
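
To make "de-duplicated" concrete, here is a toy Python sketch that collapses watchlist records sharing a normalized name; real screening datasets rely on far richer matching (aliases, transliteration, dates of birth) than this invented example:

# Invented watchlist records; two of them are the same entity spelled differently.
records = [
    {"name": "ACME Trading Ltd.", "list": "sanctions"},
    {"name": "acme trading ltd", "list": "sanctions"},
    {"name": "Jane Doe", "list": "PEP"},
]

def normalize(name):
    """Lower-case, strip punctuation and collapse whitespace for a crude match key."""
    return " ".join(name.lower().replace(".", "").replace(",", "").split())

deduplicated = {}
for rec in records:
    deduplicated.setdefault((normalize(rec["name"]), rec["list"]), rec)

print(list(deduplicated.values()))  # the two sanctions duplicates collapse into one record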


Can private online communities drive corporate cultural change?

By creating a dedicated interactive workspace for your customers, partners and company staff, you create a collaborative space built on trust. Potential customers look for – and many times rely on – current customer interaction for recommendations and support. Through customer engagement you can educate your customers on new product enhancements and product rollouts. Public online communities are typically full of distracting noise, or communication is one-way (through reviews or isolated through FAQ’s or chatbots between the customer and the brand). Although subcultures formed by joining groups can limit noise, building brand loyalty amongst the distracting noise of public online communities is still not as safe or effective. The data can’t be gathered to increase a company’s value and effectiveness or enhance product development. In a dedicated, private online community, interactive real-time networking and tribal problem-solving helps to create long-term partnerships and friendships.


Major breach found in biometrics system used by banks, UK police and defence firms

The researchers said the sheer scale of the breach was alarming because the service is in 1.5m locations across the world and because, unlike passwords being leaked, when fingerprints are leaked, you can’t change your fingerprint. “Instead of saving a hash of the fingerprint (that can’t be reverse-engineered) they are saving people’s actual fingerprints that can be copied for malicious purposes,” the researchers said in the paper. The researchers made multiple attempts to contact Suprema before taking the paper to the Guardian late last week. Early Wednesday morning (Australian time) the vulnerability was closed, but they still have not heard back from the security firm. Suprema’s head of marketing, Andy Ahn, told the Guardian the company had taken an “in-depth evaluation” of the information provided by vpnmentor and would inform customers if there was a threat. “If there has been any definite threat on our products and/or services, we will take immediate actions and make appropriate announcements to protect our customers’ valuable businesses and assets,” Ahn said.


Here’s How Artificial Intelligence Is Fueling Climate Change

At the moment, data centers—the enormous rooms full of stacks and stacks of servers that juggle dank memes, fire tweets, your vitally important Google docs and all the other data that is stored somewhere other than on your phone and in your home computer—use about 2% of the world’s electricity. ... According to The MIT Technology Review, Dickerson recently told a conference audience in San Francisco that—unless super-efficient semiconductors are innovated in the next five years—data centers handling AI demands could account for 10% of the world’s electricity use by 2025, a hundred-fold increase in a half-decade. Dickerson’s forecast is a worst-case scenario. Other tech execs have given estimates that vary wildly. Some think data centers, period, will suck 10% of the global electricity load. Yet others think that usage will remain relatively flat, in part because of large companies’ abilities to handle vast amounts of data in more efficient ways. Google, for example, is using AI technology to cool its data centers, reducing demand for power by 40%.


On Stocks And Machine Learning

Undoubtedly, the “cognitive biases” described by Kahneman and Tversky act on and affect the decisions of even the most experienced and famed stock analysts and portfolio managers in the world. Specifically, the “Confirmation Bias” may lead analysts to purchase stocks that are well-known, popular and “juicy”. Analysts are usually “swamped” with information and data on the companies they follow, which might raise their confidence level in their analysis of these companies’ stocks. The “Anchoring Bias” will make it difficult for the analyst to sell a stock that he purchased even if he discovers that he had erred in his original analysis of this stock’s performance. The “Representational Bias” may also lead analysts to wrong investments. The problem related to the “Representational Bias” stems from the tendency of the analyst, when investigating the history and profile of the company, to assume that these parameters will repeat themselves in the future. This assumption ignores the “reversion to the mean” phenomenon, which is typical of the finance market and the economic market in general.


Data management roles: Data architect vs. data engineer, others


How do these data management roles compare? Data architects design and help implement database systems and other repositories for corporate data, Bowers said. They're also responsible for ensuring that organizations comply with internal and external regulations on data, and for evaluating and recommending new technologies, he added. Bowers described a data architect as a "know-it-all" who has to be familiar with different databases and other data management tools, as well as use cases, technology costs and limitations, and industry trends. "I had to master a ton of technologies to become a data architect," he said. A data modeler identifies business rules and entities in data sets and designs data models for databases and other systems to help reduce data redundancy and improve data integration, according to Bowers. Data modelers make less money on average than many other IT workers, but you get what you pay for, he cautioned.


Data Management No Longer an IT Issue

The next-generation data management platform needs to treat data differently. It needs to see data as a liquid core asset -- not a static one -- that can be quickly ingested, stored in the most appropriate data formats and locations, and easily accessed by any analytical processing engine. The data architecture should be flexible, scalable, high-performance, integrated, and secure. But this does not mean you need to create an entirely new enterprise data platform, according to Han. "The core components are still the same - applications, middleware, database, analytics, and systems. However, when we build the new data architecture on top of the existing framework, we must be aware that there are new access points like mobile and IoT for collecting data today, which did not exist 15 years ago. There is also a huge abundance of data that comes in a variety of formats today. So, the question is, how can we integrate them all?" Oracle’s Big Data SQL, an end-to-end big data and AI platform, looks at all data in unison and integrates them to maximize its value.



Quote for the day:


"Leadership is, among other things, the ability to inflict pain and get away with it - short-term pain for long-term gain." -- George Will


Daily Tech Digest - August 17, 2019

Security warning for software developers: You are now prime targets for phishing attacks


According to the Glasswall report, software developer is the role most targeted by hackers going after the technology sector. A key reason for this is that devs do the groundwork on building software and will often have administrator privileges across various systems. That's something attackers can exploit to move laterally around networks and gain access to their end goal. "As an attacker, if you can land on an administrator machine, they have privileged access and that's what the attackers are after. Software developers do have that privileged access to IP and that makes them interesting," Lewis Henderson, VP at Glasswall, told ZDNet. With software developers being technically-savvy people, some might argue that they shouldn't easily fall victim to phishing campaigns. But attackers can use specially-crafted messages to target one individual in the organisation they want to gain access to. With software developers often staying in jobs for relatively short periods of time, it's common for those in the profession to build a profile on professional social networks such as LinkedIn. Attackers can exploit that to find out the specific skills and interests of their would-be victim and tailor a spear-phishing email towards them.



Deploying Natural Language Processing for Product Reviews

We have data all around us, and it comes in two forms: tabular and text. If you have good statistical tools, tabular data has a lot to convey. But it is really hard to get something out of text, especially natural spoken language. So what is natural language? We, humans, have a very complex language, and natural language is the true form of human language, spoken or written with sincerity and often surpassing grammatical rules. The best place to find this language is in reviews. You write a review mainly for two reasons: either you are very happy with the product or very disappointed with it. With your reviews and a machine learning algorithm, entities like Amazon can figure out whether the product they are selling is good or bad. Depending on the results of the analysis of the reviews, they can make further decisions on that product for their betterment.
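
To ground the idea, here is a minimal Python sketch of a review sentiment classifier built with scikit-learn; the handful of example reviews are invented, and production systems at Amazon's scale use far richer models and far more data:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented, tiny review dataset; a real system trains on many thousands of labeled reviews.
reviews = [
    "Absolutely love this, works perfectly and arrived early",
    "Great value, would buy again",
    "Terrible quality, broke after two days",
    "Very disappointed, nothing like the description",
]
labels = ["positive", "positive", "negative", "negative"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(reviews, labels)

print(model.predict(["total waste of money, very disappointed"]))  # likely ['negative']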


Scrum is not magic and will not solve this problem. If you do not have enough skills to do the work, or to do a great job of the work, then it will not magically create those skills. What it will do is make that problem very evident in the Increment (the stuff that is delivered), the Sprint Review, the Retrospective, Sprint Planning and the Daily Scrum. Actually, it will be evident in all of the Scrum events. Scrum might not be magic, but it does make problems very evident, encouraging the team to solve them. Skills are one set of challenges that teams face, and Scrum will make them, or the lack of them, very apparent to everyone. This will, however, mean choices need to be made by the team and the management of the environment the team works within. There is no blaming the system with Scrum. Many teams doing Scrum describe the sensation of being on a Scrum Team as like being in a startup. It is rare that a startup has all the right skills to deliver the best product, but they have enough to do something and will beg, borrow and steal the knowledge and experience to fill in the gaps.



Fintech - Regtech - How About Sales?

The good news is that compelling events such as a growing demand for regulatory compliance and digitalization are triggering and driving many new procurement initiatives within the financial institutions. The bad news is that purchasers, influencers and decision makers get overloaded with requests for meetings and presentations by numerous candidate suppliers. The apparent conflict between the interests of young technology companies and those of the overloaded and stressed end-user prospects and clients resulted in the emergence of a new type of business: the technology brokerage, or in other words, companies providing shared expert sales and account management services on an international scale. With this new model, working with the rare species of expert financial technology sales becomes affordable for the technology company. At the same time, the end-users can interact with a trusted but independent account manager that interfaces with different technology providers.


The history of AR and VR: from gimmick to business problem solver

The history of AR and VR goes back longer than anyone would have expected. When Charles Wheatstone invented the stereoscope in 1838, he didn’t know it, but his 3D image creation would spark the augmented reality and virtual reality boom that is predicted to infiltrate business and society in the next 10-15 years. While the first VR head-mounted display (HMD) was created in 1968 by computer scientist Ivan Sutherland, “there was no name for AR when we started in 2011,” says Beck Besecker, CEO, Marxent. “We called it hologram technology at the time.” ... Both technologies were viewed as quite gimmicky add-ons, until opportunities emerged to apply them to tangible use cases, such as in the home vertical. But what changed? Did the technologies advance enough to add value? Or did awareness around the benefits of the technologies improve? There’s a bunch of reasons. And one of the main ones is getting over the hype — the stumbling block for many emerging technologies.


Get ready for the convergence of IT and OT networking and security


Traditionally, IT and OT have had very separate roles in an organization. IT is typically tasked with moving data between computers and humans, whereas OT is tasked with moving data between “things,” such as sensors, actuators, smart machines, and other devices to enhance manufacturing and industrial processes. Not only were the roles for IT and OT completely separate, but their technologies and networks were, too. That’s changing, however, as companies want to collect telemetry data from the OT side to drive analytics and business processes on the IT side. The lines between the two sides are blurring, and this has big implications for IT networking and security teams. “This convergence of IT and OT systems is absolutely on the increase, and it's especially affecting the industries that are in the business of producing things, whatever those things happen to be,” according to Jeff Hussey, CEO of Tempered Networks, which is working to help bridge the gap between the two. “There are devices on the OT side that are increasingly networked but without any security to those networks. Their operators historically relied on an air gap between the networks of devices, but those gaps no longer exist. ..."



The true value of diversity in risk management


Looking beyond gender diversity, Molyneux, Omero, Reis, A. Merzouk, and Lani Bannach, Director of Essenta and Well U Trading, advocate for diverse teams but in a multidisciplinary way. Molyneux believes that “diversity, in all forms, is incredibly important for every business or sector. When I say “all forms”, I would even include things like cultural diversity, diversity in the level of experience, and even diversity in operating styles.” “There are several studies where a diverse workforce is proven to enrich the working environment by providing different solutions to the same problem and by opening up constructive debate, ultimately resulting in a better outcome. Companies that do not diversify lose out on competitiveness and talent”, Omero explained. “If the sector doesn’t value and embrace diversity appropriately it will lose a powerful taskforce and source of knowledge and creativity”, Reis added. “The sector is always open to new ideas and innovative solutions for old and new issues. The more diverse an environment is, the more creative and revolutionary will the business solutions be.”


Testing Microservices: Overview of 12 Useful Techniques - Part 1

Choose your testing techniques with a perspective on time to market, cost, and risk. When testing monoliths with techniques like service virtualization, you do not have to test everything together. You can instead divide and conquer, and test individual modules or coherent groups of components. You create safe and isolated environments for developers to test their work. ... When working with microservices, you have more options because microservices are deployed typically in environments that use containers like Docker. In microservice architectures, your teams are likely to use a wider variety of testing techniques. Also, since microservices communicate more over the wire, you need to test the impact of network connections more thoroughly. Using tools and techniques that better fit the new architecture can allow for faster time to market, less cost, and less risk.  Many IT departments work with or maintain systems developed and deployed in a monolithic architecture.
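
As a minimal illustration of testing one service in isolation while its network dependency is virtualized, the Python sketch below stubs a downstream HTTP call with the standard library's mock module; the service names, URL and discount rule are hypothetical, and dedicated service-virtualization tools do considerably more than this:

import unittest
from unittest import mock

import requests  # pip install requests

def price_with_discount(item_id):
    """Fetch the base price from the (hypothetical) catalog service, then apply a 10% discount."""
    resp = requests.get(f"http://catalog.internal/items/{item_id}")
    resp.raise_for_status()
    return round(resp.json()["price"] * 0.9, 2)

class PricingServiceTest(unittest.TestCase):
    @mock.patch("requests.get")
    def test_discount_applied(self, fake_get):
        # Virtualize the catalog service: no real network call is made during the test.
        fake_get.return_value.json.return_value = {"price": 100.0}
        self.assertEqual(price_with_discount("abc-123"), 90.0)

if __name__ == "__main__":
    unittest.main()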


Flip the ratio: Taking IT from bottleneck to battle ready


One of the main reasons back-end systems demand so many resources is that they do not take advantage of agile ways of working that have become second nature to most software developers. Either back-end teams confuse “doing” agile with actually “being” agile, running waterfall projects using the scrum method but not working in small teams rapidly iterating on small chunks of code, or agile doesn’t even make it to the back-end teams. Even application maintenance and IT infrastructure can benefit from agile principles, which is significant, since these areas often make up 40 to 60 percent of the IT organization. By introducing true agile methods—small, cross-functional teams or squads working in rapid iterations—to relevant enterprise IT work, companies can radically reduce the resources needed to support those systems while substantially improving service quality and the potential for automation. ... By better understanding business needs, teams eliminated some demand by providing self-service options. Cross-functional teams had the people needed to not only identify the root cause of incidents but correct them immediately.


IoT Devices — Why Risk Assessment is Critical to Cybersecurity

Managing risk of any kind, and IoT risk in particular, is never a one-and-done exercise. After first determining the risk category for new IoT devices or services, it is crucial to revisit this exercise on a regular basis. Changes to the IoT devices, the local area networks and the applications with which the devices interact create an ever-changing attack surface that requires constant monitoring to maintain a strong, forward-leaning security posture. Organizations should take a disciplined approach to risk categorization and mitigation across the entire IoT ecosystem. Tripwire can help you identify IoT risks by providing rigorous security assessments. Tripwire’s device testing approach includes identifying security risks and vulnerabilities that may exist in the physical construction of the device and its network interfaces. Our goal is to identify potential control exposures through security configuration analysis and vulnerability testing of the platform and the operating environment.
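As an illustration of the risk-categorization step described above, the sketch below scores devices on a simple likelihood-times-impact matrix. The scale, device attributes and thresholds are assumptions for illustration only, not Tripwire's actual methodology, and the exercise would be re-run whenever a device, network or application changes.

from dataclasses import dataclass


@dataclass
class IoTDevice:
    name: str
    internet_exposed: bool        # reachable from outside the local network
    handles_sensitive_data: bool
    patchable: bool               # can firmware be updated remotely?


def likelihood(device: IoTDevice) -> int:
    """Rough 1-3 likelihood score based on exposure and patchability (illustrative)."""
    score = 1
    if device.internet_exposed:
        score += 1
    if not device.patchable:
        score += 1
    return score


def impact(device: IoTDevice) -> int:
    """Rough 1-3 impact score based on the data the device touches (illustrative)."""
    return 3 if device.handles_sensitive_data else 1


def risk_category(device: IoTDevice) -> str:
    """Collapse the likelihood x impact matrix into three buckets."""
    score = likelihood(device) * impact(device)
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"


if __name__ == "__main__":
    camera = IoTDevice("lobby-camera", internet_exposed=True,
                       handles_sensitive_data=True, patchable=False)
    print(camera.name, "->", risk_category(camera))  # prints: lobby-camera -> high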



Quote for the day:


"There is no "one" way to be a perfect leader, but there are a million ways to be a good one." -- Mark W. Boyer


Daily Tech Digest - August 13, 2019

What is instant recovery? A way to quickly restore lost files and test backup systems

The first challenge is that the hypervisor is not really reading a VMDK image; it is reading a virtual image presented to it by the backup product. Depending on which product you're using and which version of the backup you chose, the backup system may have to do quite a bit of work to present this virtual image. This is why most backup systems recommend limiting the number of instant-booted images at a time if performance is important. The second reason instant recovery is not typically high performance is that the VMDK is on secondary storage. In a world where many primary systems have moved to all-flash arrays, today's backup systems still use SATA, which is much slower. The final enemy of high performance in an instant-recovery system is that many backups are stored in a deduplicated format. Presenting the deduplicated files as a full image takes quite a bit of processing power and again takes away from the performance of the system. Some deduplication systems can store the most recent copy in an un-deduplicated fashion, making them much faster for an instant-recovery setup.
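A minimal sketch of the deduplication point: a "full image" served from a deduplicated store has to be reassembled chunk by chunk from a content-addressed map, which is where the extra processing goes. The chunk size and store layout here are illustrative only, far simpler than a real backup product.

import hashlib

CHUNK_SIZE = 4  # tiny for illustration; real systems use KB- to MB-sized chunks


def dedup_store(image: bytes):
    """Split an image into chunks and keep one copy of each unique chunk."""
    store, recipe = {}, []
    for i in range(0, len(image), CHUNK_SIZE):
        chunk = image[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)
        recipe.append(digest)
    return store, recipe


def rehydrate(store, recipe) -> bytes:
    """Present the full image again: every read walks the recipe and rejoins chunks."""
    return b"".join(store[digest] for digest in recipe)


if __name__ == "__main__":
    image = b"ABCDABCDEFGHABCD"  # repeated data deduplicates well
    store, recipe = dedup_store(image)
    print("unique chunks stored:", len(store), "of", len(recipe))
    assert rehydrate(store, recipe) == image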



Pair Programming (PP) is an extreme programming practice for producing better software in which two people work together at one computer and work is reviewed as it is done. The driver operates the keyboard while the navigator watches, asks questions, guides, reviews, learns and makes suggestions. Find more about PP at Wikipedia. We often hear that Pair Programming is a “waste of time”, “doesn’t really work”, “suppresses creativity”, “kills privacy” or is “stressful”. These are all genuine concerns any team may have based on their circumstances and experience. ... PP helps in transferring knowledge and works great when you have new members on the team. The navigator plays the contributor role while the driver is the receiver. This approach indirectly reduces the training cost of new members. Team members with deep knowledge of the project tend to attract more dependency, as they are knowledge towers. It is always a good idea to spread that knowledge to others to reduce dependency on those people. When these heavy lifters pair with others, it helps to spread the knowledge easily.


8 features all enterprise SaaS applications must have


Reliability and security are two of the most important qualities for SaaS tools. Companies that run their software on premises are able to store corporate information in their own infrastructures, which helps them keep that sensitive data secure. However, when it comes to SaaS, the software providers are responsible for keeping user data safe. Consequently, it makes sense that security and data privacy are key capabilities in enterprise SaaS applications. Providers should also include features in their enterprise SaaS offerings that solve business issues and provide the availability and efficiency that are necessary in an increasingly challenging enterprise environment. There is little doubt that companies are looking into SaaS -- usually, in a multi-tenant model in which users from different organizations share the same instance of an application. SaaS is arguably the purest form of the cloud and the largest segment of the cloud market, with revenue expected to grow 22.2% to reach $73.6 billion this year, according to Gartner. In addition, SaaS is expected to reach 45% of total application software spending by 2021.
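As a sketch of what the multi-tenant model implies for application code, the example below scopes every query by a tenant identifier so organizations sharing one instance never see each other's rows. The table, columns and the sqlite3 stand-in database are hypothetical.

import sqlite3

# In-memory database stands in for the shared application instance's data store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (tenant_id TEXT, amount REAL)")
conn.executemany("INSERT INTO invoices VALUES (?, ?)",
                 [("acme", 120.0), ("acme", 80.0), ("globex", 999.0)])


def invoices_for(tenant_id: str):
    """All data access goes through a tenant-scoped query."""
    rows = conn.execute(
        "SELECT amount FROM invoices WHERE tenant_id = ?", (tenant_id,)
    ).fetchall()
    return [amount for (amount,) in rows]


print(invoices_for("acme"))    # [120.0, 80.0] -- never returns globex's rows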


What Microsoft's upcoming 'outsourcing' licensing changes could mean

Microsoft's upcoming licensing change is going to be "massive" for customers who've been using AWS and Google Cloud dedicated hosts to run Windows Server and Windows client, says Directions on Microsoft's Miller. "Why? Those products never offered -- and still don't offer -- License Mobility through Software Assurance," he said.  Microsoft officials note that beginning October 1 "on-premises licenses purchased without Software Assurance and mobility rights cannot be deployed with dedicated hosted cloud services offered by the following public cloud providers: Microsoft, Alibaba, Amazon, and Google. They will be referred to as 'Listed Providers.'" On October 1, customers who already are running Microsoft on-premises software offerings from these listed providers will be able to continue to deploy and use Microsoft enterprise software under their existing licenses. But they won't be able to add workloads or upgrade to a new product version released after October 1 under their existing licenses.


How to implement edge computing

"Networking skills are important at the edge because you need highly skilled people who can make the decisions, such as whether they want to deploy one large network or a series of smaller, specialized networks," said Coufal. "These same network architects need to make decisions about which of their different networks under management should be federated with each other for information exchange and which they want to keep separate. In many cases, business security and information exchange requirements will dictate this." Coufal recommends that organizations take a measured approach when it comes to deploying computing at the edges of their enterprises. "This means pushing out portions of applications to the edges of your company, but not necessarily everything," he said. "You can always plan to scale out later." It's also important to place an emphasis on the security that will be needed at the edge, given that end user personnel, not necessarily IT, will be running and maintaining much of this edge computing. Finally, bandwidth is an issue. If you can place subsets of your data and your applications at the edge, the processing of data, as well as the data that is transmitted from point to point, will be faster.


A New Credential for Healthcare Security Leaders

The Certified Healthcare Information Security Leader - or CHISL - credential was created by the Association of Executives in Healthcare Information Security, a subgroup of the College of Healthcare Information Management Executives. "There are a number of security certification programs, but they are not tailored to the healthcare environment," Marsh says in an interview with Information Security Media Group. The new certification is "sculpted" for healthcare security leaders, he says. In its statement about the new credential, CHIME notes that it's modeled after the organization's Certified Healthcare CIO, or CHCIO, certification program, which is exclusively for healthcare CIOs. To earn the CHISL designation, a security executive will need to pass an exam that tests knowledge of seven domains: organizational vision and strategy; technology proficiency; change management; value assessment and management; service management; talent management; and management of security relationships.


7 trends impacting commercial and industrial IoT data

According to Gartner, within the next four years, 75% of enterprise-generated data will be processed at the edge (versus the cloud), up from less than 10% today. The move to the edge will be driven not only by the vast increase in data, but also by the need for higher-fidelity analysis, lower latency requirements, security issues and huge cost advantages. While the cloud is a good place to store data and train machine learning models, it cannot deliver high-fidelity real-time streaming data analysis. In contrast, edge technology can analyze all raw data, deliver the highest-fidelity analytics and increase the likelihood of detecting anomalies, enabling immediate reaction. A test of success will be the amount of “power” or compute capability that can be achieved in the smallest footprint possible. ... The CEP function should enable real-time, actionable analytics onsite at the industrial edge, with a user experience optimized for fast remediation by operational technology (OT) personnel. It also prepares the data for optimal ML/AI performance, generating the highest-quality predictive insights to drive asset performance and process improvements.
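A minimal sketch of the kind of real-time analysis a CEP function might perform at the industrial edge: a rolling window flags readings that deviate sharply from recent history the moment they arrive, with no round trip to the cloud. The window size, threshold and sample stream are assumptions for illustration.

from collections import deque
from statistics import mean, stdev


def detect_anomalies(stream, window=20, sigmas=3.0):
    """Yield (index, value) for readings far outside the recent rolling window."""
    recent = deque(maxlen=window)
    for i, value in enumerate(stream):
        if len(recent) == window:
            mu, sd = mean(recent), stdev(recent)
            if sd > 0 and abs(value - mu) > sigmas * sd:
                yield i, value
        recent.append(value)


if __name__ == "__main__":
    readings = [10.0 + 0.1 * (i % 5) for i in range(100)]  # stand-in sensor stream
    readings[60] = 25.0  # injected spike
    for idx, val in detect_anomalies(readings):
        print(f"anomaly at sample {idx}: {val}")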


"Think of 5G and network slicing. That's a can of worms!" remarked Dr. Gerhard P. Fettweis, coordinator of Germany's 5G Lab and a professor at Technische Universität Dresden. "How are you going to handle all this from an integrity, privacy, security [standpoint], knowing that your hardware is not going to be fail-proof -- because two years from now, we're going to have four major updates of the system, because we found out somebody could've been malfunctioning the system?" It isn't that AT&T, Verizon, and the successor company to the T-Mobile and Sprint merger have some suppressed, nascent desire to go into competition against Amazon, Microsoft Azure, and Google Cloud. But they may be reselling cloud capacity to companies large and small that could certainly disrupt the cloud providers' best-laid plans. These would include many of the cloud providers' largest enterprise customers, who may be willing to spend premiums on operating their own global, fiber optic cable-linked networks as though they were their own data centers.


Psychometric tests are a key weapon in battle against cyber security breaches

Phishing attacks are less likely to be effective if they are targeted at people with a preference for Sensing. On the other hand, people with these personalities are more likely to take cyber security risks. There is a nuance here: it turns out that the cyber security risk takers are more likely to be people in this group who have a preference for Perceiving and/or Extraversion. As for people with a preference for Feeling or Judging, they are more likely to fall victim to social engineering attacks than those with a preference for Thinking, but they also tend to be more cautious and therefore more rigorous when following cyber security policies. However, the ‘Thinking’ group can over-estimate their own competence, leading to mistakes. The ESET and The Myers-Briggs Company Cyberchology report suggests that psychometric tests can be used to build self-awareness, thereby reducing vulnerability to potential cyber security breaches.


Empathy is a Technical Skill

Archeology and anthropology can give us good metaphors for what it’s like to work with software that we didn’t write ourselves. If you’re attempting to reconstruct someone else’s viewpoint but don’t have direct access to them, you’ll need to rely on two critical components: artifacts and context. The same applies to software. In a legacy system, we often don’t have access to the developers who initially wrote the code, so instead we need to look at what they’ve left behind: their artifacts. Just as pottery, skeletons, coins, foundations of buildings, and writing can help us figure out what someone’s life was like in the distant past, the same principles apply to software. The question to ask as you’re going about your daily work is, "Am I leaving durable evidence of my thinking that will help someone in the future?" That might be someone else after you’ve left for another role, or it could be your future self six months from now, after you’ve forgotten the details of what you were working on.
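As one small, hypothetical example of leaving durable evidence of your thinking in code: the comment below records why the decision was made, not just what the code does, so a future reader can reconstruct the context. The vendor behavior, retry count and referenced document path are invented for illustration.

# Why: the vendor's API (hypothetically) returns intermittent connection errors
# during its nightly maintenance window, so we retry a few times before failing.
# Fuller context lives in docs/vendor-outage-notes.md (hypothetical path).
MAX_RETRIES = 3


def fetch_with_retries(fetch, retries=MAX_RETRIES):
    """Call `fetch` up to `retries` times, re-raising the last connection error."""
    last_error = None
    for _ in range(retries):
        try:
            return fetch()
        except ConnectionError as err:
            last_error = err
    raise last_error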



Quote for the day:


"A simple but powerful rule: always give people more than what they expect to get." -- Nelson Boswel