Daily Tech Digest - July 29, 2020

When ‘quick wins’ in data science add up to a long fail

The nature of the quick win is that it does not require any significant overhaul of business processes. That’s what makes it quick. But a consequence of this is that the quick win will not result in a different way of doing business. People will be doing the same things they’ve always done, but perhaps a little better. For example, suppose Bob has been operating a successful chain of lemonade stands. Bob opens a stand, sells some lemonade, and eventually picks the next location to open. Now suppose that Bob hires a data scientist named Alice. For their quick win project, Alice decides to use data science models to identify the best locations for opening lemonade stands. Alice does a great job, Bob uses her results to choose new locations, and the business sees a healthy boost in profit. What could possibly be the problem? Notice that nothing in the day-to-day operations of the lemonade stands has changed as a result of Alice’s work. Although she’s demonstrated some of the value of data science, an employee of the lemonade stand business wouldn’t necessarily notice any changes. It’s not as if she’s optimized their supply chain, or modified how they interact with customers, or customized the lemonade recipe for specific neighborhoods.


How New Hardware Can Drastically Reduce the Power Consumption of Artificial Intelligence

Currently, AI calculations are mainly performed on graphics processors (GPUs). These processors were not specially designed for this kind of calculation, but their architecture turned out to be well suited for it, and the wide availability of GPUs helped neural networks take off. In recent years, processors have also been developed specifically to accelerate AI calculations (such as Google’s Tensor Processing Units, or TPUs). These processors can perform more calculations per second than GPUs while consuming the same amount of energy. Other systems, on the other hand, use FPGAs, which consume less energy but also compute much more slowly. If you compare the ratio between calculation speed and energy consumption, the ASIC, a competitor of the FPGA, scores best. Figure 1 compares the speed and energy consumption of different components. The ratio between the two, the energy efficiency, is expressed in TOPS/W (tera operations per second per watt, i.e., the number of trillion calculations that can be performed per unit of energy). However, in order to drastically increase energy efficiency from 1 TOPS/W to 10,000 TOPS/W, completely new technology is needed.
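To make the TOPS/W metric concrete, here is a minimal sketch that computes the energy efficiency of a few accelerator types. The throughput and power figures are invented placeholders, not the values from the article’s Figure 1.

```python
# Hypothetical throughput (TOPS) and power-draw (W) figures for illustration;
# the real numbers in the article's Figure 1 will differ.
devices = {
    "GPU":  {"tops": 125.0, "watts": 250.0},
    "TPU":  {"tops": 420.0, "watts": 200.0},
    "FPGA": {"tops": 8.0,   "watts": 10.0},
    "ASIC": {"tops": 100.0, "watts": 30.0},
}

for name, spec in devices.items():
    # TOPS/W: tera-operations per second per watt of power consumed
    efficiency = spec["tops"] / spec["watts"]
    print(f"{name}: {efficiency:.2f} TOPS/W")
```

Dividing throughput by power this way is what makes the 1 TOPS/W to 10,000 TOPS/W gap in the article so stark: a four-order-of-magnitude improvement cannot come from incremental tuning of existing parts.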


Maintaining Business Continuity With Proper IT Infrastructure And Security Tools A Challenge For IT Pros

Business continuity plans are integral to companies’ ability to withstand an unanticipated crisis. Although 86% of companies had a business continuity plan in place prior to COVID-19, 12% of respondents have minimal or no confidence at all in their organization’s plan to withstand an unanticipated crisis, and only 35% of respondents feel very confident in their plan, according to the LogicMonitor study.  IT decision makers also expressed overall reservations about their IT infrastructure’s resilience in the face of a crisis. Globally, only 36% of IT decision makers feel that their infrastructure is very prepared to withstand a crisis. And while a majority of respondents (53%) are at least somewhat prepared to take on an unexpected IT emergency, 11% feel they are minimally prepared or believe their infrastructure will collapse under pressure. 84% of global IT leaders are responsible for ensuring their customers’ digital experience, but nearly two-thirds (61%) do not have high confidence in their ability to do so, according to LogicMonitor’s study. The study further revealed that more than half (54%) of IT leaders experienced initial IT disruptions or outages with their existing software, productivity, or collaboration tools as a result of shifting to remote work in the first half of 2020.


Why blockchain-powered companies should target a niche audience

The main problem lies in the fact that blockchain technology has a vast array of potential applications, so it’s all too easy to adopt an overly broad value proposition. But this creates a lack of clarity and precision, which can, in turn, drive customers away. A generic go-to-market strategy for a tech company often fails to take into account how the adoption of new technology works. This is especially the case with blockchain-powered companies because they usually involve complex partnerships with financial institutions. When marketing a technology company, the focus on the technology’s potential often distracts from a solid Minimum Viable Product (MVP), and so results in a generic go-to-market strategy. It’s a case of what I call the ‘CTO-led startup’, a scenario where the founder may have incredibly deep technological skill and knowledge, but forgets that they need to wear two hats: that of a tech builder, and that of a visionary CEO. Because blockchain is an ‘enabling technology’, the CEO of a blockchain-dependent business can have a near-unlimited vision for the company. So it can feel counterintuitive to zero in on a single laser focus when building the brand, because it superficially seems like a reductive strategy compared to the broader vision.


Why Data Science Isn't an Exact Science

In fact, there are several reasons why data science isn't an exact science, some of which are described below. "When we're doing data science effectively, we're using statistics to model the real world, and it's not clear that the statistical models we develop accurately describe what's going on in the real world," said Ben Moseley, associate professor of operations research at Carnegie Mellon University's Tepper School of Business. "We might define some probability distribution, but it isn't even clear the world acts according to some probability distribution." ... If you lack some of the data you need, then the results will be inaccurate because the data doesn't accurately represent what you're trying to measure. You may be able to get the data from an external source but bear in mind that third-party data may also suffer from quality problems. A current example is COVID-19 data, which is recorded and reported differently by different sources. "If you don't give me good data, it doesn't matter how much of that data you give me. I'm never going to extract what you want out of it," said Moseley.
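Moseley’s point about data that does not represent what you are trying to measure can be shown with a small sketch. The population and the “convenience” sampling scheme below are invented purely for illustration.

```python
import random

random.seed(0)

# A hypothetical population with two subgroups (e.g., two patient cohorts).
population = [random.gauss(50, 5) for _ in range(8000)] + \
             [random.gauss(80, 5) for _ in range(2000)]

true_mean = sum(population) / len(population)

# A biased sample that misses the second subgroup entirely -- the analog of
# COVID-19 data recorded differently (or not at all) by different sources.
biased_sample = random.sample(population[:8000], 500)
biased_mean = sum(biased_sample) / len(biased_sample)

print(f"true mean:   {true_mean:.1f}")   # roughly 56
print(f"biased mean: {biased_mean:.1f}") # roughly 50: no amount of extra
                                         # biased data fixes the gap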


Artificial Intelligence Loses Some Of Its Edginess, But Is Poised To Take Off

“It appears that AI’s early adopter phase is ending; the market is now moving into the ‘early majority’ chapter of this maturing set of technologies,” write Beena Ammanath, David Jarvis and Susanne Hupfer, all with Deloitte, in their most recent analysis of the enterprise AI space. “Early-mover advantage may fade soon. As adoption becomes ubiquitous, AI-powered organizations may have to work harder to maintain an edge over their industry peers.” ... “This could mean that companies are using AI for IT-related applications such as analyzing IT infrastructure for anomalies, automating repetitive maintenance tasks, or guiding the work of technical support teams,” Ammanath and her co-authors note. Tellingly, business functions such as marketing, human resources, legal, and procurement ranked at the bottom of the list of AI-driven functions. An area that needs work is finding or preparing individuals to work with AI systems. Fewer than half of executives (45%) say they have “a high level of skill around integrating AI technology into their existing IT environments,” the survey shows. 



DevOps engineers: Common misconceptions about the role

Rather than planning to evolve the role of DevOps engineer, identify the people within the IT team who have been in development, architecture, system engineering, or operations for a few years but who also have the soft skills needed both to pitch ideas and to deliver on them. DevOps engineers should focus on problem-solving skills and on their ability to increase efficiency, save time, and automate manual processes, and above all, to care about those who use their deliverables. Workplace disruption has happened, and communication has sometimes proven challenging in our virtual world. Projects stalled by this disruption must be restarted. There is already a skills and gender gap within IT. The DevOps engineer must become one of what the DevOps Institute calls “the Humans of DevOps”: engineers who have people skills along with process and technology skills. Learning among team members as well as within the enterprise is paramount, and there has never been a better time to do it. Consider whether a DevOps engineer has the soft skills to facilitate the learning of team members and to continuously transform the team according to the needs of the business.


The Hacker Battle for Home Routers

Trend Micro says that four years after the Mirai botnet, the landscape is more competitive than ever. "Ordinary internet users have no idea that this war is happening inside their own homes and how it is affecting them, which makes this issue all the more concerning," according to a new Trend Micro report, which was co-authored by Stephen Hilt, Fernando Mercês, Mayra Rosario and David Sancho. Botnet code running on a device can diminish bandwidth and cause connectivity problems. If security solutions flag a device as being part of a botnet, certain services may become inaccessible. At worst, if a router is being used as a proxy for crime, the owner of the device could be blamed. With many employees still working from home during the pandemic, there is also a worry about how such infections could affect enterprises. Throughout 2019 and into this year, Trend Micro says, its telemetry detected a rising number of brute-force attempts to infect routers, which involve trying various combinations of login credentials. The company suspects the attempts came from other routers.
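As a sketch of what spotting such brute-force attempts might look like from the defender’s side, the snippet below counts failed logins per source address in a hypothetical router log. The log format, addresses, and threshold are all assumptions for illustration, not anything from Trend Micro’s telemetry.

```python
from collections import Counter

# Hypothetical router auth-log entries: (timestamp, source_ip, result).
log = [
    (1001, "203.0.113.7",  "FAIL"),
    (1002, "203.0.113.7",  "FAIL"),
    (1003, "198.51.100.2", "OK"),
    (1004, "203.0.113.7",  "FAIL"),
    (1005, "203.0.113.7",  "FAIL"),
]

THRESHOLD = 3  # failed attempts before a source is treated as a suspect

failures = Counter(ip for _, ip, result in log if result == "FAIL")
suspects = [ip for ip, count in failures.items() if count >= THRESHOLD]
print(suspects)  # ['203.0.113.7']
```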


The 'magic' of open source: better, faster, cheaper -- and trustworthy

Although open-source software has been available for decades, governments at all levels are seeing the benefits of embracing it to better deliver services to the public, a new report states. Those benefits include improving efficiency, lowering costs, improving trust, increasing transparency and reducing vendor lock-in, according to “Building and Reusing Open Source Tools for Government,” released this month by the think tank New America. What’s more, open source allows for collaboration so that government entities with common problems don’t have to reinvent the wheel to solve them. For instance, the United Kingdom’s Government Digital Service’s Notify communications management platform is available as open source, and the government of Canada adapted it last year to fit its own needs, such as modifying it to support multiple languages, the report states. In California, the Government Operations Agency tasked a team to rethink how residents access information online. One of the 20-odd prototypes developed was used by another team to stand up an unemployment insurance application within Covid19.ca.gov, a website created to provide pandemic information, said Angelica Quirarte.


COVID-19 has disrupted cybersecurity, too – here's how businesses can decrease their risk

To keep enterprises running, businesses must secure remote access and collaboration services, step up anti-phishing efforts and strengthen business continuity. Businesses need to establish a culture of robust cyber hygiene by providing resources to the workforce, managing access, and monitoring activity on critical assets. ... Not all organisations understand their security posture and the effectiveness of their security controls. As a result, they don’t make the right decisions or prioritise the correct actions, which leaves the enterprise open to attack and compromise. Securing end users, data and brand is the next priority. As the number of cybersecurity threats has increased, chief security officers and their teams are also benefiting from an increase in prioritisation. Budget rebalancing will be inevitable as other projects are put on hold to safeguard organisations and invest more in security. Cybersecurity strategists should now think longer term, about the security of their processes and architectures. They should prioritise, adopt and accelerate the execution of critical projects like Zero Trust, Software Defined Security, Secure Access Service Edge (SASE) and Identity and Access Management (IAM), as well as automation, to improve the security of remote users, devices and data.



Quote for the day:

"I am more afraid of an army of one hundred sheep led by a lion than an army of one hundred lions led by a sheep." -- Charles Maurice

Daily Tech Digest - July 28, 2020

The 6 Biggest Technology Trends In Accounting And Finance

When the internet of things, the system of interconnected devices and machines, combines with artificial intelligence, the result is the intelligence of things. These items can communicate and operate without human intervention and offer many advantages for accounting systems and finance professionals. The intelligence of things helps finance professionals track ledgers, transactions, and other records in real time. With the support of artificial intelligence, patterns can be identified and issues resolved quickly.  ... Robots don't have to be physical entities. In accounting and finance, robotic process automation (RPA) can handle repetitive and time-consuming tasks such as document analysis and processing, which are abundant in any accounting department. Freed from these mundane tasks, accountants are able to spend time on strategy and advisory work. Intelligent automation (IA) is capable of mimicking human interaction and can even understand inferred meaning in client communication and adapt to an activity based on historical data. In addition, drones can even be deployed for appraisals and the like.


Transportation takes a leading edge with smart technology

As airports and aircraft become digitally connected through Edge IoT technology, many potential opportunities to improve air travel become an everyday reality. By harnessing Edge technology, 5G, and computer vision, many airlines are now able to drive significant operational efficiency. There are many use cases here, including: visual inspection-based pre-emptive maintenance that reduces downtime and delays, smarter scheduling and runway utilization, and cost-savings through smarter fuel usage. Safety and security can be significantly enhanced through Edge computing. Combining computer vision, computer audition, and analytics at the Edge can facilitate less disruptive and more rigorous safety and security. For example, facial recognition can be employed at smart gates to help tackle crime, and smart technology can be used to improve health screenings at airports. And there is huge potential for improving customer experience. By using Edge computing and smart technologies, the whole passenger journey can be connected and made smoother; from parking and arrival at the airport, through check-in, boarding, and inflight entertainment to arrival and baggage claim.


Attackers Exploiting High-Severity Network Security Flaw, Cisco Warns

The flaw specifically exists in the web services interface of Firepower Threat Defense (FTD) software, which is part of Cisco’s suite of network security and traffic management products; and its Adaptive Security Appliance (ASA) software, the operating system for its family of ASA corporate network security devices. The potential threat surface is vast: Researchers with Rapid7 recently found 85,000 internet-accessible ASA/FTD devices. Worse, 398 of those are spread across 17 percent of the Fortune 500, researchers said. The flaw stems from a lack of proper input validation of URLs in HTTP requests processed by affected devices. Specifically, the flaw allows attackers to conduct directory traversal attacks, an HTTP attack enabling bad actors to access restricted directories and execute commands outside of the web server’s root directory. Soon after patches were released, security researcher Ahmed Aboul-Ela published proof-of-concept (PoC) exploit code for the flaw on Wednesday. A potential attacker can view sensitive files within the web services file system, which may contain information such as WebVPN configuration, bookmarks, web cookies, partial web content and HTTP URLs.
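The standard defense against directory traversal is strict input validation: canonicalize the requested path and confirm it still lies inside the web root before serving it. The sketch below illustrates that idea in Python; it is not Cisco’s actual fix, and the web root and paths are hypothetical.

```python
from pathlib import Path

WEB_ROOT = Path("/var/www/html").resolve()

def safe_resolve(requested: str) -> Path:
    """Canonicalize a requested path and reject anything escaping the web root."""
    candidate = (WEB_ROOT / requested.lstrip("/")).resolve()
    if candidate != WEB_ROOT and WEB_ROOT not in candidate.parents:
        raise PermissionError(f"directory traversal attempt: {requested!r}")
    return candidate

print(safe_resolve("index.html"))        # /var/www/html/index.html
try:
    safe_resolve("../../etc/passwd")     # the classic traversal payload
except PermissionError as err:
    print(err)
```

The key step is resolving the path first, so that `..` sequences are collapsed before the containment check runs; checking the raw string alone is easy to bypass.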


Scrum’s Nature: It Is a Tool; It Is Not About Love or Hate 

The question then is: Why would I “hate” a tool unsuited for the intended purpose or applied incompetently? Would I hate a hammer for not being capable of accurately driving a screw into a wooden beam? Probably not, as the hammer wasn’t designed for that purpose, and neither sheer will-power nor stamping your feet will change that fact. ... The job of the Scrum Master is hence to support the Scrum team by removing impediments (problems the team members cannot solve by themselves), thus supporting this decentralized leadership approach. Moreover, those impediments are mostly situated at an organizational level. Here, change does not happen by simply “getting things done,” but by working with other stakeholders and their plans, agendas, objectives, etc. ... Agile software development is not about solving (code) puzzles all day long. As part of creating new products in complex environments, it is first of all about identifying which problems are worth solving from a customer perspective. Once that is established, and Scrum’s empirical approach has proven to be supportive in that respect, we strive to solve these puzzles with as little code as possible.


Dave: Mobile Banking App Breach Exposes 3 Million Accounts

Dave says the breach traces to the Waydev analytics platform for engineering teams that it formerly used. "As the result of a breach at Waydev, one of Dave's former third-party service providers, a malicious party recently gained unauthorized access to certain user data at Dave, including user passwords that were stored in hashed form using bcrypt, an industry-recognized hashing algorithm," Dave says in its Saturday data breach notification. Waydev, which is based in San Francisco, first warned on July 2 that its service may have been breached. "We learned from one of our trial environment users about an unauthorized use of their GitHub OAuth token," Waydev says in a data breach notification posted on its site that details security measures it recommends all users take. "The security of your data is our highest priority. Therefore, as a precautionary measure to protect your account, we revoked all GitHub OAuth tokens." Beyond that notice, "we notified the potentially affected users" directly, Waydev's Mike Dums tells Information Security Media Group. The company says that it immediately hired a third-party cybersecurity firm, Bit Sentinel, to help investigate the intrusion and lock down its environment, and that it has now fixed the vulnerability exploited by the attackers.
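Storing passwords “in hashed form using bcrypt,” as Dave describes, means leaked records do not directly expose plaintext passwords: each hash is salted and deliberately slow to compute. A minimal sketch using the third-party bcrypt package follows; the package choice and password values are assumptions for illustration, not details of Dave’s actual stack.

```python
import bcrypt  # third-party package: pip install bcrypt

password = b"correct horse battery staple"

# Store only the salted hash, never the plaintext.
hashed = bcrypt.hashpw(password, bcrypt.gensalt())

# Verification re-hashes the candidate with the salt embedded in the hash.
print(bcrypt.checkpw(password, hashed))     # True
print(bcrypt.checkpw(b"guess123", hashed))  # False
```

The work factor built into bcrypt is what makes mass offline cracking of a stolen hash dump expensive, which is exactly why breach notifications highlight it.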


Intelligent ways to tackle cyber attack

Absalom recommends that security practitioners balance the need for human oversight with the confidence to allow AI-supported controls to act autonomously and effectively. He says: “Such confidence will take time to develop, just as it will take time for practitioners to learn how best to work with intelligent systems.”  Given time to develop and learn together, Absalom believes the combination of human and artificial intelligence should become a valuable component of an organisation’s cyber defences. As Morris points out, fraud management, SIEM, network traffic detection and endpoint detection all make use of learning algorithms to identify suspicious activity – based on previous usage data and shared pattern recognition – to establish “normal” patterns of use and flag outliers as potentially posing a risk to the organisation. For companies with a relatively small and/or simple IT infrastructure, Wenham argues that the cost of an AI-enabled SIEM would probably be prohibitive while offering little or no advantage when coupled with good security hygiene. On the other hand, for an enterprise with a large and complex IT infrastructure, Wenham says the cost of an AI-enabled SIEM might well be justified. 
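The “establish normal patterns and flag outliers” approach Morris describes can be sketched with even the simplest statistics. The login counts and the three-sigma threshold below are invented for illustration and are far cruder than the learning algorithms a production SIEM would use.

```python
import statistics

# Hypothetical daily login counts for one user; the last value is unusual.
logins = [22, 19, 25, 21, 23, 20, 24, 22, 21, 97]

baseline = logins[:-1]                 # history used to learn "normal"
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

for day, count in enumerate(logins, start=1):
    z = (count - mean) / stdev
    if abs(z) > 3:  # more than 3 standard deviations from normal usage
        print(f"day {day}: {count} logins (z={z:.1f}) -- flag for review")
```

Real systems replace the z-score with models learned from shared pattern recognition across many signals, but the shape of the decision is the same: baseline first, then flag deviations.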


Are newer medical IoT devices less secure than old ones?

Mularski does concede that some particularly vulnerable old devices are often more isolated on the network by design, in part because they’re more recognizable as vulnerable assets. Windows 95-vintage x-ray machines, for example, are easy to spot as a potential target for a bad actor. “For the most part, I think most of the hospital environments, they do a good job at recognizing that they have these old devices, and the ones that are more vulnerable,” he said. This underlines a point most experts agree on: simple awareness of the potential security flaws on a given network is central to securing healthcare networks. Greg Murphy is the CEO of Ordr, a network visibility and security startup based in Santa Clara. He said that both Mularski and Staynings have points in their favor. “Anyone who minimizes the issue of legacy devices needs to walk a mile in the shoes of the biomedical engineering department at a hospital,” he said. “[But] on the flip side, new devices that are being connected to the network have huge vulnerabilities themselves. Many manufacturers themselves don’t know what vulnerabilities their devices have.”


The Opportunity in App Modernization

Domain-Driven Design and modeling techniques like SWIFT, Wardley Maps and the Bounded Context Canvas have provided the technical and business heuristics for carving out microservices. There is, however, an emerging backlash against the complexity of microservices and an impetus to move towards simpler deployment architectures like modular monoliths. See To Microservices and Back Again. There are significant gaps that libraries and frameworks can fill by driving a backlog of stories and implementation from event storming or monolith decomposition. Generating a backlog of user stories and epics from event storming is a work of art and requires heavy facilitation because DDD Is Broken. Dividing business capabilities into sub-capabilities is tough, and candidate microservices need expert judgment before implementation. Observability tools and frameworks that help capture an existing application’s runtime metadata, paired with a profiler, theoretically have the information needed to recommend starting points for decomposing monoliths. A tool that has started to look at this problem is vFunction.


Ten ‘antipatterns’ that are derailing technology transformations

One of the biggest sources of impact in technology transformations comes from simplifying the path to production: the steps involved from defining requirements to releasing software and using it, with disciplined repetition across teams. This requires a lot of organizational and executive patience, as the impacted teams (app development, operations, security, support) can take weeks or months to perfect this coordinated dance. Tools and architecture changes can help, but to be effective, they need to be paired with changes to engineering practices, processes, and behaviors. Launching programs for large architecture and tooling changes often requires minimal effort, catches the executive's and board’s fancy, and signals that things are moving. However, in our experience, without changes to engineering practices, processes, and behaviors, such programs have minimal or no impact. ... After months of futile top-down incentives and nudges for tools adoption, the bank refocused on how the tools enabled a new set of engineering practices and collaboration between teams. It showed how the new tools could simplify the path to production.


Is Robotic Process Automation As Promising As It Looks?

RPA works best when application interfaces are static, procedures don’t change, and data patterns stay stable, a mix that is increasingly uncommon in today’s dynamic, digital landscape. The issue with RPA, in any case, isn’t that the tools aren’t clever enough. Rather, its main challenge is robustness: handling sudden, unexpected changes in the IT world. Adding cognitive abilities to RPA doesn’t resolve these robustness issues; you essentially end up with more intelligent technology that is still just as brittle as it was before. RPA is still at an early stage of development, so it can introduce difficulties that may bring about undesirable results. Consequently, it is difficult for organizations to decide whether they ought to invest their resources in robotic automation now or wait until the technology matures. A comprehensive business case must be developed before implementing this technology; otherwise the effort will be futile if returns are only marginal, which may not be worth the risk. RPA is equipped to deal with specific tasks and assignments, but it isn’t designed to handle end-to-end processes. Therefore, it seems reasonable to believe that, combined with other more specialized tools, it can drive better outcomes.



Quote for the day:

"Leaders must encourage their organizations to dance to forms of music yet to be heard." -- Warren G. Bennis

Daily Tech Digest - July 27, 2020

DevOps: 5 things teams need from CIOs

To keep up with the pace of software and app releases, your developers and product teams need the ability to automate different test scenarios quickly, continuously, and in real time. Your teams do not have weeks and months to test, analyze, and update code before a new release. Investing in the tools needed to migrate to more modern platforms gives teams the flexibility to meet demand. As convenient and trusted as legacy systems are, if you are serious about DevOps, updating your legacy systems and architecture should be a primary focus. This is especially important as technologies like artificial intelligence, augmented reality, and virtual reality gain momentum and popularity. When planning budgets for the next year, consider designating resources to replace these legacy systems. ... Ensure that each team works well on its own before you have teams work together. For different teams to work together successfully, the individuals on each team must be able to work with each other. Make sure that development personnel attend all relevant meetings and discussions with operations/IT teams, and vice versa. Listen. Concentrate on what your team members are communicating. Be mindful; do not take a passive approach or focus only on your response.


The 4 essential pillars of cloud security

One of the key constructs of zero-trust computing is continuous improvement. An effective cloud security solution should enable ongoing insight into the entire cloud environment, thereby creating the opportunity for ongoing improvement. ... The second pillar involves providing security for end systems, managed services or different workloads running inside the cloud, commonly called platform as a service. This compute-level security has two key components. First is automated vulnerability management, which identifies and prevents vulnerabilities across the entire application lifecycle while prioritizing risk for cloud-native environments. ... Protecting the network is traditionally integral to on-premises environments but is equally important for the cloud. There are two major components of network protection. One is microsegmentation, a method of creating zones to isolate workloads from one another and secure them individually. This is at the heart of zero trust. By putting up roadblocks between applications and workloads, microsegmentation makes it much more difficult for would-be attackers to move laterally from one infected host to another. The method employs containerization (of the app and its operating environment) and segmentation of the application itself in order to minimize any damage.
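A toy way to picture microsegmentation is a default-deny policy table between workload zones: nothing moves laterally unless a flow is explicitly allowed. The zones, ports, and policy below are hypothetical, and real enforcement happens in the network fabric or a policy engine rather than in application code.

```python
# Hypothetical default-deny policy: only listed (source zone,
# destination zone, port) flows are permitted; everything else is dropped.
ALLOWED_FLOWS = {
    ("web", "app", 8080),   # web tier may call the app tier
    ("app", "db", 5432),    # app tier may query the database
}

def is_allowed(src_zone: str, dst_zone: str, port: int) -> bool:
    return (src_zone, dst_zone, port) in ALLOWED_FLOWS

print(is_allowed("web", "app", 8080))  # True: an approved flow
print(is_allowed("web", "db", 5432))   # False: lateral movement blocked
```

The value of the approach is visible in the second check: even if the web tier is compromised, it has no permitted path to the database.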


Microsoft told employees to work from home. One consequence was brutal

Perhaps, you might say, no one's really working any harder then. Yet when you're in an office, don't you also take time out to go for a walk (and scream at your boss), have a peaceful lunch (and scream at your boss), call your cable provider (and scream at customer service) or merely stare into space (and scream at the absurdity of existence)? The problem -- and for some bosses, great delight -- of modern technology is that it makes you believe employees are available any time, any place, anywhere. And really, how many humans are at their best earlier than they're used to or later than they'd prefer? Please, I'll get to the happier elements of this research shortly. But when working from home, Microsoft's employees apparently spent 10 percent more time in meetings. So, let's see: your work hours have expanded and you're spending more time in meetings. Where's the hope? Well, the researchers muse that there needed to be more meetings because there wasn't the opportunity for chance encounters. You know, in corridors and restrooms. And they believe hope lies in the fact that individual meeting times were shorter.


How to Build a Security Culture

Content is one of the biggest mistakes made in security awareness training. If your content is weak, boring, unrelatable, or filled with legal language, no one will pay attention. However great your intentions, you have to understand that dry paragraphs of plain text about hackers will not bring about a behavior change. As we learned before, to create a culture you have to drive influence. And to drive influence, you need support. Just sending out an email once a month or once a quarter, or hanging up a poster that says ‘don’t get phished’, will do nothing to make an impact. In order to create a security culture shift, you need to understand what drives change. Change is not easy, and when it comes to employees changing their behavior, you face many barriers. Change requires taking an established habit, associating that habit with negative behavior, and then instilling a new habit with a desired, positive outcome. Essentially, employees must know why something they are doing is wrong and learn how to change the negative habit they’ve been demonstrating. So now that we have covered the challenges of creating a culture of security, how do we actually create one ourselves?


Use cases for blockchain in healthcare

One major issue present within healthcare is the production of counterfeit prescription drugs. The World Health Organisation (WHO) has estimated that one in 10 medical products in low- and middle-income countries is forged or substandard. Companies such as Quant aim to solve this issue using smart contracts and interoperability between blockchains to cut out middlemen and increase efficiency. “Data from embedded identification markers used to track individual products and components can be recorded onto distributed ledger technology (DLT) to provide a single source of truth with full transparency, accuracy, and accountability at every stage in the supply chain,” explained Gilbert Verdian, founder and CEO of Quant. “This is achieved through the shared nature of the ledger and the immutability that it offers, and with the data available to all participants, this solution has the potential to eliminate the need for intermediaries – and hence, opportunistic criminals – abusing the system. “The impact of such an approach would be dramatic. In fact, according to a new report by the market intelligence company BIS Research, blockchain-based supply chains would reduce revenue loss to pharmaceutical companies by up to $43 billion annually, as well as benefit others who inadvertently purchase counterfeit drugs.”
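The immutability Verdian refers to comes from each ledger entry being cryptographically linked to its predecessor, so a retroactive edit breaks the chain of hashes. The toy hash chain below illustrates only that principle; it is not Quant’s technology or a real distributed ledger, and the batch records are invented.

```python
import hashlib
import json

def add_block(chain, record):
    """Append a record whose hash covers both its contents and the previous hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps(record, sort_keys=True) + prev_hash
    chain.append({"record": record, "prev": prev_hash,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

chain = []
add_block(chain, {"batch": "RX-1042", "event": "manufactured", "site": "Plant A"})
add_block(chain, {"batch": "RX-1042", "event": "shipped", "carrier": "Acme"})

# Tampering with an earlier record no longer matches its stored hash.
chain[0]["record"]["site"] = "Unknown"
recomputed = hashlib.sha256(
    (json.dumps(chain[0]["record"], sort_keys=True) + chain[0]["prev"]).encode()
).hexdigest()
print(recomputed == chain[0]["hash"])  # False: the edit is detectable
```

In a real DLT the same detection is strengthened by every participant holding a copy, which is what removes the need for a trusted intermediary.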


Data scientists are used to making up the rules. Now they're getting some of their own to follow.

Many, if not most, technology-oriented organizations already have ethical standards of some sort, which were developed to ensure that innovation is designed responsibly within their own ranks. The BCS, for example, asks practitioners to sign up to a code of conduct, which determines among other principles that IT workers should act in the public interest, with integrity, competence and diligence; and that they should never take on a task that they don't have the skills to complete. Similarly, the RSS's code of conduct upholds acting in the public interest, fulfilling obligations to employers and clients, and showing competence and integrity. And the RAEng is governed by principles of openness, fairness, respect for the law, accuracy and rigor. Even big tech has jumped on the bandwagon, with Google committing to responsible technology and Microsoft drafting guidelines for 'ethical and trustworthy AI', to name but two.  But while organizations have been pulling together ethics committees and writing up white papers on the rules that should govern the use of data, not much was done at the individual level. Yet the source of all technology is the brain of those who come up with new ideas. 


Cybersecurity for a Remote Workforce

Start with stopgap measures that can be implemented immediately, such as revising existing cyber risk guidelines, requirements, and controls on how employees access data and communicate with a company’s network. Behavior-analytics rules need to be adjusted to account for changes to the “normal” behavior of employees, many of whom now work outside standard business hours, so that security teams can focus investigations effectively. Then examine new security tools and requirements for sharing and maintaining private information with vendors. For example, organizations may need to adopt more robust data loss controls, traffic analysis tools, and access restrictions. Ensure that vendors that aren’t currently prepared for heightened cyberattack risk commit to developing cyber preparedness plans to safely handle information or interact with your corporate network. Review changes to boost your technology and security infrastructure today, even if such changes may take years to implement. Some organizations may want to speed up their cloud strategies so that their IT resources can rapidly meet demand spikes from large-scale remote work.


Digital transformation: 8 ways to spot your organization's rising leaders

The best digital transformation leaders know what the biggest pain points are inside the organization, says Lyke-Ho-Gland – and they create a digital roadmap addressing those points that the larger organization will get behind. ... “Outcome-focused leaders understand the need to drive that focus, assess any midcourse requests against the program commitments, and communicate relentlessly to reinforce expectations of sponsors.” They understand, measure, and report on both qualitative and quantitative benefits and make sure all project actions are structured to deliver those outcomes. ... “The most successful DT leaders can compellingly market those solutions to business stakeholders so that they adopt the new tools and ways of working,” says Lauren Trees, who heads up APQC’s Knowledge Management research group. ISG’s Hall describes one successful CIO he worked with as the best salesperson in the organization: “He had implemented all of the company’s products within IT (eat your own cooking) and talked to prospects daily on the challenges he was able to overcome with the product suite,” Hall recalls.


Block/Allow: The Changing Face of Hacker Linguistics

The most recent wave of changes demonstrates that more, and more powerful, tech organizations take watching their language as a serious concern, even though the history of the terms predates their use in computing, says Christina Dunbar-Hester, an associate professor of communication at the University of Southern California and the author of "Hacking Diversity: The Politics of Inclusion in Open Technology Cultures." "Language is symbolic and powerful but can also feel superficial. Certainly in the moment we're in, some people are asking to abolish the police, not to change unfortunate computer terms," she says. "But Black Lives Matter and the current moment gives people the ammunition to say that language does matter." However, there's a difference between changing word choices in documentation and getting people to change the words they use on a daily basis. Convincing developers, hackers, and other professionals to switch to more inclusive language has been a long struggle that predates the current norms. Tech has long faced a serious imbalance in how it pays and promotes white men more than women and black, indigenous, and people of color.


Data governance and context for evidence-based medicine: Transparency and bias in COVID-19 times

A number of people, including Cochrane excommunicate Peter Gøtzsche, argue that there can be a lot of bias in RCTs. This has largely to do with the fact that the vast majority of RCT data come from pharmaceutical companies, creating a conflict of interest. If aggregators like Cochrane do not validate the raw data they offer access to, they may be whitewashing them. Case in point: Surgisphere. What was initially referred to as the most influential COVID-19-related research to date was called into question as a result of a lack of transparency regarding the origin and trustworthiness of its data. The research used data sourced from Surgisphere, a startup claiming to operate as a data broker providing access to data from hospitals worldwide. However, whether that data is veracious or was acquired transparently is not clear. As a result, the research findings were put into question, and related decisions made by the WHO were reverted. Scales' opinion is that researchers have a responsibility to verify the source of the data they use. ... Over-reliance on RCTs may be part of the problem. RCTs can be enormous multi-year undertakings, summarized in what's often an eight-page journal article. Many important details and potential biases are left out. 



Quote for the day:

"Leadership means forming a team and working toward common objectives that are tied to time, metrics, and resources." -- Russel Honore

Daily Tech Digest - July 26, 2020

Researchers develop new learning algorithm to boost AI efficiency

A working group led by computer scientists Wolfgang Maass and Robert Legenstein of TU Graz has adopted this principle in the development of the new machine learning algorithm e-prop (short for e-propagation). Researchers at the Institute of Theoretical Computer Science, which is also part of the European lighthouse project Human Brain Project, use spikes in their model for communication between neurons in an artificial neural network. The spikes only become active when they are needed for information processing in the network. Learning is a particular challenge for such less active networks, since longer observation is required to determine which neuron connections improve network performance. Previous methods achieved too little learning success or required enormous storage space. E-prop now solves this problem by means of a decentralized method copied from the brain, in which each neuron documents when its connections were used in a so-called e-trace (eligibility trace). The method is roughly as powerful as the best and most elaborate other known learning methods.
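The core idea of an eligibility trace can be shown in a few lines: each connection keeps a decaying memory of its recent activity, and a learning signal arriving later credits most strongly the connections whose traces are still high. This is a toy sketch of the principle only, not the actual e-prop equations from the TU Graz paper; the decay rate, spike train, and learning signal are invented.

```python
# Toy eligibility-trace sketch: each synapse remembers (with decay) when
# its presynaptic neuron spiked; a later learning signal updates weights
# in proportion to how "eligible" each connection still is.
DECAY = 0.9
LEARNING_RATE = 0.1

trace = [0.0, 0.0, 0.0]          # one trace per input connection
weights = [0.5, 0.5, 0.5]

spike_train = [                   # which inputs spiked at each time step
    [1, 0, 0],
    [1, 0, 1],
    [0, 0, 1],
]

for spikes in spike_train:
    trace = [DECAY * e + s for e, s in zip(trace, spikes)]

learning_signal = 1.0             # broadcast error signal arriving afterwards
weights = [w + LEARNING_RATE * learning_signal * e
           for w, e in zip(weights, trace)]
print([round(w, 3) for w in weights])  # inactive connection stays at 0.5
```

Because each neuron only stores its own traces, no global history of network activity needs to be kept, which is what saves the storage the article mentions.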


Data Leadership Book Review and Interview

The Data Leadership Framework is first about acknowledging that there is a whole bunch of stuff an organization needs to do to make the most of data. The five DLF Categories are where we evaluate an organization’s data capabilities and figure out where they are struggling most amid the complexity. The twenty-five DLF Disciplines are where we then focus energy (i.e., invest our limited resources) to achieve the biggest outcomes. By creating relative balance across the DLF Categories, we maximize the overall impact of our data efforts. This is what we need to be doing all the time with data, but without something like the Data Leadership Framework, the problems can feel overwhelming and people have trouble figuring out where to start, or what to do next. This is true of everybody, from data architects and developers to the CEO. If we can use the Data Leadership Framework to make sense amidst the chaos, the individual steps themselves are much less daunting. Data competency is no longer a “nice-to-have” item. From data breaches to analytics-driven disruptors in every industry, this is as big of a deal to businesses as cash flow.


Enterprise Architecture for Managing Information Technology Standards

While globalization is excellent for business, as it extends opportunities to markets that were previously closed and permits the sharing of ideas and information across different platforms, it can threaten the budgetary plans of SMBs. Investments in licensing, infrastructure, and global solutions in general hit this segment hard. Lack of Talent Pool: This problem is primarily limited to the technology segment. Around half of employees lack the critical thinking skills that would qualify them to grow further in this field. The most significant hurdle IT teams have faced so far is having members who lack the skills to put a general hardware and software security environment in place cost-effectively. IT Policy Compliance Failure: Specific technologies used by IT projects don’t comply with the policy rules defined by their departments. IT departments are sometimes unaware of technologies used by their teams and business stakeholders, increasing the risk of uncontrolled data flows and non-compliance. Besides, these technologies are sometimes incompatible with the existing portfolio. This increases IT debt, especially if technology standards are not enforced.


IoT Architecture: Topology and Edge Compute Considerations

Network engineers often have experience with a particular topology and may assume it can be used in any setting, but sometimes another choice would be more optimal for a different use case. To determine whether a mesh networking topology is a good choice for your application, it is important to understand the pros and cons of this strategy. A critical factor to analyze is your system's timing requirements. Mesh networking topologies route data from node to node across a network that is architected as a mesh, so the "hops" need to be accounted for because of the latency they add. Do you need the data back in 100 ms, or can you live with once a second? ... Wireless point-to-point (PTP) and point-to-multipoint (PTMP) are topologies used for connectivity in a wide range of applications, such as use cases where you want to replace cables with wireless communication. These protocols communicate between two devices (point-to-point) or from one device to many (point-to-multipoint). There are a few factors to consider, such as distance, timing and battery power, that may indicate whether a PTP network is needed versus a mesh network.
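A quick way to sanity-check the timing question ("100 ms or once a second?") is to multiply expected hop count by per-hop forwarding delay. The 15 ms per hop and the hop counts below are assumed figures for illustration; real per-hop latency depends on the radio protocol and duty cycling.

```python
def mesh_latency_ms(hops: int, per_hop_ms: float) -> float:
    """Rough end-to-end latency: each mesh hop adds forwarding delay."""
    return hops * per_hop_ms

BUDGET_MS = 100  # e.g. "need the data back in 100 ms"

for hops in (2, 5, 12):
    latency = mesh_latency_ms(hops, per_hop_ms=15)  # assumed 15 ms per hop
    verdict = "fits" if latency <= BUDGET_MS else "exceeds"
    print(f"{hops} hops -> {latency:.0f} ms ({verdict} the {BUDGET_MS} ms budget)")
```

Even this back-of-the-envelope check shows why a deep mesh can be fine for once-a-second telemetry yet unusable for a tight control loop, while a single PTP link sidesteps hop latency entirely.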


An introduction to confidential edge computing for IoT security

Recent attacks, even outside of IoT, showed that hackers exploited weak configurations of public cloud services to access sensitive data. The reason hackers succeeded in obtaining sensitive information stored on a public cloud had nothing to do with the security mechanisms implemented by the cloud provider; rather, it was the result of little mistakes made by the end users, typically in the Web Application Firewall (WAF) that controls access to the cloud network, or by leaving credentials unprotected. These little mistakes are almost inevitable for companies that have a cloud-only infrastructure. However, demarcating sensitive and non-sensitive information could help their IT teams set up cloud services to achieve safer security practices. Those mistakes emphasize the need for broader security expertise aimed at defining the security architecture to be enforced on the overall system and at finding out whether the security features of the cloud provider need to be complemented by additional protection mechanisms. A first logical step consists of demarcating sensitive and non-sensitive information, to help the IT team establish appropriate priorities.


How IoT Devices are Rapidly Revolutionizing the World of Small Businesses

Small business owners may want to take some time to look through a list of the top IoT software rankings before they decide on a single platform. It can be difficult to migrate to another one after your firm has become heavily invested in a certain type of technology. This is especially true for those who plan to primarily use consumer-grade equipment that often goes through various revisions as market pressures force engineers to redesign certain aspects of their builds. Keep in mind that all Internet of Things devices include some sort of embedded general-purpose computer. This means that each piece of smart equipment is free to share information collected from onboard peripherals, which makes it easy to learn more about how different circumstances impact your business. Think of a hotel or restaurant that has multiple rooms, each with an adjustable thermostat. If some of them are set too high or too low, then the business in question may end up losing thousands by using too much energy. A number of service providers in the hospitality industry now use IoT software to monitor energy usage throughout entire buildings.
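Because each smart thermostat is an embedded computer that reports its state, flagging wasteful settings across a building can be a simple aggregation job. The readings, room numbers, and comfort band below are invented placeholders, not data from any real IoT platform.

```python
# Hypothetical readings reported by networked thermostats in one building.
readings = [
    {"room": "101", "setpoint_c": 21.0},
    {"room": "102", "setpoint_c": 20.5},
    {"room": "103", "setpoint_c": 27.5},  # set far too high
    {"room": "104", "setpoint_c": 21.5},
]

COMFORT_LOW, COMFORT_HIGH = 19.0, 23.0  # assumed acceptable band

for r in readings:
    if not (COMFORT_LOW <= r["setpoint_c"] <= COMFORT_HIGH):
        print(f"room {r['room']}: setpoint {r['setpoint_c']} C outside "
              f"{COMFORT_LOW}-{COMFORT_HIGH} C -- review for wasted energy")
```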


The Journey to Effective Data Management in HPC

High Performance Computing (HPC) continues to be a major resource investment of research organizations worldwide. Large datasets are used and generated by HPC, and these make data management a key component of effectively using the expensive resources that underlie HPC infrastructure. Despite this critical element of HPC, many organizations do not have a data management plan in place. As an example of data generation rates, the total storage footprint worldwide from DNA sequencing alone is estimated to exceed 2 exabytes by 2025, most of which will be processed and stored in an HPC environment. This growth rate puts an immense strain on life science organizations. But it is not only big data from the life sciences that is stressing HPC infrastructure: research institutions like Lawrence Livermore National Laboratory (LLNL) generate 30 TB of data a day. This data serves to support their research and development efforts applied to national security, and these daily data volumes can also be expected to increase. As the HPC community continues to generate massive amounts of file data, drawing insights, making that data useful, and protecting the data becomes a considerable effort with major implications.


Taking A Deep Look at DLT (Distributed Ledger Technology)

A great deal of effort and investment is continuously going into mitigating blockchain’s scalability issues. One of the headline motivations for this work is to improve the user experience on blockchain networks so they can accommodate a diverse range of concurrent activity without compromising any of the blockchain's core properties. When this is achieved, blockchain architects and companies will have a more comprehensive suite of blockchain tools to meet new and growing needs in the market. For a long time blockchain has been unfairly subjected to pessimistic scrutiny that undermines its value. Unfair in the sense that blockchain is brilliant, revolutionary and still young. But then again, nothing exists in a vacuum totally free from pessimistic sentiment. Everything in existence has some criticism attached to it. Even so, blockchain is resilient. It is here for good, and so is DLT. If you look at DLT you will see that many DLT-based start-ups offer business-to-business solutions. Distributed ledgers are well suited to enterprises because they address multiple infrastructural issues that plague industries. One of them is databases. Given how disparate and complex organizations have grown, legacy databases have fallen victim to inefficiencies and security loopholes.


Adapting online security to the ways we work, remotely and post-coronavirus

Not only were many companies unprepared for the mass transition to remote work, but they were also caught off guard by the added technology and security needs. According to CNBC, 53 senior technology executives say their firms have never stress-tested their systems for a crisis like this. For example, when employees are working from the office, it is easier for IT teams to identify threat agents attempting to break into systems, since the attackers' locations stand out as removed from those offices. With employees dispersed at their homes, however, such foreign breaches are harder to recognize. Companies have also been caught flat-footed during this crisis by relying on employees to use their personal devices instead of providing separate work devices, which prevents IT teams from identifying suspicious activity. To keep employee and company information secure, it is up to the CISO and IT decision-makers to create and strictly enforce a regular practice for accessing, editing and storing data. Most employees value productivity over security. This is problematic: employees gravitate towards the tools and technology they prefer to get their work done effectively.


Is Your Approach to Data Protection More Expensive than Useful?

Now more than ever, data is the lifeblood of an organization – and any incidence of data loss or application unavailability can take a significant toll on that business. With the recent rise in cyberattacks and exponential data growth, protecting data has become job #1 for many IT organizations. Their biggest hurdle: managing aging infrastructure with limited resources. Tight budgets should not discourage business leaders from modernizing data protection. Organizations that hang onto older backup technology don’t have the tools they need to face today’s threats. Rigid, siloed infrastructures aren’t agile or scalable enough to keep up with fluctuations in data requirements, and they are based on an equally rigid backup approach. Traditional backup systems behave like insurance policies, locking data away until you need it. That’s like having an extra car battery in the garage, waiting for a possible crisis. The backup battery might seem like a reasonable preventive measure, but most of the time it’s a waste of space, and if the crisis never arises it’s an unnecessary upfront investment, more expensive than useful. In the age of COVID-19 where cash is king and onsite resources are particularly limited, some IT departments are postponing data protection modernization, looking to simplify overall operations and lower infrastructure cost first.



Quote for the day:

"Do what you can, where you are, with what you have." -- Teddy Roosevelt

Daily Tech Digest - July 25, 2020

Jump Over the Hidden Barriers to Digital Transformation

A key reason for the failures: a lack of clear, measurable business goals. Right now, no matter the business, eyes everywhere are on IT. To push through the transformation, you need to benefit not just yourself but your organization. You need to be able to speak the language of business, and here’s how you can do it. If you’re going to ask the company to support your program, talk about it in a way the C-suite can understand. Think about the use case they’ll be most interested to hear about. Shortening customer wait times? They’ll care. Aiding supply chains? They’ll listen. Remember, they’ll want numbers, too. The C-suite will do a cost-benefit analysis, but you should preempt this by conducting one beforehand. This shows your commitment to making the company more successful and inspires trust in the project through data. Avoid using too much technical jargon; doing so will get glassy-eyed looks and perhaps inspire them to give some other department a budget increase or greenlight a different project. Never make the C-suite do your work for you: Make them understand the value of your plans. Doing so may develop your soft skills.


6 Insights for Dynamic Leadership

Too often we get caught up in the inequity of having a problem. It’s unfair that we should suffer this or “that should have never happened in the first place!” What we need to focus on is how to safely address the problem, how to move forward, around, over or under it to get where we need to go. ... Focusing on “being decisive” misses the lesson. Start thinking about the information — what you have, what you need and whether you can wait for more. The goal is be able to reflect upon decisions and know that you would make the same one again tomorrow — even if they were wrong. ... If it’s not someone’s job, it’s no one’s job. And so, we’re taught to identify a specific person to carry out a task in order to get the task done. When it can be anyone’s job to cook dinner, you’ll be hangry by 7:00. But when you split the cooking every other day, you will go hungry a lot less often. As meetings end, tasks should be given to people and as specifically as possible with closed-loop communication built into the responsibility. ... Control yourself — it’s pretty much the only thing you can control. It’s the doctor whose face says “Everything’s cool — I got this.” It’s the pilot’s smooth, buttery drawl that announces the severe turbulence over the PA. It’s the expert in the room defusing an insane customer or devolving employee.


Management lessons of The True Believer

The True Believer is no less relevant today than when it was first published, despite its pessimistic view of human nature and skepticism toward mass movements. Several elements are of particular interest to business leaders. For example, Hoffer recognizes that “the chief passion” of the frustrated is to belong, and urges employers to cultivate “a vivid feeling of solidarity” in employees through collective pay schemes and other means, as teamwork boosts productivity: “Any policy that disturbs and tears apart the team is bound to cause severe trouble.” Another important source of belonging is the family. Dangerous mass movements, in Hoffer’s view, tend to undermine and be jealous of the family, which is yet another reason firms ought to be friendlier toward that beleaguered institution. Humans find passionate causes seductive, Hoffer knows, but he seems not to subscribe to any cultlike conception of a business. “The practical organization offers opportunities for self-advancement, and its appeal is mainly to self-interest,” he writes, adding that: “Where self-advancement cannot, or is not allowed to, serve as a driving force, other sources of enthusiasm have to be found if momentous changes, such as the awakening and renovation of a stagnant society or radical reforms in the character and pattern of life of a community, are to be realized and perpetuated.”


Microsoft Office the most targeted platform to carry out attacks

Researchers said that ... hacking browsers has become more expensive, as browser security has improved. “Browser developers put much effort into different kinds of security protections and mitigations,” Liskin said. “Attackers were looking for a new target, and MS Office has become a star.” Liskin added that there are plenty of reasons why cybercriminals choose to attack the popular suite. “Microsoft Office has a huge number of different file formats," he said. "It is deeply integrated into the Windows operating system." He also argued that when Microsoft created Office, it made several decisions that, in hindsight, aren’t optimal security-wise and are currently difficult to change. Making such alterations would have a significant impact on all the versions of the products, Liskin said. A new report from SonicWall released in July 2020 shows this trend is growing. Office files have overtaken PDF documents as a delivery mechanism for malware. Office documents make up 22.4% of all malicious file types, compared to 10.7% for PDFs. A bit of good news in the SonicWall report: The number of detected malicious Office files declined slightly at the end of the first half of 2020. 


Intel's 7nm products delayed; first 7nm client CPU expected in late 2022 or 2023

Intel is pushing back its 7nm product roadmap after identifying a defect mode in its 7nm process that resulted in yield degradation, Swan said. The yield of Intel's 7nm process is now trending approximately 12 months behind the company's internal target. "We've root caused the issue, and believe there are no fundamental roadblocks," Swan said. "But we've also invested in contingency plans to hedge against further schedule uncertainty. We've mitigated the impact of the process delay on our product schedule by leveraging improvements in design methodology, such as die disaggregation and advanced packaging." The news of the delay caused Intel shares to sink in after-hours trading. The delay comes in the context of Intel's challenges to transition to 10nm products, with its product roadmap repeatedly delayed.  "We've seen this movie before," Swan acknowledged Thursday. "We have learned from the challenges in our 10nm transition, and we have a milestone-driven approach to ensure our product competitiveness is not impacted by our process technology roadmap."


Self-Driving Money Is Coming To Consumer Fintech

The first step to autonomous finance is breaking down the barriers between these products. Open banking solutions like Plaid, which link fintechs and banks together, have made it easier to transfer money and data between platforms. Today, that looks like a hub-and-spoke model: I can move money from Venmo to my checking account to Vanguard. In the future, it will be point-to-point: I should be able to take $400 sitting at rest in Venmo and invest it directly into my Roth IRA, or split it 50/50 between my student loan payments and my credit card bill. Self-driving money limited to one app is like a self-driving car that only works on one road. The second step is where the ‘autonomous’ part comes in. Connected fintech services will use a combination of common-language rules set by the user and machine learning to manage money in the background. This goes a step beyond setting a retirement goal on a roboadvisor: I should be able to say “whenever I have spare money lying around, other than what I need for day-to-day expenses, reinvest it into whatever earns the highest return.” After that, I should never have to think about what’s happening with my money, other than when I receive updates from the service on how it is being put to work.
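
To make the rule-based half of this concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the account balances, the return figures, and the `move` function (a stand-in for real open-banking transfer APIs) are invented purely to illustrate the "sweep spare cash into the highest return" rule described above.

```python
# Hypothetical sketch of a user-defined "self-driving money" rule.
# Accounts, rates, and the transfer call are all invented; a real
# service would sit on top of open banking APIs and user consent.

DAY_TO_DAY_BUFFER = 1_500  # cash the user wants to keep liquid

accounts = {"venmo": 400, "checking": 2_100, "roth_ira": 12_000}
# For debt, paying it down effectively "earns" the interest avoided.
expected_returns = {"roth_ira": 0.07, "student_loan": 0.045, "credit_card": 0.19}

def move(src: str, dst: str, amount: int) -> None:
    """Stand-in for a point-to-point transfer API call."""
    accounts[src] = accounts.get(src, 0) - amount
    accounts[dst] = accounts.get(dst, 0) + amount
    print(f"moved ${amount} from {src} to {dst}")

def sweep_spare_cash() -> None:
    """Apply the rule: reinvest spare cash into the highest return."""
    spare = accounts["checking"] - DAY_TO_DAY_BUFFER
    if spare > 0:
        best = max(expected_returns, key=expected_returns.get)
        move("checking", best, spare)

sweep_spare_cash()  # moves $600 toward the 19% credit card balance
```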


Low-Code Technology Boosts The Growth Of Specialist Bank

During its low-code journey, HTB invested heavily in testing capabilities, which has paid off in faster turnaround on defects. Previously, developers would publish a change, finishing in the evening; the test team would arrive the next morning and start the test pack, which could run for 3-4 hours, verifying everything worked correctly and flagging any regressions. The developers wouldn’t get feedback until lunchtime, losing half a day of development time. Now, the developers publish an update and leave for the evening. Liberty Create takes 30 minutes to package the release and push it to the test environment, waking the testing platform automatically once complete and running the series of tests. By 9 am, the test team starts the day with the results, and the developers can work on any needed fixes immediately. As a result, an extra half day per developer is gained from every push. This was the first step for HTB on its journey to seamless integrated testing and DevOps. Today, HTB’s confidence in its front-end building capabilities influences how the bank approaches potential new suppliers, with a clear strategy: they need to work with low-code.
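
A rough sketch of that overnight publish-and-test flow is shown below, in Python. None of the function names correspond to Liberty Create's actual API; they are invented to show the shape of the automation: package, deploy to test, wake the test platform, run the pack, and leave the results waiting for the morning.

```python
# Illustrative sketch of an automated publish-then-test pipeline.
# Function names, timings, and results are invented, not Liberty
# Create's real API.

def package_release(change_id: str) -> str:
    """Bundle the published change into a deployable build (~30 min)."""
    print(f"packaging release for {change_id}...")
    return f"build-{change_id}"

def deploy_to_test(build: str) -> None:
    print(f"deploying {build} to the test environment")

def run_regression_pack(build: str) -> dict:
    """Wake the test platform and run the full test pack (3-4 hours)."""
    print("waking test platform, running regression pack...")
    return {"build": build, "passed": 412, "failed": 3}  # dummy results

def publish(change_id: str) -> dict:
    """Developer publishes in the evening; results await the team at 9 am."""
    build = package_release(change_id)
    deploy_to_test(build)
    return run_regression_pack(build)

if __name__ == "__main__":
    print("overnight results:", publish("CHG-1042"))
```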


Security Leaders Adapt to Manage Cyber Everywhere

We have placed a significant focus on our early talent development program, bringing in people who understand the business and can be trained to perform risk assessments and develop the necessary technical skills. Mentoring young professionals is one of my passions, and it is essential to develop the cybersecurity skills we need now and in the future. It’s important to make the time, however busy our schedules are, to help shape people into more than they thought they could be. In terms of collaboration across the profession, we are participating in the recently formed Health Information Sharing and Analysis Center (H-ISAC) in Japan, which is a community of life sciences organizations that have come together to share timely, relevant, actionable information on cybersecurity. Although we are competitors in business, we share a common goal to prevent, detect, and respond to cybersecurity concerns. We face many of the same challenges with respect to resources and professional staffing, so it helps all of us if we can work collaboratively.


Cloud Computing – Trends that Enterprises Should Watch Out For

The prevalence of mobile phones has had a major effect on the business world. The anyplace, anytime access that cloud-based apps provide is ideal for remote employees, who can sign into any application from a web-enabled device such as a tablet or phone and carry out their tasks on the cloud infrastructure. Data breaches, theft, and loss are major threats even for conventional IT infrastructures, but as more organizations move to cloud platforms, it is crucial that cloud service providers can guarantee a secure framework for the safety of their customers’ information. Cloud security is not the only trend in cloud computing, but it is one that every organization should emphasize. Consequently, enormous demand is emerging for cloud security providers that can guarantee data practices abide by GDPR and other compliance regimes. Open-source cloud computing also offers firms various advantages: they can quickly scale their cloud foundation and make changes far more directly than on a closed-source platform, with fewer security concerns.


Banning TikTok Won't Solve Our Privacy Problems

While all these drivers are legitimate concerns — we should express concern when a nation-state owns an application that is harvesting huge amounts of sensitive data — our focus on these factors conveniently bypasses the true problem. Applications are becoming increasingly intrusive, and we are surrendering our data ever more willingly without understanding the potential ramifications that will ripple far into the future. Once our data has been leaked, it is out there; we can't ask nicely to have it back. That means if data we once thought was innocuous suddenly becomes dangerous, perhaps because of a new piece of technology or a change in how data is used, we are already at a disadvantage. Banning apps based solely on their country of origin (no matter how hostile) is not going to solve this problem; it is merely a Band-Aid that won't fully address privacy and security concerns. We need to address the underlying problem, take a hard look at what data our applications are collecting, and focus on improving privacy controls. We could throw a dart at a list of apps in most app stores and almost be guaranteed to hit one with some form of privacy issue.



Quote for the day:

"If you are not willing to give a less experienced qualified professional a chance, don't complain you are charged double for a job worth half." -- Mark W. Boyer

Daily Tech Digest - July 24, 2020

The challenges and opportunities of shadow IT

As more organizations adopt practices like self-service SaaS and BYOD, the need for greater visibility into their overarching corporate network of devices becomes even greater. Many organizations faced this crunch when moving their workforce remote only a few months ago as a response to COVID-19. Typically, the larger and more widespread an ecosystem of devices is, the more difficult it becomes for IT teams to maintain visibility and consequently cyber hygiene of those devices. We can expect many of the challenges around Shadow IT to only grow in the next few years as more enterprises adopt practices like BYOD, or even on an operational level, more flexible remote work policies. Consequently, enterprises will put a greater focus on automation to better identify and secure devices across their widened infrastructure. ... SaaS tools bring immediate dangers of freely shared file data that is not classified or labeled. Or to say this in a more technical manner, there is zero data governance in collaborative hybrid work environments over shared files. DLP tools fail to bring effective results in shared environments. For effective data protection, organizations must have virtual file labeling that offers an automated process in which all the relevant security, privacy, and operational policies are considered, and continually fine-tuned.
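
As a rough illustration of what automated, policy-driven file labeling can look like, here is a minimal Python sketch. The patterns, label names, and policies are invented for the example; real classification engines weigh far more signals and are, as noted above, continually fine-tuned.

```python
# Minimal sketch of policy-driven file labeling. Patterns, labels,
# and policies are invented; real engines combine many more signals.

import re

POLICIES = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "pii:ssn"),              # security
    (re.compile(r"\bconfidential\b", re.I), "privacy:restricted"),  # privacy
    (re.compile(r"\bQ[1-4]\s+forecast\b", re.I), "ops:financial"),  # operational
]

def label_file(text: str) -> set:
    """Return every label whose policy pattern matches the content."""
    return {label for pattern, label in POLICIES if pattern.search(text)}

shared_doc = "CONFIDENTIAL: Q3 forecast attached. Ref SSN 123-45-6789."
print(sorted(label_file(shared_doc)))
# ['ops:financial', 'pii:ssn', 'privacy:restricted']
```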


Open Banking – The Novel Mainstay of Digital Banking

Open banking is a safe way to give suppliers access to your financial data. It establishes a data-sharing architecture in which a group of organizations can exchange information via Application Programming Interfaces (APIs). These APIs are used by banking and financial companies to exchange data between them, helping them serve consumers better. Open banking allows banks to offer customized financial services to their consumers, primarily payment solutions. The revolution is both driving the industry toward platform-based, hyper-relevant distribution and offering banks a precious opportunity to develop their networks and extend their reach. In short, open banking is about sharing financial data electronically, securely, and only under circumstances the consumer has agreed to. When you voluntarily share data within this legal framework, you become part of the open banking community. Gear up for a world of websites and apps where one can select modern financial services and products from providers regulated by the Financial Conduct Authority (FCA) and its European equivalents.
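
In practice, the API exchange looks something like the sketch below. Everything here is hypothetical: the endpoint, token, and response fields are invented, since each provider under the FCA's regime defines its own API and consent flow.

```python
# Hypothetical open-banking-style account request. The endpoint,
# token, and response shape are invented for illustration only.

import requests

API_BASE = "https://api.example-bank.com/open-banking/v3"   # fictional
ACCESS_TOKEN = "token-granted-after-customer-consent"       # e.g. via OAuth 2.0

def fetch_accounts() -> list:
    """Request the accounts the customer has consented to share."""
    resp = requests.get(
        f"{API_BASE}/accounts",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["accounts"]

for account in fetch_accounts():
    print(account["id"], account["balance"])
```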


Balancing UX and Privacy With IoT

When we consider the compromised privacy of individuals, we are talking about each individual’s loss of control over personal information. When people invest in these interconnected devices, they are not entirely aware of how much of their personal information is tracked and saved by the manufacturer in a bid to improve user experience. An individual can lose control if someone hacks into their smartphone or computer and remotely operates other devices. There’s no doubt that our smartphones carry the majority of our information; they are linked to our bank accounts, email accounts and even systems that require authorization. In fact, experts predict that there will be about 31 billion connected IoT devices by 2021. Hackers usually employ methods that go undetected, so more connections mean more opportunities for attack. The data collected from an individual’s smartphone or laptop can give hackers a detailed look into their activities, including internet searches and purchasing power. The information is typically used to improve user experience, but it can also be used to target particular products at the individual. Sometimes, this data is even sold to other organizations looking for a target audience for their products.


Improving data management in the life sciences industry

If an email account is breached, the data on that user’s account will be visible to the attacker. Should emails featuring product artwork or containing sensitive information be visible to the employee, they will also be accessible to a cybercriminal who has access to the account. Despite stricter serialisation regulations and the efforts of the wider industry, the full supply chain remains at risk of this information being sold to counterfeiters. Addressing this possibility should be a priority for regulators now that a number of serialisation laws in key markets are over the line. New technologies provide opportunities to deliver better communication and collaboration while ensuring compliance and security, which insecure tools like email often cannot guarantee. Using platforms or systems that offer a shared workspace, accessible by multiple organisations, enables collaborative project management with a clear, immutable audit trail. Such platforms also support companies in the gathering and analysis of data, which has a number of high-value use cases. One of the most promising and impactful will be improved supply and demand forecasting.
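
One common way to make an audit trail tamper-evident is to chain entries together with hashes, so that altering any past record invalidates everything after it. The sketch below illustrates the general technique only; it is not any specific vendor's implementation, and the actors and actions are made up.

```python
# Sketch of a tamper-evident, append-only audit trail via hash
# chaining. Illustrates the technique, not a specific product.

import hashlib
import json
import time

audit_log = []

def append_entry(actor: str, action: str) -> None:
    """Add an entry linked to the hash of the previous entry."""
    prev_hash = audit_log[-1]["hash"] if audit_log else "0" * 64
    entry = {"actor": actor, "action": action,
             "ts": time.time(), "prev_hash": prev_hash}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    audit_log.append(entry)

def verify() -> bool:
    """Recompute every hash; any edited entry breaks the chain."""
    prev = "0" * 64
    for entry in audit_log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if (body["prev_hash"] != prev
                or hashlib.sha256(payload).hexdigest() != entry["hash"]):
            return False
        prev = entry["hash"]
    return True

append_entry("alice@pharmaco", "uploaded artwork v2")
append_entry("bob@printer", "approved artwork v2")
print(verify())                                  # True
audit_log[0]["action"] = "uploaded artwork v3"   # tamper with history
print(verify())                                  # False
```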


How Does Data Management Drive Efficiency for Organizations?

In an ideal business world, many different Data Management professionals collaborate and execute best practices to extract the maximum business value from their enterprise data assets. These professionals are data architects, data engineers, data modelers, DBAs, developers, data quality experts, and data governance experts, who work alongside executives and high-level decision-makers to conceptualize, design, develop, and implement the desired Data Management infrastructure. Data Management teams often work with real-time data, which requires superior data capture, data integration, data preparation, and data analytics platforms — now available due to AI and ML. Many associated technologies like data fabric, graph processing, IoT, big data, edge computing, and so on need to work in conjunction with each other to make the unified Data Management system work. At a more nitty-gritty, technical level, complex Data Management tasks happen through Metadata Management, Master Data Management, advanced data compliance tasks, and continuous monitoring. A relatively new Data Management effort creates “data catalogs” to document which data is available where, including business glossaries, data dictionaries, and data lineage records.
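
To show the kind of metadata a data catalog records, here is a minimal sketch of a catalog entry in Python. The fields and dataset names are invented; real catalogs, with their glossaries and lineage graphs, are far richer.

```python
# Minimal sketch of a data catalog entry: where data lives, what it
# means (glossary), who owns it, and where it came from (lineage).
# All fields and names are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str                   # dataset name
    location: str               # where the data is available
    glossary_term: str          # business meaning (glossary)
    owner: str                  # accountable data steward
    lineage: list = field(default_factory=list)  # upstream sources

catalog = [
    CatalogEntry(
        name="customer_orders",
        location="warehouse.sales.orders",
        glossary_term="Confirmed customer purchase",
        owner="sales-data-team",
        lineage=["crm.raw_orders", "payments.settlements"],
    ),
]

for entry in catalog:
    print(f"{entry.name} @ {entry.location} <- {entry.lineage}")
```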


Automating Machine Learning: Google AutoML-Zero Evolves ML Algorithms From Scratch

An evolutionary algorithm (EA) is a subset of evolutionary computation, a family of population-based, trial-and-error problem solvers with a metaheuristic or stochastic optimization character. In evolutionary computation, an initial set of candidate solutions is first generated and then iteratively updated: each new generation is produced by stochastically removing less-desired solutions and introducing small random changes. Evolutionary algorithms use mechanisms inspired by biological evolution such as reproduction, mutation, recombination, and selection. EAs often perform well in approximating solutions to a range of problems that would otherwise take too long to process exhaustively. The use of evolutionary principles for automated problem-solving was formally proposed and developed more than 50 years ago. Artificial evolution became a widely recognized optimization method as a result of the work of German researcher Ingo Rechenberg, who used evolution strategies to solve complex engineering problems in the 1960s and early 1970s. In 1987, Jürgen Schmidhuber published his first paper on genetic programming, and later that year described the first general-purpose learning algorithms in his diploma thesis, Evolutionary Principles in Self-Referential Learning.
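
To ground the description, here is a minimal evolutionary algorithm in Python. It is a generic textbook sketch (evolving a bit string toward all ones), far simpler than AutoML-Zero's search over whole ML programs, but it shows the generate, select, and mutate loop described above.

```python
# Minimal evolutionary algorithm: candidates are bit strings, fitness
# counts ones, each generation keeps the fitter half (truncation
# selection) and refills with slightly mutated copies. A generic
# textbook sketch, not AutoML-Zero itself.

import random

GENOME_LEN, POP_SIZE, GENERATIONS = 32, 50, 60
MUTATION_RATE = 1.0 / GENOME_LEN  # expect ~1 flipped bit per copy

def fitness(genome):
    return sum(genome)  # the all-ones string is the optimum

def mutate(genome):
    """Introduce small random changes (bit flips)."""
    return [bit ^ (random.random() < MUTATION_RATE) for bit in genome]

# Initial set of candidate solutions.
population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)      # selection
    parents = population[: POP_SIZE // 2]           # drop the weaker half
    children = [mutate(random.choice(parents))      # reproduction + mutation
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print("best fitness:", max(map(fitness, population)), "of", GENOME_LEN)
```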


Rise of automation creates new RPA job descriptions

Automation engineers by nature need a broad set of capabilities in order to support a mix of no-code platforms, API integrations and traditional coding practices, and to build fully functional offerings for clients. Traditional development teams often look for talent with deep capabilities in narrow fields; in contrast, Cottongim said, automation engineers should be conversant in a wide variety of tools and techniques but not necessarily a master of any one. Automation engineers will also need skills beyond traditional roles: engaging with their business partners and distilling business needs into rapidly executed automation offerings. They will need to apply a customer-centric view and build in an agile manner, while partnering closely with their business teams. Cottongim also expects to see more demand for cloud architects and cloud engineers who can support intelligent automation needs. They will need to understand how to create applications built from a mix of VMs, databases, networking and high-availability management techniques.


Is Open Finance worth getting excited about, or is it just spin?

On the bright side, there are protections and limitations in place, overseen by the regulator. Users fully own their data and can revoke the access they give to third parties at any time. There are also restrictions on companies’ ability to sell the data directly to third parties. Instead, companies holding the data can monetise it by recommending new pension providers and taking a commission fee, for instance, or by charging consumers for the service (as Monzo has done). “What’s going to make or break the success longer term is ‘do you feel confident that you know where this data is going?’” Grose noted, highlighting the need to educate users on their data rights and on companies’ use of their data. Nonetheless, Levine warned that some companies might be tempted to charge a so-called ‘privacy premium’, whereby consumers get a worse deal or product based on their financial data. “It only takes one kind of major loss of trust or issue that we find ourselves in a place where actually the whole industry is hurt, and we may be going backwards,” Levine said. Meanwhile, Vans-Colina added that there is a big risk that open banking and finance data will get hacked and leaked.


Data Governance: Stay Non-Invasive in Your Approach

People naturally rebel against the idea of being governed. Data governance is known in some circles as “People Governance” because it is people’s behavior – how they define, produce and use data – that is being governed. In other words, the data will do what we tell it to do, so we must govern people’s behavior if we want to improve the quality, value, and understanding of the data. Therefore, the approach the organization takes to govern the data (and the people) can make or break whether the data governance program is accepted or rejected by the organization. I have been known to say that “the data will not govern itself.” Let me add to that: “the documentation about the data, or the metadata, will not govern itself either.” Most of us have experienced data and metadata that have been left ungoverned. Why? Because people are not held responsible for the quality and/or value of the data or the documentation. As a result, there is no way to improve the efficiency and effectiveness of the way data assets are being leveraged. Ungoverned data is replicated many times over, with many different versions of the “same” data.


Creating a modern data governance strategy to accelerate digital transformation

Though it’s early in our journey toward modern data governance, we do have a few best practices to share. Primarily, we recommend that you address your data governance strategy holistically. We designed our approach so that standards embedded into the engineering process and data centralization on the modern data foundation work together to ensure end-to-end modern data governance. Build standards into your existing process and implement them as engineering solutions. By addressing data governance during the design phase of the larger Enterprise Data strategy, we have been able to institutionalize “governance by design” into the engineering DNA—and apply it to data at every touchpoint. We are building our data governance controls into the centralized analytics infrastructure and analytics processes. Consider implementing a modern data foundation with integrated toolsets. The EDL, with its built-in governance services and capabilities, does more than scale data governance efforts—it enables enterprise analytics for the whole organization.



Quote for the day:

"Be so good at what you do that no one else in the world can do what you do." -- Robin Sharma