Daily Tech Digest - October 22, 2019

Agile Development: How to Pick the Most Valuable User Stories

A goal that's big enough to be worthwhile will usually have multiple actors involved -- these correspond to the roles of the standard user story template. In a retail environment, improving customer loyalty would involve not only the customer but also the parts of the company that the customer interacts with (shipping, ordering, marketing). This is the second level of the hierarchy: the actors involved in achieving the goal. For any actor, there are probably multiple "impacts" that need to be achieved. For example, if we want to improve customer loyalty, then we want customers to be more satisfied with us, to order more frequently from us and to buy more when they do order. These impacts form the third level of an impact map and correspond to both the needs and reasons portions of the standard user story template. This means that an impact isn't a deliverable: "improving shipping" isn't an impact; instead, it's a deliverable that might contribute to achieving the "improved customer satisfaction" impact. A good impact also has a measure associated with it that lets the organization tell when it's been achieved.
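The goal → actors → impacts → deliverables hierarchy described above can be sketched as a small data structure. A minimal Python sketch, with all example names, measures and deliverables invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Impact:
    description: str                # e.g. "customers order more frequently"
    measure: str                    # how the organization tells it's achieved
    deliverables: list = field(default_factory=list)  # things that may contribute

@dataclass
class Actor:
    name: str
    impacts: list = field(default_factory=list)

@dataclass
class ImpactMap:
    goal: str                       # level 1: the business goal
    actors: list = field(default_factory=list)        # level 2

loyalty = ImpactMap(
    goal="Improve customer loyalty",
    actors=[
        Actor("Customer", impacts=[
            Impact("Orders more frequently",
                   measure="repeat-order rate",
                   deliverables=["one-click reorder"]),
        ]),
        Actor("Shipping", impacts=[
            Impact("Improved customer satisfaction",
                   measure="post-delivery CSAT score",
                   deliverables=["improve shipping times"]),
        ]),
    ],
)

# A deliverable hangs off an impact -- it is never an impact itself.
for actor in loyalty.actors:
    for impact in actor.impacts:
        print(actor.name, "->", impact.description, f"({impact.measure})")
```

Note how the measure lives on the impact, not the deliverable, matching the article's point that only impacts carry success criteria.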


The Data Breach Game: The 9 Worst IT Security Practices


Do you have one service account for all of your production servers? Or worse -- I saw this at a client once -- do you have linked servers between all of your database servers, with those accounts logging in as the system admin? To take this a step further, in your personal life, do you reuse passwords across Web sites? It's not good to do that, even at sites that don't impact your finances. Those small sites are the most likely to get breached, and then the password you used at both your favorite cat-grooming message board and your bank is out there on the dark Web. ... As always, be careful what you click on, especially in e-mail. E-mail is one of our biggest productivity tools, but it's also the biggest security vulnerability in any organization. To help, use a managed e-mail service like Office 365 or commercial Gmail along with multifactor authentication (MFA). Bonus points if you don't use text messages for MFA. Finally, make sure that your network is segmented so that your CEO opening an e-mail can't infect your domain controllers or database servers.


New mainframe uses: Blockchain, containerized apps

Forrester's research found that mainframes continue to be considered a critical piece of infrastructure for the modern business – and not solely to run old technologies. Of course, traditional enterprise applications and workloads remain firmly on the mainframe, with 48% of ERP apps, 45% of finance and accounting apps, 44% of HR management apps, and 43% of ECM apps staying on mainframes. But that's not all. Among survey respondents, 25% said that mobile sites and applications were being run on the mainframe, and 27% said they're running new blockchain initiatives and containerized applications. Blockchain and containerized applications benefit from the integrated security and massive parallelization inherent in a mainframe, Forrester said in its report. "We believe this research challenges popular opinion that mainframe is for legacy," said Brian Klingbeil, executive vice president of technology and strategy at Ensono, in a statement. "Mainframe modernization is giving enterprises not only the ability to continue to run their legacy applications, but also allows them to embrace new technologies such as containerized microservices, blockchain and mobile applications."


Sodinokibi Ransomware Gang Appears to Be Making a Killing

The group behind Sodinokibi appears to have had a head start on its success. While it's not clear what relationship the GandCrab and Sodinokibi gangs might have, researchers report seeing a clear code overlap in their malware. Security firm Secureworks says that based on multiple clues it believes that the threat groups behind GandCrab and Sodinokibi - aka Sodin and REvil - "overlap or are linked." In other words, one or more developers may not have retired with GandCrab, but helped set up a new operation. Like GandCrab, a customized version of Sodinokibi gets supplied to each individual affiliate, who infects systems with the malware and then shares a cut of the proceeds with organizers. Some affiliates appear to be more technically skilled than others. Coveware, a Connecticut-based ransomware incident response firm, says that at least one affiliate group specializes in hacking IT service providers as well as managed security service providers. Doing so enables the affiliate to distribute the ransomware to hundreds or thousands of endpoints managed by the service provider.


IaaS vs. PaaS options on AWS, Azure and Google Cloud Platform


Many early PaaS providers restricted which technologies they supported, and their software tools were compatible only with their own hosting platforms. It was difficult to migrate from one PaaS offering to another, or adapt a PaaS-based development pipeline to run on a generic IaaS instead. As businesses increasingly sought freedom from cloud lock-in, PaaS became more software-agnostic. Open source options, such as Docker containers orchestrated by Kubernetes, replaced some proprietary tooling. As a result, cloud computing vendors that originally specialized in IaaS added PaaS offerings, and increased compatibility with their respective IaaS offerings. For example, some versions of AWS CodePipeline, a continuous delivery service that forms part of a PaaS framework in the AWS cloud, can deploy applications to virtual machines or containers that run on AWS' IaaS.


Top cloud security controls you should be using

The misconfigured WAF was apparently permitted to list all the files in any AWS data buckets and read the contents of each file. The misconfiguration allowed the intruder to trick the firewall into relaying requests to a key back-end resource on AWS, according to the Krebs On Security blog. The resource “is responsible for handing out temporary information to a cloud server, including current credentials sent from a security service to access any resource in the cloud to which that server has access,” the blog explained. The breach impacted about 100 million US citizens, with about 140,000 Social Security numbers and 80,000 bank account numbers compromised, and eventually could cost Capital One up to $150 million. ... “The challenge exists not in the security of the cloud itself, but in the policies and technologies for security and control of the technology,” according to Gartner. “In nearly all cases, it is the user, not the cloud provider, who fails to manage the controls used to protect an organization’s data,” adding that “CIOs must change their line of questioning from ‘Is the cloud secure?’ to ‘Am I using the cloud securely?’”
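The relayed-request weakness described here is a classic server-side request forgery (SSRF) against the cloud provider's instance metadata service. As a hedged illustration (the link-local metadata address is real; the `is_suspicious_outbound` helper and its policy are assumptions for this sketch, not Capital One's or AWS's actual controls), a relaying component could refuse to forward requests aimed at that endpoint:

```python
from ipaddress import ip_address
from urllib.parse import urlparse

# Cloud instance metadata lives at a link-local address (169.254.169.254 on
# AWS); an SSRF that reaches it can read temporary credentials for any role
# attached to the server.
METADATA_HOSTS = {"169.254.169.254", "metadata.google.internal"}

def is_suspicious_outbound(url: str) -> bool:
    """Flag URLs that a relayed/proxied request should never be allowed to fetch."""
    host = urlparse(url).hostname or ""
    if host in METADATA_HOSTS:
        return True
    try:
        return ip_address(host).is_link_local  # covers all of 169.254.0.0/16
    except ValueError:
        return False  # not a bare IP address; the set lookup above handled names

assert is_suspicious_outbound("http://169.254.169.254/latest/meta-data/iam/")
assert not is_suspicious_outbound("https://example.com/report")
```

Denylisting alone is weak (redirects and DNS tricks can evade it); in practice, defenses such as requiring session tokens for metadata access address the same gap more robustly.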


Smart Cities are Made Smart by Planning and Strategy

Sleman Saliba underscored that the real “smart” piece of smart cities comes from the data and insights that come from making those connections. “The important stuff is not only on one premise but really interconnections between companies, between cities, between buildings, and leveraging this information that you get by using technologies across companies and factors,” he adds. Markus John provided a great real-world example of where this approach is making a difference, citing the work of city officials in Mannheim, Germany to take 180 hectares of prime land in the city (made available by the 2013 departure of the US Army) and use it as a boost to accelerating its development as a Smart City. He said that communities making this kind of a change have a unique opportunity, both in the center of the city, and – in particular - its suburbs. “Communities are thinking about how can we manage it smartly - in a modern way?,” he said. “It’s not just about delivering power from outside, but about how to make this suburb intelligent, in the way of buffering energy with batteries, maybe solar panels on the roof, optimizing energy consumption in the suburbs. ...”


Why compliance concerns are pushing more big companies to the cloud

In the current climate and looking into the future, we are seeing an acceleration of workloads from on-premises infrastructure and on-premises applications into the cloud. That trend is clearly established. I don't think that's going to change; it's only accelerating. But one thing that might appear a little counterintuitive: on the classic adoption bell curve that you see from early adopters to mainstream before it starts flagging down, adoption has gone past the early adopters and is now mainstream. One of the interesting trends I'm seeing is that two issues are popping up. One is that there's an inexorable push to move your workloads, your data and your applications to the cloud, for all the reasons the cloud is becoming popular: nobody wants to manage hardware, maintenance, capital efficiency, the CAPEX-to-OPEX transformation. But along with that, there's an almost exponential growth in concern around data security, data privacy, and regulatory compliance.


An indication of this trend is that European startups are struggling to transform into $1 billion-valued unicorn companies. While a few exceptions share the spotlight, overall this upscaling happens at only half the rate seen in the US. In addition to a lack of funding, this also comes down to the fact that Europe is made up of many distinct countries and that, despite its efforts to unite around a single market, fragmentation is still part of its identity. In this context, companies face the challenge of scaling up across a continent that has many different national regulations and structures – a task much more complex than in large homogeneous markets like China or the US. And so, in the past 20 years, Europe's share of "superstar companies" – the top 10% of companies with more than $1 billion in annual revenue – has all but halved. But while the effects of fragmentation are evident, Hjartar explained, Europe could leverage its national differences as a strength. "We have pockets of leaders spread out across the continent," he said. "We have 5.7 million software developers – that compares to 4.4 million in the US. We have all the building blocks to be successful, but now the biggest hurdle is to link them together with an ambitious vision."


The answer is, you can’t — at least, not all at once. It’s next to impossible to pull a giant organization, with hundreds of ingrained processes, divisions, and stakeholders, into a new digital and competitive landscape in one go. Many companies facing this challenge today invest deeply in research and development, thinking that knowledge alone will help right their course. But there’s no correlation between increased spending on research and development and stellar performance. It’s not surprising, then, that you feel overwhelmed. You feel you’re on the precipice of profound change, yet you simply can’t move your company in a way that will achieve your goals or realize your vision. The secret to overcoming these challenges lies in the humble compass. Invented in the second century BC, its principles of directional guidance can be used today to create a road map to the future, one that breaks decisions down into actionable, impactful choices and delivers tangible results.



Quote for the day:


"A leader should demonstrate his thoughts and opinions through his actions, not through his words." -- Jack Weatherford


Daily Tech Digest - October 21, 2019

AI Can Help You—And Your Boss—Maximize Your Potential. Will You Trust It?

First, in identifying which employees have high potential to be great performers or strong leaders. The company tells Yva which individuals it currently considers as best performers; Yva’s neural network identifies which behaviors are characteristic of these top performers, and then finds other employees who exhibit some if not all of the same traits. It can tell you who has the potential to become a top salesperson, or an extremely effective leader; and it can tell you which characteristics they already possess and which ones they need to develop. Second, Yva helps minimize “regrettable attrition” by identifying employees who are a high resignation risk. A decision to resign never comes out of the blue. First the employee will feel increasingly frustrated or burnt out; then she will become more open to consider other opportunities; then she will actively seek another job. Each stage carries subtle changes in our behavior: maybe how early we send out our first email in the morning, or how quickly we respond, or something in the tone of our messages. We can’t detect these changes, but Yva can.
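Yva's actual model isn't public, but the "finds other employees who exhibit some of the same traits" step can be illustrated with simple nearest-neighbor similarity over behavioral features. Everything here, the features, numbers and names, is invented for the sketch and is not a description of Yva's neural network:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Invented behavioral features: [emails sent/day, avg response time (hours),
# meetings/week]. The "profile" stands in for what a model learns from the
# employees the company labels as top performers.
top_performer_profile = [35, 1.5, 12]
employees = {
    "alice": [33, 1.7, 11],   # behaves much like the top performers
    "bob":   [8, 9.0, 2],     # very different behavioral pattern
}
ranked = sorted(employees,
                key=lambda e: cosine(employees[e], top_performer_profile),
                reverse=True)
print(ranked)  # employees ordered by resemblance to the top-performer profile
```

The same ranking idea, run over changes in a person's own behavior over time rather than across people, is roughly what the resignation-risk detection described next would require.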



7 Basic Tools of Quality using R

With the goal of satisfying customers’ needs while making a profit, companies and organizations have developed methodologies, frameworks and tools for addressing issues that threaten the quality of their products and services. However, there are some commonly used tools that can be applied across any industry – from the product development phase until delivery – for solving critical quality-related issues. Also referred to as the 7 QC tools, the seven basic tools for quality were first conceptualized by Kaoru Ishikawa, an engineering professor at the University of Tokyo, influenced by a series of lectures given by W. Edwards Deming in the 1950s. Ishikawa’s seven basic tools for quality are a fixed set of graphical and statistical techniques helpful in solving critical quality-related issues. They are widely used to fine-tune processes as part of an overall quality assurance effort, and they are considered basic because they can be easily implemented by anyone with very basic training in statistics.
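The article's examples use R, but any of the seven tools fits in a few lines of code; here is one of them, a Pareto analysis (ranking defect causes and accumulating their share so the "vital few" stand out), sketched in Python with made-up defect counts:

```python
from collections import Counter

# Hypothetical defect log for a manufacturing line.
defects = (["scratch"] * 42 + ["misalignment"] * 27 + ["dent"] * 18 +
           ["discoloration"] * 8 + ["other"] * 5)

counts = Counter(defects).most_common()   # sorted, most frequent cause first
total = sum(n for _, n in counts)
cumulative = 0
for cause, n in counts:
    cumulative += n
    print(f"{cause:<14} {n:>3}  {100 * cumulative / total:5.1f}% cumulative")
```

In this invented data the top two causes account for 69% of defects, which is exactly the prioritization signal a Pareto chart is meant to surface.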


How cybersecurity accelerates business growth


Simply focusing on security for PCs is outdated in the world of increased mobility. The mobile workforce is the new norm, enabling higher levels of productivity while geographically expanding your workforce— which is essential for innovation and growth. To empower their mobile workforce, organizations face new challenges such as properly vetting mobile users, deploying digital certificates to these users, and implementing the right mobile security technologies, tools and policies. Innovation requires flexibility, flexibility requires mobility, and having strong cybersecurity behind your mobile infrastructure is key to averting risk while enabling and accelerating growth. Increasingly, companies are migrating to the cloud to support enterprise and mobile workers and to enable stronger and seamless collaboration. Compared to on-premises solutions, hosted cloud environments offer significant advantages for employee collaboration and access to data from any location, which in turn, helps organizations innovate and grow.


Spending on security hardware, software, and services continues to increase

“The market for cybersecurity products continues to grow, growth that is renewed and reinvigorated by a C-level focus on trust. Today’s new trust environment introduces new variables that go beyond the traditional ideas of security, risk and compliance, introducing concepts of privacy and ethical business operations. “Trust is addressed in consideration of the relationships (B2B, B2C, B2E and G2C) and the attributes of interaction (people, technology, organization, culture and process). Given the complexity of implementing Trust, cybersecurity vendors are the clear beneficiaries,” said Frank Dickson, program vice president, Cybersecurity Products at IDC. Services will receive the largest share of security spending in 2019 with more than $47 billion going toward managed security services, integration services, consulting services, and IT education and training. Services will also have the fastest spending growth with a five-year CAGR of 11.2%.
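A "five-year CAGR of 11.2%" compounds as shown below; the $47 billion starting figure comes from the excerpt, while the five-year projection is purely illustrative arithmetic, not IDC's forecast:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: the constant yearly rate that takes
    start_value to end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Growing $47B at 11.2% per year for five years:
end = 47 * (1 + 0.112) ** 5
print(f"${end:.1f}B after five years")  # roughly $80B
assert abs(cagr(47, end, 5) - 0.112) < 1e-9
```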


Edge computing: a game changer for service providers?

Thanks to its distributed nature, edge computing can empower service providers to offer new solutions and services that simultaneously increase revenue streams and reduce network transport costs. Consider applications that require ultra-low latency (self-driving cars) or high bandwidth (video surveillance). By leveraging edge computing, service providers can choose to bring these services to market via infrastructure-as-a-service (IaaS) or platform-as-a-service (PaaS) options – all depending on how deep they want to be in the value chain. Services of this nature cannot be offered via traditional public cloud. Although we’re still in the early stages of edge computing’s evolution, we can confidently expect a host of influential IoT use cases to break into the mainstream in the coming years. For example, the development of Augmented Reality (AR), Virtual Reality (VR) and mobile gaming applications are already enthusiastically incorporating edge computing capabilities, increasingly reaping the benefits of rapid responsiveness in the face of high-bandwidth usage.


Augmenting the Modern Workforce with Computer Vision


With the arrival of 5G and the FCC’s recently announced rural broadband initiative, there will be an increased need for in-the-field training and guidance for technicians working in rural areas that may not have the infrastructure and resources urban areas typically have access to. With that in mind, telecom companies can utilize computer vision technology to provide real-time training and feedback in the field for their technicians. Additionally, computer vision technologies enable companies to automate the quality control of installations to cut down on technician re-interventions and improve the customer experience. Finally, those within the QSR space can leverage computer vision to provide automated checkout solutions. These intelligent cash registers can reduce the time spent at checkout to a few seconds. These smart checkouts are easy-to-use cash register terminals equipped with a camera and trained neural networks that are able to recognize more than 10,000 products and/or recipes. By removing the constraints of traditional checkout, the smart checkout can improve the customer experience within quick-serve restaurants by reducing queues to fewer than 10 seconds per customer, in some cases.


AI and ML will become important for how organizations run their digital systems

IDC predicts that by 2020, 30 percent of G2000 companies will have allocated capital budget equal to at least 10 percent of revenue to fuel their digital strategies. This shift toward increased funding is an important one as business executives come to recognize digital transformation as a long-term commitment. With billions of dollars invested in digital transformation initiatives, executives are now exploring the impact of their investments and asking, “What’s next?” “The next phase of digital transformation will focus on making sense of all the data so that organizations can move faster, make better decisions, and create best-in-class digital experiences,” said Buddy Brewer, GVP and GM Client Side Monitoring, New Relic. “As indicated in our research, observing and acting on insights from data collected will play a critical role in helping digitally transformed organizations truly scale and realize the benefits of modern technological advances.” Global organizations claim to be significantly progressing their digital transformation projects, with 39 percent of global respondents saying these are completed or close to completion.


Trust, control and personalization through human-centric AI

The premise of human-centric AI is a strong conviction that artificial intelligence should serve humanity and the common good while enforcing fundamental rights such as privacy, equality, fairness and democracy. However, people can only reap the full benefits of AI if they can confidently trust that the algorithm was taught to serve their interests as opposed to those of a third-party institution or corporation. For example, instead of optimizing a recommendation engine to maximize the number of impressions, one might choose to maximize the quality of impressions. The reward to be optimized should be aligned with the user’s goals. For example, a user might be highly likely to click on a message that promotes fast food, when presented. Yet, a human-centric AI engine should take into account the user’s goals related to weight-loss or health before recommending that message to the user merely for the sake of increasing click-through rate.
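One way to make "the reward to be optimized should be aligned with the user's goals" concrete is to blend predicted engagement with a goal-alignment score. The weighting scheme and all numbers below are invented for illustration, not a description of any real recommendation engine:

```python
def aligned_reward(p_click: float, goal_alignment: float, w: float = 0.7) -> float:
    """Blend engagement (p_click) with how well an item serves the user's
    stated goals (goal_alignment), both in [0, 1]. w weights the goals term."""
    return (1 - w) * p_click + w * goal_alignment

# A fast-food ad the user is very likely to click, but which conflicts with
# their health goal, scores below a modestly clickable healthy option.
fast_food = aligned_reward(p_click=0.9, goal_alignment=0.1)
healthy = aligned_reward(p_click=0.4, goal_alignment=0.9)
assert healthy > fast_food
```

With w = 0 this collapses back to pure click-through-rate optimization, which is exactly the behavior the human-centric framing argues against.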



The four broad types of ecosystem are classified by the operating state (open or closed) and the complexity of interconnected devices (simple or hyper-connected). While all four are distinct, each one has its own business value. Not all data can be considered open and suitable for public consumption, however. While sharing global temperature sensor information is in the public interest, a vehicle’s specific location from ANPR cameras is perhaps not. One organisation that encourages and promotes the safe, open, exchange of information in communities of interest is The Open Data Institute (ODI), set up by Sir Tim Berners-Lee in 2012. This independent, not-for-profit organisation works with companies and governments to build an open, trustworthy data ecosystem where people can make better decisions using data. Currently, there is a rise in the number of organisations collaborating to provide new services to consumers, such as showing available electric car charging points on a map to make their lives easier. There are numerous new commercial ventures for aggregating and analysing large volumes of data ranging across healthcare, smart cities, agriculture, logistics, utilities and smart buildings, to name a few.



Deepfakes and voice as the next data breach

We’ve seen how quickly deepfake videos can catch on, with tools like social media allowing them to spread like wildfire. Recent examples have included an altered video of House Speaker Nancy Pelosi slurring her words, as well as footage of Facebook’s Mark Zuckerberg giving a speech on the power of big data, actor Bill Hader doing an impression of Tom Cruise, and actress Jennifer Lawrence giving a speech with Steve Buscemi’s face. Not all of these deepfake videos had malicious intent, but they show how prevalent and mainstream deepfakes are becoming, and the potential for bad actors to leverage the technology to perpetrate crimes. Prominent figures like these are easy to target because they have so much public content available online that can be repurposed for deepfakes, but as this technology continues to advance, it won’t be long before criminals have the tools to expand their targets beyond world leaders and celebrities.



Quote for the day:


"Leadership without mutual trust is a contradiction in terms." -- Warren Bennis


Daily Tech Digest - October 20, 2019

3 Concepts Defining the Future of Work: Data, Decentralisation and Automation

Data analytics can help to interpret the business environment, enable managers to act, and result in sustained superior performance and competitive advantage. The introduction of descriptive, predictive and prescriptive analytics in your work means that the traditional way of decision-making, based on experience and expertise, is exchanged for data-driven decision-making. When organisations provide more people with access to knowledge, power is distributed more equally, enabling employee empowerment within an organisation. This power shift is necessary to fully benefit from big data analytics. The future of work, therefore, will bring flatter organisations, where employees facing the customer or directly involved in building a product use data to optimise their decisions. This requires a change in company culture, as real-time insights from data require real-time action from employees and management. Fewer managers and more empowered employees will radically change your culture.


7 steps to a successful ISO 27001 risk assessment - IT Governance Blog

An information security risk assessment is the process of identifying, resolving and preventing security problems. Your organisation’s risk assessor will identify the risks that your organisation faces and conduct a risk assessment. The risk assessment will often be asset based, whereby risks are assessed relative to your information assets. It will be conducted across the whole organisation. ISO 27001 is explicit in requiring that a risk management process be used to review and confirm security controls in light of regulatory, legal and contractual obligations. ... ISO 27001 does not prescribe a specific risk assessment methodology. Choosing the correct methodology for your organisation is essential in order to define the rules by which you will perform the risk assessment. The methodology needs to address four issues: baseline security criteria, risk scale, risk appetite, and a scenario-based or asset-based risk assessment. ... If opting for an asset-based risk assessment, you should work from an existing list of information assets, which includes hard copies of information, electronic files, removable media, mobile devices and intangibles, such as intellectual property.
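As an illustration of an asset-based assessment that ties together a risk scale and a risk appetite, here is a minimal sketch. ISO 27001 does not prescribe these numbers; the 1-5 scale, the appetite threshold and the example assets are all assumptions:

```python
# Illustrative asset-based scoring: risk = likelihood x impact, each on a
# 1-5 scale, compared against a declared risk appetite.
RISK_APPETITE = 9   # scores above this threshold require treatment

assets = [
    {"asset": "customer database",   "likelihood": 4, "impact": 5},
    {"asset": "public brochure PDF", "likelihood": 3, "impact": 1},
]

for a in assets:
    a["score"] = a["likelihood"] * a["impact"]
    a["treat"] = a["score"] > RISK_APPETITE

for a in assets:
    print(f'{a["asset"]:<20} score={a["score"]:>2}  treat={a["treat"]}')
```

Whatever methodology is chosen, the point the article makes still holds: the scale and the appetite must be defined up front so that results are comparable across the whole organisation.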


Friendly reminder: Biometrics are not the best way to secure your phone

If there’s ever an appropriate time to call a gigantic tech conglomerate “red faced,” it’s probably now. In a terse statement released yesterday, Samsung acknowledged some clear cases and screen protectors can be used to bypass the fingerprint sensors on the Galaxy S10, Galaxy S10 Plus, Galaxy S10 5G, Galaxy Note 10, and Galaxy Note 10 Plus. You don’t need a 3D printer, super-high-res camera, latex molds, or any cloak-and-dagger nonsense. A dirt-cheap phone case is all you need to unlock someone’s Samsung flagship. It’s hard to excuse this massive breach of trust, and it’s even harder to understand why Samsung has so far failed to apologize to customers. Yet, this embarrassing mishap isn’t that surprising in the scheme of things. The truth is, fingerprints and other biometric authentication methods are flawed. You shouldn’t rely on them if you actually care about mobile security. PINs and passwords are much more secure — if less convenient — methods of authentication.


Six Reasons Why We Need to Take a Collaborative Approach to AI Development

Wisdom of the crowd is a well-known phenomenon, yet companies still focus on selecting and working with highly specialized people at the top, on the assumption that they have better and more accurate knowledge. That was partially true when knowledge was not easily accessible to all. However, with online education, knowledge, especially in the field of AI, is now accessible to everyone. One does not have to go to a university to learn the same things someone at Stanford or MIT can learn. This has flattened the playing field and led to the democratization of AI. ... ‘What seems chaos at first, one starts to appreciate the experience and realize the relevance of working in self-organizing environment, where crucial conclusions and optimal solutions eventually emerge as winners. It is very well welcomed and valuable experience. It makes me proud to be a part of it.’- Vjeko Hofman, 10+ years of experience as a Software Engineer.


These are all the ways that remote working is stressing you out


One of the reasons for this could be the “out of sight, out of mind” mentality that’s commonplace toward remote workers, which leads to a lack of trust, feelings of being an outsider, and a tendency for people to think their colleagues are talking negatively about them behind their back. One study of 1,100 workers found that the 52% who worked from home at least some of the time were more likely to feel left out and mistreated, as well as unable to deal with conflict between themselves and colleagues. Navigating sensitive territory in a virtual team is an essential skill. If we’re not careful, issues can fester. Emails can be misinterpreted as being rude or too direct. And, with no visible body language, it’s tricky to convey our true meanings. In a virtual environment there is a tendency to focus too much on tasks and too little on relationships. This kind of transactional leadership can be the route taken by leaders who want to get the job done but fail to recognize how important the people are who are completing these tasks.


Are We Doomed to Repeat the Past When it Comes to Hacking?

On an almost weekly basis, another organization or government agency owns up to having been “hacked” – admitting that its systems have been breached. For every company that discloses an issue, there are likely 20 to 30 more that keep it under wraps. We know this because more than half of all U.S. businesses have been hacked. The attacker may have removed sensitive personal data or trade secrets for later sale on the dark web, or sought to disrupt operations, causing negative reputational and financial impact. But regardless of attacker motivation, cybercrime is predicted to cost the world $6 trillion annually by 2021. George Santayana gave us the great quote, “Those who cannot remember the past are condemned to repeat it.” Unfortunately, we haven’t been particularly good students of history – at least in terms of protecting our critical infrastructure from hackers. ... Known as “cyberhardening,” this method prevents a single exploit from propagating across multiple systems. It shrinks attack surfaces, eliminates vulnerabilities, and stops malware from being executed.


The Fundamentals of Cyber Risk Management


“Organisations should invest in identifying and updating their crown jewels (using parameters like whether the assets are internet exposed, hosted on cloud, managed by a third party, etc.) and associated threats at regular intervals.” This includes “a comprehensive view of where all key information assets reside, whether self-managed or managed by a third party, including any unstructured data (e.g., spreadsheets, documents, PDFs, emails, etc.). This should include risk classification and level of granularity, appropriate to the entity’s size and complexity, plus current risk controls.” You’ll notice that Mistry homes in on an important asset management aspect of cyber-risk: third party vendors. “Organisations should implement a third party tiering system which can be created using parameters like the location of services, number of records accessed, data type accessed, etc,” he says. “Tiering of the third parties can help determine the frequency of controls assessment and the level of evidence required to ascertain the security posture of third parties.”
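Mistry's parameter-based tiering could be sketched as a simple rule set. The tiers, thresholds and data-type labels below are illustrative assumptions, not a prescribed scheme:

```python
def vendor_tier(records_accessed: int, data_type: str, offshore: bool) -> int:
    """Assign a third-party review tier (1 = highest scrutiny).

    Tiers, thresholds and data-type labels are illustrative assumptions."""
    sensitive = data_type in {"pii", "financial", "health"}
    if sensitive and (records_accessed > 100_000 or offshore):
        return 1
    if sensitive or records_accessed > 10_000:
        return 2
    return 3

# The tier would then drive how often controls are assessed and how much
# evidence is demanded, as Mistry suggests.
assert vendor_tier(500_000, "pii", offshore=False) == 1
assert vendor_tier(50_000, "marketing", offshore=True) == 2
assert vendor_tier(1_000, "marketing", offshore=False) == 3
```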


U.S. Financial Services Cyber Security Market Insight & Future Assessment

This report is a resource for executives with interests in the cyber security industry. It has been explicitly customized for cyber security industry and financial services decision-makers to identify business opportunities, developing technologies, market trends and risks, as well as to benchmark business plans. Considering the economic and business implications of cyber attacks, it has now become mandatory for the financial industry to significantly increase its investments in state-of-the-art cyber security technologies, solutions, and outsourced services to detect, prevent, analyze and resolve the epidemic of financial cyber crime. According to the "U.S. Financial Services: Cybersecurity Systems & Services Market - 2016-2020" report, the U.S. financial institutions cyber security market is the largest and fastest growing private sector cyber security market. Its cumulative 2016-2020 market size is forecasted to exceed $68 Billion.


Guerrilla Analytics – how to deliver analytics in the cut & thrust of business

“Guerrilla Analytics is data analytics performed in a very dynamic project environment that presents the team with varied and frequent disruptions and constrains the team in terms of the resources they can bring to bear on their analytics problem.” Most analytics leaders I know can relate to that reality, whether or not they formally work in an Agile environment. Enda goes on to outline, in useful detail, the risks and challenges that such an approach needs to address, demonstrating why more than a general CRISP-DM approach is needed as a working methodology. As a foundation for the rest of this book he then explains the 7 principles of Guerrilla Analytics. These cover practical day-to-day decisions about storage, documentation, automation, auditable work, knowledge management and code design. ... Starting with Data Extraction, Enda shows how his 7 principles can be applied to improve practice. Some of the examples get into very specific detail, but the themes and lessons learnt help keep this from becoming too technical or distracting.


Data management strategies are evolving – so must enterprises


“Data has been placed in higher value than oil, and we, as humans, create more than 2.5 exabytes of data every single day,” says Tim Galligan ... “This data boom has truly created a need in the enterprise to secure and manage its own data. The more data we have to manage, the trickier it becomes to properly govern who is accessing that data, what we’re doing with it and how secure it actually is.” Because of the increased data-sharing capabilities within organisations, governance, risk and compliance (GRC) is now evolving into integrated risk management (IRM), which provides a far more holistic approach to an organisation’s data security procedures. “The pervasive nature of sensitive data, along with the related security and privacy issues it brings, is a driver of this movement,” says Doug Wick, vice-president of product at ALTR. IRM is a set of procedures that enables a risk-aware organisation to use technology and strategy to speed up decision-making and performance, through human intervention and automated playbooks – used to define a scenario and then create actions from it to play out the process – to prevent a situation from spreading.
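The automated playbooks the passage describes boil down to mapping a defined scenario onto a sequence of actions. The sketch below is a minimal illustration of that idea; the scenario name and the actions are assumptions, not taken from any specific IRM product.

```python
# A minimal IRM-style playbook: a named scenario triggers a predefined,
# ordered set of containment actions.

PLAYBOOKS = {
    "suspicious_data_access": [
        "revoke_session",
        "notify_data_owner",
        "open_incident_ticket",
    ],
}

def run_playbook(scenario: str) -> list:
    """Return the ordered actions executed for a scenario (empty if unknown)."""
    executed = []
    for action in PLAYBOOKS.get(scenario, []):
        # In a real system each action would call out to an API or ticketing
        # tool; here we just record what would be done.
        executed.append(action)
    return executed

print(run_playbook("suspicious_data_access"))
```

The value of codifying responses this way is speed and consistency: the same scenario always triggers the same containment steps, without waiting on a human to remember them.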



Quote for the day:

"The smartest people in the meeting are the ones that don't say anything until they have something of value to say." -- @LeadToday

Daily Tech Digest - October 19, 2019

Lip-Reading Drones, Emotion-Detecting Cameras: How AI Is Changing The World


Specific lip-reading programs can decipher what people are saying from a distance, while gait-analysis software can identify an individual just by the way they walk. "Even if the drone is at 300ft, it can still operate effectively,” Dronestream CEO Harry Howe said. While these particular drones are still in the testing phase, many intrusive technologies are being used around the country. Take China, for example. Its Skynet system claims it can scan all 1.3 billion citizens within seconds. There are 200 million cameras scattered around the country which can track identity thieves, find fugitives, catch sleeping students and spot jaywalkers. This particular surveillance system led to 2,000 arrests from 2016 to 2018. Countries like Malaysia, Jamaica, Germany and Poland are considering installing similar systems, while a number of facial recognition trials have been conducted right here on Australian soil.



7 mistakes that ISO 27001 auditors make

Checklists are a great way of quickly assessing whether a list of requirements has been met, but what they offer in convenience they lack in analytical depth. Organisations are liable to see that a requirement has been ticked off and assume that it’s ‘mission accomplished’. However, there may still be room to improve your practices, and it might even be the case that your activities aren’t necessary. A good auditor will use the checklist as a summary at the beginning or end of their audit, with a more detailed assessment in their report, or they’ll use a non-binary system that doesn’t restrict them to stating that a requirement either has or hasn’t been met. ...  In theory, they are a perfect fit. You already have a working relationship and you’ll save time finding a consultant and bringing them up to speed on your organisation’s needs. Unfortunately, there’s clearly a conflict of interest in this relationship, as you run the risk of allowing the auditor to manipulate their findings to persuade you to use them as a consultant.
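The non-binary rating described above can be as simple as a maturity scale per requirement instead of a pass/fail tick. The scale labels and the "defined" threshold below are illustrative assumptions, not part of ISO 27001 itself.

```python
# A toy non-binary audit summary: each requirement gets a maturity label
# rather than a tick, and anything below "defined" is flagged for work.

MATURITY = ["absent", "initial", "defined", "managed", "optimised"]

def summarise_audit(findings: dict) -> dict:
    """Map each requirement (label -> 0..4) to a maturity label and a flag."""
    summary = {}
    for requirement, level in findings.items():
        summary[requirement] = {
            "maturity": MATURITY[level],
            "needs_work": level < MATURITY.index("defined"),
        }
    return summary

result = summarise_audit({"A.5 policies": 3, "A.8 asset mgmt": 1})
print(result)
```

Even this crude scale makes the auditor's point: "A.8 asset mgmt" is not failing outright, but it clearly has further to go than a binary tick would show.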


Looking at the Enterprise Architecture Renaissance

In their enterprise architecture report, Ovum looked at the paradigm shift going on now that’s responsible for transforming EA into "architect everything" (AE). They reviewed seven EA solutions that have begun the transition from EA to AE. Interestingly, Ovum found that the vendors shared a similar idea on the direction that EA should move toward. Most regarded non-EA features that help with business modeling, business process mapping and analysis, GRC, and portfolio management to be standard features that EA platforms should include in their solutions. ... Today’s enterprise architecture approach needs to promote stronger collaboration and teamwork throughout the organization, so that everyone is on the same page with regard to company goals and desired outcomes. One example of an EA platform that does this effectively is Planview Enterprise One. Planview Enterprise One comes with collaboration and workflow tools that enable process and project-driven work. Elements like Kanban boards and collaborative workspaces make it easy to bring stakeholders and contributors together under one roof, where they can share information and work together to push the company forward.


Top 6 email security best practices to protect against phishing attacks ...


Complicated email flows can introduce moving parts that are difficult to sustain. As an example, complex mail-routing flows to enable protections for internal email configurations can cause compliance and security challenges. Products that require unnecessary configuration bypasses to work can also cause security gaps. As an example, configurations that are put in place to guarantee delivery of certain types of email (e.g. simulation emails) are often poorly crafted and exploited by attackers. Solutions that protect emails (external and internal) and offer value without needing complicated configurations or email flows are a great benefit to organizations. In addition, look for solutions that offer easy ways to bridge the gap between the security teams and the messaging teams. Messaging teams, motivated by the desire to guarantee mail delivery, might create overly permissive bypass rules that impact security. The sooner these issues are caught the better for overall security. Solutions that offer insights to the security teams when this happens can greatly reduce the time taken to rectify such flaws, thereby reducing the chances of a costly breach.
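The "overly permissive bypass rule" problem the passage warns about is mechanical enough to lint for. The sketch below is a hypothetical simplification: the rule structure (name, sender pattern, skip flag) is an assumption, not any mail platform's actual API.

```python
# Flag mail-flow bypass rules that skip filtering for an entire domain or
# for all senders, rather than for one specific, known sender address.

def risky_bypass_rules(rules: list) -> list:
    """Return the names of rules whose sender scope is too broad."""
    flagged = []
    for rule in rules:
        if not rule.get("skip_spam_filter"):
            continue
        sender = rule.get("sender")
        if sender in ("*", None) or sender.startswith("*@"):
            flagged.append(rule["name"])
    return flagged

rules = [
    {"name": "allow-simulations", "sender": "sim@phishtest.example", "skip_spam_filter": True},
    {"name": "allow-partner-domain", "sender": "*@partner.example", "skip_spam_filter": True},
]
print(risky_bypass_rules(rules))
```

A check like this, run regularly over exported rules, gives the security team the early warning the passage asks for without waiting for an attacker to find the gap first.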


How operators can make 5G pay


Some operators have started to partner with over-the-top (OTT) service providers to bundle their offerings with connectivity subscriptions, sometimes with an explicit charge and sometimes without (for example, by making certain streams unmetered against the customer’s data bundle). “With the improvements in network capabilities in the 5G era, customers can expect to enjoy more network services bundled with content provider services — including accelerated gaming — and the operator could offer its network service to the customer as part of that bundle,” said a senior executive at an Asian Internet player. In the 5G world, in which the network technology allows a far greater range of functionality that can be monetized, telecom companies have many more opportunities to develop collaborations with a variety of businesses and public agencies. We see four main options for how operators could monetize this greater functionality. The higher the relevance of the telecom operator’s brand to the use case, the greater the operator’s ability to own the customer relationship and claim a bigger share of revenues.


Beyond their value in ensuring consistent, predictable service delivery, SLOs are a powerful weapon to wield against micromanagers, meddlers, and feature-hungry PMs. That is why it’s so important to get everyone on board and signed off on your SLO. When they sign off on it, they own it too. They agree that your first responsibility is to hold the service to a certain bar of quality. If your service has deteriorated in reliability and availability, they also agree it is your top priority to restore it to good health. Ensuring adequate service performance requires a set of skills that people and teams need to continuously develop over time, namely: measuring the quality of our users’ experience, understanding production health with observability, sharing expertise, keeping a blameless environment for incident resolution and post-mortems, and addressing structural problems that pose a risk to service performance. They require a focus on production excellence, and a (time) budget for the team to acquire the necessary skills. The good news is that this investment is now justified by the SLOs that management agreed to.
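The "bar of quality" everyone signs off on becomes actionable once it is expressed as an SLO with an error budget. The sketch below shows the arithmetic; the 99.9% target and 30-day window are illustrative choices, not prescriptions.

```python
# Error-budget arithmetic for an availability SLO: the budget is the small
# fraction of the window the service is *allowed* to be bad.

def error_budget(slo_target: float, window_minutes: int, bad_minutes: int) -> dict:
    """Compute how much of the error budget a service has burned."""
    budget = window_minutes * (1 - slo_target)   # allowed bad minutes
    remaining = budget - bad_minutes
    return {
        "budget_minutes": round(budget, 1),
        "remaining_minutes": round(remaining, 1),
        "exhausted": remaining < 0,
    }

# 99.9% over a 30-day window (43,200 minutes) allows ~43.2 bad minutes.
print(error_budget(0.999, 43_200, 50))
```

When the budget is exhausted, the agreement described above kicks in: restoring reliability becomes the team's top priority, and everyone who signed off on the SLO has already accepted that trade-off.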


How open source software is turbocharging digital transformation


Make no mistake, the ever-expanding palette of vendor solutions on the market today remains an indispensable resource for enterprise-scale digital transformation. But there are compelling reasons to explore OSS’s possibilities as well. For example, OSS in emerging technology domains often includes work contributed by highly creative developers with hard-to-find expertise. By exploring OSS projects for artificial intelligence (AI), blockchain, or other trending technologies, companies that lack in-house experts can better understand what the future holds for these disruptive tools. Moreover, CIOs are realizing that when coders can engage with domain experts and contribute their own work to an OSS ecosystem, job satisfaction and creativity often grow, along with engineering discipline, product quality, and efficiency. As any software engineer knows, the ability to take established and tested code from an existing library, rather than having to create it from scratch, can shrink development timelines significantly. These findings spotlight OSS’s formidable promise. But they also make clear that open source is not an all-or-nothing proposition. IT leaders should think of OSS as a potentially valuable complement to their broader ecosystem, vendor, or partner strategy.


Yubico security keys can now be used to log into Windows computers


Starting today, users can use hardware security keys manufactured by Swedish company Yubico to log into a local Windows OS account. After more than six months of testing, the company released today the first stable version of the Yubico Login for Windows application. Once installed on a Windows computer, the application will allow users to configure a Yubico security key (known as a YubiKey) to secure local Windows OS accounts. The Yubico key will not replace the Windows account password but will work as a second authentication factor. Users will have to enter their password, and then plug a Yubico key into a USB port to finish the login process. Yubico hopes the keys will be used to secure high-value computers storing sensitive data that are used in the field, away from secured networks. Such devices are often susceptible to theft or getting lost. If the devices are not encrypted, attackers have various ways at their disposal to bypass normal Windows password-based authentication. Securing local Windows accounts with a YubiKey makes it nearly impossible for an attacker to access the account, even if they know the password.


The Fallacy of Telco Cloud

First, we proved the viability of virtualizing Telco workloads, with the investment in defining Network Function Virtualization (NFV) and a global set of trials, beginning in and around the first ETSI NFV working group meeting in 2012. Then, we focused on the optimization of that virtualization technology – investment in Virtual Infrastructure Managers (VIMs), I/O acceleration technologies like the Data Plane Development Kit (DPDK), and para-virtualization technologies, such as Single Root Input/Output Virtualization (SR-IOV), for performance and manageability of SLA-backed network functions. Now, we’ve embarked on the next set of technology advancements: separating control and user planes, accelerating I/O functions with FPGAs and SmartNICs, and starting the migration of applications towards containers and cloud-native functions. This is the beginning of a second wave of technology-led investments into the Telco Cloud. ... In short, the technology is mature. The real question is – are we actually achieving the benefits of cloud in the Telco network?


Challenges of Data Governance in a Multi-Cloud World


The traditional contracts that worked in typical telecom network services to mitigate security breaches or other types of noncompliance events have failed to deliver the goods for the cloud. Highly scaled, shared, and automated IT platforms such as the cloud can hide the geographic location of data — both from the customer and the service provider’s sides. This can give rise to regulatory violations. Thus, contracting for the cloud is still in its infancy, and until litigation sheds light on regulatory issues and sets precedents for future cases, the data-cloud breach issues will remain unresolved. Moreover, data aggregation will increase the potential data risk as more valuable data will occupy the common storage location. On the flip side, multi-cloud environments offer more transparency through event logging, and enterprise-wide solutions via automation tools. Once issues are detected, solutions can be instantly deployed across cloud networks. In recent years, risk management strategies specifically for the cloud have emerged, and these just have to be tested for multi-cloud environments.



Quote for the day:


"There are some among the so-called elite who are overbearing and arrogant. I want to foster leaders, not elitists." -- Daisaku Ikeda


Daily Tech Digest - October 18, 2019

Critical PDF Warning: New Threats Leave Millions At Risk

The PDFex vulnerability exploits ageing technology which was not designed with contemporary security considerations in mind. In essence, it takes advantage of the very universality and portability of the PDF format. And while it might seem like a fairly specific attack, most companies rely on secured PDF documents for the transmission of contracts, board papers, financial documents and transactional data. There is an expectation that such documents are secure. Clearly, they are not. The PDFex attack is designed to exfiltrate the encrypted data to the attacker when the document is opened with a password—being decrypted in the process. The PDFex researchers, “in cooperation with the national CERT section of BSI,” have contacted all vendors, “provided proof-of-concept exploits, and helped them fix the issues.” Of even more concern are the multiple vulnerabilities that have been disclosed and which impact the popular Foxit Reader PDF application specifically—Foxit claims it has 475 million users. Affecting Windows versions of Foxit’s reader, the vulnerabilities enable remote code execution on a target machine.


Much-attacked Baltimore uses ‘mind-bogglingly’ bad data storage

After the attack in May, Baltimore Mayor Bernard C. “Jack” Young not only refused to pay, he also sponsored a resolution, unanimously approved by the US Conference of Mayors in June 2019, calling on cities to not pay ransom to cyberattackers. Baltimore’s budget office has estimated that due to the costs of remediation and system restoration, the ransomware attack will cost the city at least $18.2 million: $10 million on recovery, and $8.2 million in potential loss or delayed revenue, such as that from property taxes, fines or real estate fees. The Robbinhood attackers had demanded a ransom of 13 Bitcoins – worth about US $100,000 at the time. It may sound like a bargain compared with the estimated cost of not caving to attackers’ demands, but paying a ransom doesn’t ensure that an entity or individual will actually get back their data, nor that the crooks won’t hit up their victim again. The May attack wasn’t the city’s first; nor was it the first time that its IT systems and practices have been criticized in the wake of attack.


'The Dukes' (aka APT29, Cozy Bear) threat group resurfaces with three new malware families

According to researchers, three new malware samples, dubbed FatDuke, RegDuke and PolyglotDuke, have been linked to a cyber campaign most likely run by APT29. The most recent deployment of these new malware families was tracked in June 2019. The ESET researchers have named all activities of APT29 (past and present) collectively as Operation Ghost. This cyber campaign has been running since 2013 and has successfully targeted the Ministries of Foreign Affairs in at least three European countries. The researchers compared the techniques and tactics used by APT29 in its recent attacks to those used in the group's older attacks. They found many similarities in these campaigns, including the use of Windows Management Instrumentation for persistence, use of steganography in images to hide communications with Command and Control (C2) servers, and use of social media, such as Reddit and Twitter, to host C2 URLs. The researchers also found similarities in the targets hit during the newer and older attacks - ministries of foreign affairs.


Misconfigured Containers Open Security Gaps

The knowledge gap surrounding security risks and the blunders it causes are, by far, the biggest threat to organizations using containers, observed Amir Jerbi, co-founder and CTO of Aqua Security, a container security software and support provider. "Vulnerabilities in container images -- running containers with too many privileges, not properly hardening hosts that run containers, not configuring Kubernetes in a secure way -- any of these, if not addressed adequately, can put applications at risk," he warned. Examining the security incidents targeting containerized environments over the past 18 months, most were not sophisticated attacks but simply the result of IT neglecting basic best practices, he noted. ... While most container environments meet basic security requirements, they can also be more tightly secured. It's important to sign your images, suggested Richard Henderson, head of global threat intelligence for security technology provider Lastline. "You should double-check that nothing is running at the root level."
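The misconfigurations listed above (over-privileged containers, workloads running as root, unpinned images) are the kind of thing a simple policy check can catch before deployment. The config shape below is a simplified stand-in for a real container or pod spec, chosen only for illustration.

```python
# Toy policy check over a simplified container spec, flagging the basic
# hygiene failures the article describes.

def audit_container(spec: dict) -> list:
    """Return a list of policy violations for a container spec."""
    problems = []
    if spec.get("privileged"):
        problems.append("runs privileged")
    if spec.get("user", "root") == "root":
        problems.append("runs as root")
    if "@sha256:" not in spec.get("image", ""):
        problems.append("image not pinned by digest")
    return problems

print(audit_container({"image": "nginx:latest", "privileged": True}))
```

In practice the same checks are better enforced by an admission controller or CI gate, so that a spec failing them never reaches the cluster at all.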


Microsoft and Alibaba Back Open Application Model for Cloud Apps


OAM is a standard for building native cloud applications using "microservices" and container technologies, with a goal of establishing a platform-agnostic approach. It's kind of like the old "service-oriented architecture" dream, except maybe with less complexity. The OAM standard is currently at the draft stage, and the project is being overseen by the nonprofit Open Web Foundation. Microsoft apparently doesn't plan to keep it at the Open Web Foundation indefinitely, as its "goal is to bring the Open Application Model to a vendor-neutral foundation," the announcement explained. Additionally, Microsoft and Alibaba Cloud disclosed that there's an OAM specification specifically designed for Kubernetes, the open source container orchestration solution for clusters originally fostered by Google. This OAM implementation, called "Rudr," is available at the "alpha" test stage and is designed to help manage applications on Kubernetes clusters. ... Basic OAM concepts can be found in the spec's description. It outlines how the spec will account for the various roles involved with building, running and porting cloud-native apps.


Why AI Ops? Because the era of the zettabyte is coming.


“It’s not just the amount of data; it’s the number of sources the data comes from and what you need to do with it that is challenging,” Lewington explains. “The data is coming from a variety of sources, and the time to act on that data is shrinking. We expect everything to be real-time. If a business can’t extract and analyze information quickly, they could very well miss a market or competitive intelligence opportunity.” That’s where AI comes in – a term coined by computer scientist John McCarthy in 1956. He defined AI as “the science and engineering of making intelligent machines.” Lewington thinks that the definition of AI is tricky and malleable, depending on who you talk to. “For some people, it’s anything that a human can do. To others, it means sophisticated techniques, like reinforcement learning and deep learning. One useful definition is that artificial intelligence is what you use when you know what the answer looks like, but not how to get there.” No matter what definition you use, AI seems to be everywhere. Although McCarthy and others invented many of the key AI algorithms in the 1950s, the computers at that time were not powerful enough to take advantage of them.


Fake Tor Browser steals Bitcoin from Dark Web users


Purchases in these marketplaces are usually made using cryptocurrency such as Bitcoin (BTC) in order to mask the transaction and the user's identity. If a user visits these domains and tries to make a purchase by adding funds to their wallet, the script activates and attempts to change the wallet address, thereby ensuring funds are sent to an attacker-controlled wallet instead. The payload will also try to alter wallet addresses offered by Russian money transfer service QIWI. "In theory, the attackers can serve payloads that are tailor-made to particular websites. However, during our research, the JavaScript payload was always the same for all pages we visited," the researchers say. It is not possible to say how widespread the campaign is, but the researchers say that PasteBin pages promoting the Trojanized browser have been visited at least half a million times, and known wallets owned by the cybercriminals have 4.8 BTC stored -- equating to roughly $40,000. ESET believes that the actual value of stolen funds is likely to be higher considering the additional compromise of QIWI wallets.


Server Memory Failures Can Adversely Affect Data Center Uptime

The Intel® MFP deployment resulted in improved memory reliability due to predictions based on the capture of micro-level memory failure information from the operating system’s Error Detection and Correction (EDAC) driver, which stores historical memory error logs. Additionally, by predicting potential memory failures before they happen, Intel® MFP can help improve DIMM purchasing decisions. As a result, Tencent was able to reduce annual DIMM purchases by replacing only DIMMs that have a high likelihood to cause server crashes. Because Intel® MFP is able to predict issues at the memory cell level, that information can be used to avoid using certain cells or pages, a feature known as page offlining, which has become very important for large-scale data center operations. Tencent was therefore able to improve their page offlining policies based on Intel® MFP’s results. Using Intel® MFP, server memory health was analyzed and given scores based on cell-level EDAC data.
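Intel® MFP's model is proprietary, but the raw signal it consumes, EDAC corrected-error counts, can be watched with a much cruder heuristic: flag DIMMs whose corrected-error count keeps climbing. The growth threshold below is an illustrative assumption, not Intel's algorithm, and the DIMM labels are just examples of EDAC's mc/csrow naming.

```python
# Crude memory-health heuristic over time-ordered EDAC corrected-error
# (ce_count) samples: a DIMM whose count grows quickly is a replacement
# candidate before it starts producing uncorrectable errors.

def flag_suspect_dimms(ce_history: dict, min_growth: int = 10) -> list:
    """ce_history maps a DIMM label to a time-ordered list of ce_count samples."""
    suspects = []
    for dimm, counts in ce_history.items():
        if counts[-1] - counts[0] >= min_growth:
            suspects.append(dimm)
    return sorted(suspects)

history = {
    "mc0/csrow0": [0, 1, 2],      # stable: a few corrected errors
    "mc0/csrow2": [5, 40, 120],   # accelerating corrected errors
}
print(flag_suspect_dimms(history))
```

This is exactly the purchasing logic the passage credits to Tencent, just without the cell-level model: replace only the DIMMs trending toward failure rather than whole batches.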


Three Keys To Delivering Digital Transformation

More digitally mature organisations are beginning to view digital transformation as not just an internal technology infrastructure upgrade. It is more than an opportunity to move costly in-house capabilities to the cloud, or shift sales and marketing to online multi-channel provision. The focus today is on a more fundamental review of business practices, a realignment of operations toward core values, and a stronger relationship between creators and consumers of services. Within this context, digital modernisation programmes taking place across many organisations are accelerating the digitisation of their core assets, rebalancing spending toward digital engagement channels, fixing flaws in their digital technology stacks, and replacing outdated technology infrastructure with cloud-hosted services. Such programmes are essential for organisations to remain competitive and relevant in a world that increasingly rewards those that can adapt quickly to market changes, raise the pace of new product and service delivery, and maintain tight stakeholder relationships.


Virtual voices: Azure's neural text-to-speech service


Microsoft Research has been working on solving this problem for some time, and the resulting neural network-based speech synthesis technique is now available as part of the Azure Cognitive Services suite of Speech tools. Using its new Neural text-to-speech service, hosted in Azure Kubernetes Service for scalability, generated speech is streamed to end users. Instead of multiple steps, input text is first passed through a neural acoustic generator to determine intonation before being rendered using a neural voice model in a neural vocoder. The underlying voice model is generated via deep learning techniques using a large set of sampled speech as the training data. The original Microsoft Research paper on the subject goes into detail on the training methods used, initially using frame error minimization before refining the resulting model with sequence error minimization. Using the neural TTS engine is easy enough. As with all the Cognitive Services, you start with a subscription key and then use this to create a class that calls the text-to-speech APIs.
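The request you send with that subscription key is just SSML. The sketch below only builds the SSML body, without making a network call; the voice name is an assumption and may differ from the neural voices available on your subscription and region.

```python
# Build an SSML request body for a text-to-speech call. The voice name
# "en-US-JessaNeural" is illustrative; check your region's voice list.

def build_ssml(text: str, voice: str = "en-US-JessaNeural") -> str:
    """Wrap plain text in a minimal SSML envelope for a named voice."""
    return (
        "<speak version='1.0' xml:lang='en-US'>"
        f"<voice name='{voice}'>{text}</voice>"
        "</speak>"
    )

ssml = build_ssml("Hello from the neural TTS engine.")
print(ssml)
```

In a real client this string would be POSTed to the service's synthesis endpoint with the subscription-derived token in the request headers, and the response streamed back as audio.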



Quote for the day:


"A person must have the courage to act like everybody else, in order not to be like anybody." -- Jean-Paul Sartre