Daily Tech Digest - August 23, 2019

Don’t worry about shadow IT. Shadow IoT is much worse

So far so good. But this reasoning emphatically does not carry over to the emerging practice of shadow IoT, which has become a growing concern in the last year or so. Basically, we are talking about when people in your organization add internet-connected devices (or worse, entire IoT networks!) without IT’s knowledge. Those renegades are likely seeking the same speed and flexibility that drove shadow IT, but they are taking a far bigger risk for a much smaller reward. Shadow IoT takes shadow IT to another level, with the potential for many more devices as well as new types of devices and use cases, not to mention the addition of wholly new networks and technologies. According to a 2018 report from 802 Secure, “IoT introduces new operating systems, protocols and wireless frequencies. Companies that rely on legacy security technologies are blind to this rampant IoT threat. Organizations need to broaden their view into these invisible devices and networks to identify rogue IoT devices on the network, visibility into shadow-IoT networks, and detection of nearby threats such as drones and spy cameras.”




GraphQL: Evolution of Modern Age Database Management System

Consider an API that is consumed by a responsive web application. Depending upon the device on which the app is opened, the data displayed on the screen will vary: some fields may be hidden on smaller screens, while all fields may be shown on devices with larger screens. The following infographic explains various pain points pertaining to managing the data returned when using REST. With GraphQL, it’s just a matter of querying the fields that are required for the device at hand. The front end is in control of the data it requests, and the same principle applies to any GraphQL API. With GraphQL, we can easily aggregate data from different sources and serve it to users under a single umbrella, rather than making multiple REST calls from the front end or back end. We can also expose a legacy system through a unified API. As mentioned above, every GraphQL API is composed of types, a schema and resolvers. To create an API, we need to create a GraphQL server in some language. As you know by now, GraphQL is a language in its own right with a specification, and the specification must be implemented in a programming language to be used.
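The idea that the client names exactly the fields it wants can be sketched in a few lines of plain Python. This is an illustrative toy, not a real GraphQL server (a real implementation would use a library such as graphene or Apollo, and the field names here are invented for the example):

```python
# Toy illustration of GraphQL's core idea: the client lists the fields it
# wants, and the server resolves only those fields.
USER_RESOLVERS = {
    "id":    lambda user: user["id"],
    "name":  lambda user: user["name"],
    "email": lambda user: user["email"],
    "posts": lambda user: [p["title"] for p in user["posts"]],
}

def resolve(user, requested_fields):
    """Return only the fields the client asked for."""
    return {f: USER_RESOLVERS[f](user) for f in requested_fields}

user = {
    "id": 1, "name": "Ada", "email": "ada@example.com",
    "posts": [{"title": "Hello"}, {"title": "GraphQL"}],
}

# A phone client might ask for a minimal payload...
print(resolve(user, ["id", "name"]))
# ...while a desktop client asks for everything in one round trip.
print(resolve(user, ["id", "name", "email", "posts"]))
```

The same shape scales up: in a real GraphQL server, each entry in the resolver map can fetch from a different backend, which is what makes the "single umbrella over many sources" pattern work.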


Cybercriminals Leveraging Evasion And Anti-Analysis Techniques To Avoid Detection

Many modern malware tools already incorporate features for evading antivirus or other threat detection measures, but cyber adversaries are becoming more sophisticated in their obfuscation and anti-analysis practices to avoid detection. For example, a spam campaign demonstrates how adversaries are using and tweaking these techniques against defenders. The campaign involves the use of a phishing email with an attachment that turned out to be a weaponized Excel document with a malicious macro. The macro has attributes designed to disable security tools, execute commands arbitrarily, cause memory problems, and ensure that it only runs on Japanese systems. One property that it looks for in particular, an Excel Date variable, seems to be undocumented. Another example involves a variant of the Dridex banking trojan which changes the names and hashes of files each time the victim logs in, making it difficult to spot the malware on infected host systems. The growing use of anti-analysis and broader evasion tactics is a reminder of the need for multi-layered defenses and behavior-based threat detection.


Security pros reiterate warning against encryption backdoors


“We know that encryption backdoors dramatically increase security risks for every kind of sensitive data, and that includes all types of data that affects our national security. The IT security community overwhelmingly agrees that encryption backdoors would have a disastrous impact on the integrity of our elections and on our digital economy as a whole.” Opponents of encryption backdoors have said repeatedly that government-mandated weaknesses in encryption systems put the privacy and security of everyone at risk, because the same backdoors can be exploited by hackers. The survey also shows that 70% of the Black Hat USA respondents believe countries with government-mandated encryption backdoors are at an economic disadvantage in the global marketplace, while 84% would never knowingly use a device or program from a company that agreed to install a backdoor. Bocek added: “On a consumer level, people want technology that prioritises the security and privacy of their personal data. ...”


For Sale on Cybercrime Markets: Real 'Digital Fingerprints'

The marketplace says it too has listings from vetted vendors selling "bot logs with fingerprints" for PayPal, Pornhub and Facebook accounts; U.S. and Canadian bank accounts; cryptocurrency hot and cold wallets; and AirBnB accounts. "All the logs we provide come with full fingerprint data," an advertisement for the site boasts. "This means exactly that. We provide all the cookie & browsing history in each log to ensure you have success with all your operations. Once you load the cookies onto your selected browser you will become the identical user from the log you have just purchased from us. This ensures a much higher success rate and anonymity for your business needs. We provide you with a free browser where you can just upload your purchased log and use it with simplicity." Registration for Richlogs costs $50, payable via bitcoin or monero, after which the site says a user will receive $100 in site credit. Log data starts at $1 per record. The site also says that it can give users real-time access to hacked PCs so they can use them to remotely emulate victims via a SOCKS5 proxy.


Open-source spyware makes it on the Google Play Store

"The malicious functionality in AhMyth is not hidden, protected, or obfuscated," said Lukáš Štefanko, malware researcher at ESET, who conducted the investigation into the malicious app. "For this reason, it is trivial to identify the Radio Balouch app - and other derivatives - as malicious and classify them as belonging to the AhMyth family." "Nothing special was used to bypass either Google's IP or postpone the malicious function. I think it wasn't detected because users first had to set up the app - set the language, allow permissions, go through a couple of 'next' buttons, for an app overview and only then would the malicious code be launched," he told ZDNet. Štefanko said ESET spotted two instances of the malware being uploaded on the Play Store, one on July 2, and the second on July 13. Both were removed within a day, but only after they contacted the Play Store staff. While the two apps never managed to get more than 100 installs, the problem here was the fact that they ended up on the Play Store using nothing more than unobfuscated open-source code.


How IT departments can upskill in the new economy


Working in the gig economy works for both small businesses and startups, and large enterprises and public sector organisations. Yorkshire Water is one of the businesses mentioned in the TopCoders report. The water utility firm opened up 12 months of its data through the Leeds Open Data Institute to crowd-source the discovery of new trends or patterns. According to Yorkshire Water, it received a number of interesting submissions, such as an app proposal to use artificial intelligence (AI) to automate the recognition of leak noise, and a Fitbit-like device for monitoring water usage in household water pipes. New research has found that crowd-sourcing ideas for the smart use of public sector data offers a huge economic benefit. In July, the European Union (EU) reported that the total direct economic value of the data held in the public sector is expected to increase from a baseline of €52bn in 2018 to €194bn in 2030. Yorkshire Water recently sponsored a hackathon, in which software development teams were invited to take part in a competition focused on ideas for using open data to create a county-wide data dashboard for Yorkshire.


How to protect yourself and your organization against digital identity fraud


Continuously monitor digital identity markets. Monitoring these markets can help you identify compromised identities early so you can more diligently monitor traffic and/or enhance the verification methods for user logins. "Organizations can use services of threat intelligence companies such as IntSights to monitor their assets on markets such as Richlogs and Genesis," Ariel Ainhoren, head of research for IntSights, told TechRepublic. "Getting to them is not too complicated, and with decent protection measures such as a VPN, it is not too risky either. But most small-medium organizations won't have the time to invest in monitoring these markets and the other dark web activity. They will usually notice that something is wrong only after threat actors make use of these identities. This is what makes digital identity fraud and these markets so dangerous." Enable two-factor authentication. Asking for a second (or even third) variable to authenticate your users makes it more difficult for hackers to access your accounts. You might adopt a form of mobile verification or ask security questions that only the user would know.
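The most common "second variable" in practice is a time-based one-time password (TOTP), the algorithm behind most authenticator apps. As a sketch of how little machinery it needs, here is a minimal standard-library implementation of HOTP (RFC 4226) and TOTP (RFC 6238); a production system would add secret provisioning, clock-drift windows, and rate limiting:

```python
# Minimal TOTP sketch: an HMAC-SHA1 one-time password over a 30-second
# time window. Standard library only.
import hmac, struct, time, hashlib

def hotp(secret, counter, digits=6):
    """HMAC-based one-time password (RFC 4226)."""
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret, at=None, step=30):
    """Time-based OTP (RFC 6238): HOTP over the current time window."""
    counter = int((time.time() if at is None else at) // step)
    return hotp(secret, counter)

secret = b"12345678901234567890"   # the RFC 4226 test secret
print(hotp(secret, 0))             # RFC 4226 test vector: 755224
```

Both the server and the user's authenticator app compute the same code from the shared secret and the clock, so an attacker who has only the password (or only a purchased cookie log) still fails the login.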


Cloud Security: Mess It Up and It's on You


The majority of incidents are the result of errors and misconfigurations by cloud service providers' clients, Heiser says. That's due in part to how fundamental security practices of the past don't apply to cloud computing models. Take Capital One, a financial services company that's been one of the most aggressive in that vertical in embracing the cloud. A Seattle woman, Paige A. Thompson, is accused of taking advantage of a misconfigured firewall to gain access to more than 100 million credit card applications going back to 2003. She allegedly gained access credentials for a role with expansive permissions, including the area where the credit card applications were stored. AWS, Capital One's service provider, has indicated the problem wasn't on its side. "Security in the cloud is your responsibility, whether or not your cloud service provider makes it easy for you," Heiser says. "Cloud service providers do a great job of drawing the line between their responsibility and your responsibility. And so far, AWS has done a great job of always blaming their customers. So be forewarned."


Three Tips for Laying the Groundwork for Machine Learning

Despite on-premises IT infrastructure’s ability to host many open-source frameworks to create ML solutions, many organizations still lack the power and scalability to support them. If an organization is evaluating ML for a project, hyperscale cloud might be a good option to consider, since it offers consumption-based access to graphics processing unit (GPU) compute, which can dramatically accelerate the process of training a deep learning algorithm. Once the requirement moves from batch analysis to real time, the flow of relevant data must keep pace with ML algorithms working in near real-time. Ensuring that workloads are supported throughout a project’s lifecycle and organizations have the ability to experiment with ML capabilities is essential, and cloud elasticity can be used to address that. It has never been easier for organizations to expand into the cloud, as the big three public cloud providers -- AWS, Microsoft and Google -- all fight for ML business. Despite this, organizations still lag behind in exploiting the elastic scalability of the cloud to derive value from their organization’s data with ML.



Quote for the day:


"The final test of a leader is that he leaves behind him in other men, the conviction and the will to carry on." -- Walter Lippmann


Daily Tech Digest - August 22, 2019

Fake VPN Website Delivers Banking Trojan

Researchers at the Russian security firm Doctor Web say the banking Trojan, which they call Win32.Bolik.2, is a modified version of the original Win32.Bolik.1 Trojan and is being spread by the same unknown hacker group. "The Win32.Bolik.2 Trojan is an improved version of Win32.Bolik.1 and has qualities of a multicomponent polymorphic file virus," the researchers write. "Using this malware, hackers can perform web injections, traffic intercepts, key-logging and steal information from different bank client systems." The new campaign, which started on Aug. 8, is targeting mainly English-speaking victims by using a cloned website of NordVPN that prompts its visitors to download a program for a particular type of VPN software that contains the Trojan, Doctor Web reports. To make the fake website seem authentic, the fraudsters use a valid Secure Sockets Layer certificate that ensures a secure connection between a web server and a browser. Another campaign in April using the same Trojan spread via the legitimate website for a video editing software package.


Big Switch targets shadow IT, hybrid cloud growth with fortified software family

Big Switch’s flagship BCF software lets customers manage physical switches as a single fabric that includes security, automation, orchestration and analytics. BCF can run on a variety of certified switches from Dell EMC, HPE and others. In addition, BCF Controller natively supports integration with various Cloud Management Platforms such as VMware (vSphere, NSX Manager, vSAN) and OpenStack. BCF also supports container orchestrators such as Kubernetes, all via a single interface.  “We believe that traditional networking is complicated but that networking in the cloud is easier using technology such as VPCs, and what we are doing is bringing cloud-networking principles to on-prem and the data center,” said Prashant Gandhi, chief product officer for Big Switch. With BCF for AWS customers can discover, determine configurations and troubleshoot all VPCs and workloads configured in AWS, Gandhi said. “One of the benefits to BCF for AWS is that many customers deploying workloads in clouds don’t know everything that’s being run or shared in the cloud – we can discover those shadow IT networks and help IT get a handle on them,” Gandhi said.


The Design Thinking Process: Five Stages to Solving Business Problems

There are lots of ways to harness ideas and solve problems. Design thinking is one means to foster and refine creative problem-solving. While it doesn’t suggest ignoring your data, design thinking is, at its core, human-centered. It encourages organizations to focus on the people they’re creating for in hopes of producing better products, services and internal processes. ... The best way to put design thinking into use in your organization is by creating a strategic planning approach that takes ideas from assessment to analysis to delivery. By employing an iterative approach with a thorough assessment and a feedback loop, everyone in your organization will feel more empowered and engaged. The reality of business today is that nearly every business problem is going to have a technological solution. It will fall to the IT organization to take the ideas that come out of your design thinking and figure out how to deliver them as solutions at scale and speed. This is where enterprise architecture comes into play. Evaluating, planning and deploying a business solution will require visibility. How will these solutions impact users? Can they be supported by the existing IT infrastructure? How do they fit into the business ecosystem?



A new look at the Cybersecurity Skills Market

The survey found that most businesses used hardware and software regarded as obsolete by suppliers. Few had any full-time in-house IT support staff and most had received no professional training. Moreover, none of the publicly funded training programmes in the TEC portfolio were felt to be relevant to their needs. Those wanting skilled staff were happy to train their own, provided the TEC would help them identify recruits with the necessary aptitude and attitude. They would also have liked the TEC to create a list of reputable local organisations providing relevant modular short courses. The results were so far out of line with “accepted wisdom” that the implications, beyond the synopsis headline “The users have taken over the system”, were ignored. My draft report and recommendations were never published. ... Almost none will know the training their staff might need. Few will know how to find a reputable supplier of security services who can meet their needs at affordable cost.


Can organizations embrace the digital native mindset?

It is not a lack of ideas but a dearth of inspiration that keeps most of the traditional companies in our country from taking the digital route. Despite huge sums being invested in research and development, most organizations remain blind to the endless possibilities that digital intervention can bring to their businesses. Although their seed research practices and incubators point in this direction, conventional practices prevent these companies from taking the unexplored route. In a recent McKinsey survey, many respondents said that parent companies had hindered the development of their start-ups and limited entrepreneurs’ freedom to make decisions. Scaling up individual businesses under one umbrella becomes a challenge for many large conglomerates. The transformation needs to start at an operational level, with constant awareness among all the stakeholders involved about the change in practices. The key is to include digital intervention in organizational planning to establish the idea across all levels. Later, the plan should be measured against the outcome to promote digitization in good faith.


How to become a cybersecurity RSO

Given that cybersecurity is regulated, what practices can we adopt from the experience of HROs regarding compliance and regulation? The distinction between goal-focused regulation and error-focused regulation is an important concept. Most compliance regimes focus on the former, i.e. meeting control objectives. However, organizations may benefit from enhancing their internal error-detection capabilities. Another applicable point relates to extended organizations: in many cases, managing the regulatory and reliability implications of the organization’s supply chain may be the biggest risk the organization faces. Managing for security has recently become a science. CISOs now present to the board and know not to be the department of “no” and to support business initiatives. But HROs and RSOs have been managing for high reliability for decades. The “three lenses” view of organizations leads to three parallel paths toward building a cybersecurity HRO. If you fail to see through all three lenses, you will likely not achieve your goals.


Simulation software: protecting your organisation during a sustained period of cyber war

We’re in the midst of a cyber war that threatens every single business and the vast majority of individuals. Simulation software may provide a solution, but first, the problem. The problem is not going away; in fact, it is getting worse. Technology, an increasing presence in everyone’s lives, connects the operations of pretty much everything, and most people now own a smartphone, so cyber attacks are more pervasive than ever before. In the cyber war, malicious cyber attacks are increasing in prevalence and sophistication, and the targets chosen are increasingly widespread. “Businesses, from national infrastructure and network carriers through to businesses of all sectors and sizes are in the sights of cybercriminals,” confirms Martin Rudd, CTO, Telesoft Technologies. “Think of the nationwide damage that would happen if the organisations powering this critical national infrastructure were to suffer a targeted cyber attack. These organisations generate and store colossal amounts of personal data, making them extremely valuable to cyber criminals — but also, thanks to their gargantuan size, difficult to breach.


Creating a culture of learning

Many managers still rise in organizations because they are good at process optimization, driving out waste and inefficiency, and managing for scale. But when you become CEO, your priorities change. You discover more complex issues that can’t be solved through process optimization. Suddenly you spend your time thinking about why the things that ought to be getting better aren’t getting better, and why they may be getting worse. For example, many CEOs are trying to improve diversity in their organizations, and to create less hostile environments. Yet no matter how hard they try to shift the rules — “Let’s set targets, hold people accountable, and engineer our way toward a more diverse workforce” — they’re not succeeding. Workplace behaviors still sometimes get toxic, even though nobody wants them to. To understand why this happens, you have to think in less linear ways. For example, instead of trying to lock everyone into one diversity-and-inclusion process, what if you let employees choose their approach, and talked about that choice openly? You need to develop a culture of learning if you want to be innovative.


Even fintech startups battling to meet cyber security challenges


The research into fintechs shows that eight main websites and 64 subdomains have at least one publicly disclosed and exploitable security vulnerability of a medium or high risk, compared with seven in the banking sector. The most common website vulnerabilities are cross-site scripting (XSS), sensitive data exposure, and security misconfiguration, despite all of them featuring in the OWASP top 10 application vulnerabilities, which are well known and have well-established mitigation methods. All of the mobile applications tested contained at least one security vulnerability of a medium risk, while 97% have at least two medium or high-risk vulnerabilities. The tests show that 56% of mobile app backends have serious misconfigurations or privacy issues related to SSL/TLS configuration and insufficient web server security hardening. The report reveals that 62% of the fintechs’ main websites failed payment card industry data security standard (PCI DSS) compliance tests.
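The standard mitigation for XSS is exactly the kind of well-established method the report alludes to: escape untrusted input before embedding it in HTML, so injected markup renders as text instead of executing. A minimal sketch using only the Python standard library (the `render_comment` helper is invented for illustration):

```python
# XSS mitigation sketch: HTML-escape untrusted input before embedding it
# in a page, so injected <script> tags are rendered as inert text.
from html import escape

def render_comment(user_input: str) -> str:
    """Embed untrusted text in HTML after escaping it."""
    return "<p>" + escape(user_input, quote=True) + "</p>"

malicious = '<script>alert("xss")</script>'
print(render_comment(malicious))
# The tag arrives as &lt;script&gt;... and the browser never executes it.
```

Real frameworks apply this escaping automatically in their template engines, which is why XSS in 2019 usually indicates raw string concatenation somewhere in the stack.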


Moscow's blockchain voting system cracked a month before election

The French academic was able to test Moscow's upcoming blockchain-based voting system because officials published its source code on GitHub in July, and asked security researchers to take their best shots. Following Gaudry's discovery, the Moscow Department of Information Technology promised to fix the reported issue -- the use of a weak private key. "We absolutely agree that 256x3 private key length is not secure enough," a spokesperson said in an online response. "This implementation was used only in a trial period. In few days the key's length will be changed to 1024." Gaudry, who discovered that Moscow officials modified the ElGamal encryption scheme to use three weaker private keys instead of one, couldn't explain why the IT department chose this route. "This is a mystery," the French researcher said. "The only possible explanation we can think of is that the designers thought this would compensate for the too small key sizes of the primes involved. But 3 primes of 256 bits are really not the same as one prime of 768 bits." 
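Gaudry's point is about search space: three independent 256-bit discrete-log problems are each far easier than one 768-bit problem, because an attacker solves them separately rather than multiplying their difficulty. A toy illustration (with a deliberately tiny prime, nothing like the Moscow code) of how a too-short ElGamal key falls to brute-force discrete log:

```python
# Toy illustration of why short ElGamal keys are weak: with a small prime,
# the private key is recovered by exhaustive search. Real deployments use
# primes of 2048+ bits precisely to make this search infeasible.

p, g = 467, 2          # toy prime and generator (illustrative values)
x = 153                # "private" key
y = pow(g, x, p)       # public key, as published by the voting system

def brute_force_dlog(y, g, p):
    """Recover x such that g**x % p == y by trying every exponent."""
    for candidate in range(p):
        if pow(g, candidate, p) == y:
            return candidate

recovered = brute_force_dlog(y, g, p)
print(recovered == x)  # True: the "private" key is recovered instantly
```

At 256 bits the search is done with subexponential algorithms rather than a loop, but the principle is the same: the work factor is set by each prime individually, not by their combined length.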



Quote for the day:


"A leadership disposition guides you to take the path of most resistance and turn it into the path of least resistance." -- Dov Seidman


Daily Tech Digest - August 21, 2019

Handing Over the (Digital) Keys: Should You Trust a Smart Lock?


Inherent security flaws that lead to hacks aren’t the only avenue third parties can use to eye your data. Sometimes, it hits a little closer to home. If you have access to the app that controls a smart lock, you can probably see when someone leaves and enters for the day, which can be beneficial in knowing your significant other made it home safely. But it could also inform someone of your whereabouts. Technically, if you don’t own the lock, the owner might be able to see your information, too. “If a lock is connected to the internet, then there is always the danger that it could be hacked,” Ray Walsh, digital privacy expert for ProPrivacy.com, said in an email to Reviews.com. “Of course, an internet-connected smart lock may be able to feed its owner additional information – such as an alert when someone unlocks it. This data certainly has its merits, but may only be so useful in the end,” Walsh said. For example, although the privacy policy has since changed, Gizmodo found in an archived link from May 8th that smart lock company Latch stated GPS information could be stored and shared with owners and any subsequent owners.


Don’t get woken up for something a computer can do for you; computers will do it better anyway. The best thing to come our way in terms of automation is all the cloud tooling and approaches we now have. Whether you love serverless or containers, both give you a scale of automation that we previously would have to hand roll. Kubernetes monitors the health checks of your services and restarts on demand; it will also move your services when "compute" becomes unavailable. Serverless will retry requests and hook in seamlessly to your cloud provider’s alerting system. These platforms have come a long way, but they are still only as good as the applications we write. We need to code with an understanding of how they will be run, and how they can be automatically recovered. ... There are also techniques for dealing with situations when an outage is greater than one service, or if the scale of the outage is not yet known. One such technique is to have your platform running in more than one region, so if you see issues in one region, then you can failover to another region.
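The restart behavior described above can be sketched in a few lines. This is an illustrative model of the decision a platform like Kubernetes makes, not its actual code; the real kubelet is configured through livenessProbe fields such as `periodSeconds` and `failureThreshold`:

```python
# Sketch of platform self-healing: restart a service only after a run of
# consecutive failed health probes, so one transient blip doesn't trigger
# a restart (and nobody gets paged for it).

def should_restart(probe_results, failure_threshold=3):
    """probe_results: list of booleans, oldest first (True = healthy)."""
    if len(probe_results) < failure_threshold:
        return False
    # Restart only if the most recent `failure_threshold` probes ALL failed.
    return not any(probe_results[-failure_threshold:])

assert should_restart([True, True, False, False, False]) is True
assert should_restart([True, False, False, True]) is False   # recovered
```

Writing services so that this kind of automatic recovery is safe, i.e. so they can be killed and restarted at any point, is the "code with an understanding of how they will be run" the paragraph asks for.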


In July, Reuters reported that as part of an effort to combat money laundering, Japan’s government is “leading a global push” to set up for cryptocurrency exchanges a system like SWIFT, the international messaging protocol that banks use for bank-to-bank payments. Last week, a report from Nikkei suggested that 15 governments are planning to create a system for collecting and sharing personal data on cryptocurrency users.  But several people familiar with the FATF-led international discussions around cryptocurrency regulation told MIT Technology Review that these reports don’t have it quite right. There doesn’t appear to be a government-led global cryptocurrency surveillance system in the works—at least not yet. And it’s likely that whatever does eventually emerge won’t look much like SWIFT. Exchanges are still early in the process of figuring out what systems and technologies to use to securely handle sensitive data, Spiro says, and how to do it in a way that complies with a range of local privacy rules. “There are a lot of balls in the air,” he says.


Security concerns blocking UK digital transformation


“Protection and prevention are still paramount yet, to stay ahead of these evolving trends, organisations need to start thinking differently about cyber security. Business leaders need to make the leap from seeing cyber security as only a protective measure, to it also being a strategic value driver,” he said. The report also shows that across many organisations, chief information officers (CIOs) and wider board member views around cyber security are not yet aligned. Business leaders such as the CEO, CFO and COO tend to be less confident about their organisation’s cyber security than those with direct responsibility for IT and technology such as the CIO and chief information security officer (CISO). In addition, technology leaders are more likely to believe it is important for competitive advantage to have a cyber-secure brand (82%), compared with only 68% of business leaders.


Use of Facial Recognition Stirs Controversy

Over the past several years, the use of facial recognition - along with other technologies such as machine learning, artificial intelligence and big data - has stoked global invasion of privacy fears. In the U.S., the American Civil Liberties Union has taken aim at Amazon's Rekognition product, which uses a number of technologies to enable its users to rapidly run searches against facial databases. The ACLU's Nicole Ozer last year called for guarding against supercharged surveillance before it's used to track protesters, target immigrants and spy on entire neighborhoods. More recently, city officials in San Francisco and Oakland have banned police from using facial recognition technology. The debate over facial recognition technology has also been addressed by several U.S. presidential candidates. On Monday, Democratic hopeful Bernie Sanders became the first presidential candidate to call for a ban on the use of facial recognition by law enforcement. This is one part of a larger criminal justice reform package that the Vermont senator's campaign calls "Justice and Safety for All."


Extreme Programming in Agile – A Practical Guide for Project Managers

The XP lifecycle can be explained in terms of the Weekly Cycle and the Quarterly Cycle. To begin with, the customer defines the set of stories. The team estimates the size of each story, which, along with the relative benefit as estimated by the customer, indicates the relative value used to prioritize the stories. If some stories cannot be estimated by the team due to unclear technical considerations, it can introduce a spike. Spikes are short, time-boxed research efforts that may occur before regular iterations start or alongside ongoing iterations. Next comes the release plan: the release plan covers the stories that will be delivered in a particular quarter or release. At this point, the weekly cycles begin. Each weekly cycle starts with the team and the customer meeting to decide the set of stories to be realized that week. Those stories are then broken into tasks to be completed within that week. The week ends with a review of the progress to date between the team and the customer, which leads to a decision on whether the project should continue or whether sufficient value has already been delivered.
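The prioritization step above, combining the team's size estimate with the customer's benefit estimate, is often implemented as a simple benefit-to-size ordering. A sketch (the story names and numbers are invented for the example; XP itself does not prescribe a formula):

```python
# Sketch of XP-style story prioritization: the team estimates size, the
# customer estimates benefit, and stories are ordered by benefit per unit
# of size so the highest-value work lands in the earliest weekly cycles.

stories = [
    {"name": "login",     "size": 5, "benefit": 8},
    {"name": "search",    "size": 2, "benefit": 6},
    {"name": "dark-mode", "size": 3, "benefit": 2},
]

def prioritize(stories):
    return sorted(stories, key=lambda s: s["benefit"] / s["size"], reverse=True)

for s in prioritize(stories):
    print(s["name"], round(s["benefit"] / s["size"], 2))
# search comes first (3.0), then login (1.6), then dark-mode (0.67)
```

A story the team cannot size at all gets a spike instead of a guessed number, which keeps the ordering honest.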


Breakthroughs bring a quantum Internet closer

The TUM quantum-electronics breakthrough is just one announced in the last few weeks. Scientists at Osaka University say they’ve figured out a way to get information that’s encoded in a laser beam to translate to a spin state of an electron in a quantum dot. They explain, in their release, that they solve an issue where entangled states can be extremely fragile, in other words, petering out and not lasting for the required length of transmission. Roughly, they explain that their invention allows electron spins in distant, terminus computers to interact better with the quantum-data-carrying light signals. “The achievement represents a major step towards a ‘quantum internet’,” the university says. “There are those who think all computers, and other electronics, will eventually be run on light and forms of photons, and that we will see a shift to all-light,” I wrote earlier this year. That movement is not slowing. Unrelated to the aforementioned quantum-based light developments, we’re also seeing a light-based thrust that can be used in regular electronics too. Engineers may soon be designing with small photon diodes that would allow light to flow in one direction only, says Stanford University in a press release.


Automated machine learning or AutoML explained

Automated machine learning, or AutoML, aims to reduce or eliminate the need for skilled data scientists to build machine learning and deep learning models. Instead, an AutoML system allows you to provide the labeled training data as input and receive an optimized model as output. There are several ways of going about this. One approach is for the software to simply train every kind of model on the data and pick the one that works best. A refinement of this would be for it to build one or more ensemble models that combine the other models, which sometimes (but not always) gives better results. A second technique is to optimize the hyperparameters of the best model or models to train an even better model. Feature engineering is a valuable addition to any model training. One way of de-skilling deep learning is to use transfer learning, essentially customizing a well-trained general model for specific data. Transfer learning is sometimes called custom machine learning, and sometimes called AutoML (mostly by Google). Rather than starting from scratch when training models from your data, Google Cloud AutoML implements automatic deep transfer learning and neural architecture search for language pair translation, natural language classification, and image classification.
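At its simplest, the "try candidates, keep the best by validation score" loop at the heart of AutoML is just a search. A minimal sketch, with an invented linear "model" and grid standing in for real training runs:

```python
import itertools

# Toy hyperparameter search in the AutoML spirit: try every combination in a
# small grid and keep the candidate with the lowest validation error.
valid = [(x, 2 * x + 1) for x in range(10, 15)]  # held-out points on y = 2x + 1

def validation_error(slope, bias):
    # Stand-in for "train a model, then score it": squared error on valid data.
    return sum((slope * x + bias - y) ** 2 for x, y in valid)

grid = {"slope": [1, 2, 3], "bias": [0, 1, 2]}
best = min(
    itertools.product(grid["slope"], grid["bias"]),
    key=lambda params: validation_error(*params),
)
print(best)  # (2, 1): exactly recovers y = 2x + 1
```

Real AutoML systems replace the exhaustive grid with smarter search (Bayesian optimization, neural architecture search) and the toy scorer with actual model training, but the select-by-validation-score skeleton is the same.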


Considerations for choosing enterprise mobility tools


One option is to use an open source enterprise mobility management (EMM) platform. If the organization is willing to invest in the resources, open source EMM offers the flexibility to customize and extend the source code to match specific needs. IT pros should be aware of the challenges that can come with maintaining their own open source EMM, such as hidden deployment costs and lack of support. A few options for open source EMM include WSO2 Enterprise Mobility Manager and Teclib's Flyve MDM. WSO2's offering includes enterprise mobility tools such as mobile application management and mobile identity management. It also includes open source support for IoT devices, such as enrollment and application management, through IoT Server. Organizations looking for more established enterprise mobility tools can look to UEM platforms including Citrix Workspace, VMware Workspace One, IBM MaaS360, BlackBerry Unified Endpoint Manager, MobileIron UEM or Microsoft Enterprise Mobility + Security, which includes Intune.


The Future Enterprise Architect


Archie II understands the needs of decision makers throughout the organization, including the need to provide timely, if not on-demand, decision support based on solid information and analysis. Archie II also understands that he must not only support the decision-making processes in the organization, but also enable those decisions by providing guidance. Archie II is proactive and is often ready with answers before questions arise. Archie II uses or adapts existing architectures, and/or creates new architectural patterns and models, to support the analysis he performs in order to make the recommendations needed as value chains or value streams progress. Archie II collects just enough information, resulting in just enough architecture, to support the decisions at hand and match the cadence of the business. Yet Archie II is continuously listening, evolving and analyzing his models of the enterprise as new information becomes available. He proactively connects with those necessary, when necessary. His calls are always returned, as he has the reputation of “when Archie II speaks, we need to listen!”



Quote for the day:


"Your greatest area of leadership often comes out of your greatest area of pain and weakness." -- Wayde Goodall


Daily Tech Digest - August 20, 2019

Blockchain is not a magic bullet for security. Can it be trusted?

Bitcoin's vulnerabilities have already been successfully exploited in significant hacks.
As with any technology, security issues arise when developers program requirements into products and services. The lines of code, consensus mechanisms, communication protocols, etc., all have the potential to host vulnerabilities that can be exploited for malicious use. But blockchain at the moment remains a divergent technology: multiple protocols and programming languages are being developed in parallel. As a result, it is difficult for developers to acquire the experience needed to secure their code, while most are under stringent time pressure to deliver. Because blockchain relies heavily on cryptography, the practice of secure communication, it gives many the impression that it’s a self-secured technology. This could not be further from the truth, as blockchains are built on top of communication networks and equipment that need to be secured. Traditional information security challenges apply to blockchain, too. Furthermore, cryptography is, like any other security discipline, a changing field: quantum computers are already expected to break a number of cryptographic algorithms.
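The point that blockchain is built on cryptography but is not thereby self-secured can be made concrete with a minimal hash chain. This sketch only illustrates the data structure; everything around it (networks, key management, consensus code) still has to be secured separately:

```python
import hashlib
import json

# Minimal hash-chain sketch: each block commits to the previous block's hash,
# so altering any earlier block breaks every link after it.
def block_hash(block):
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

chain = []
prev = "0" * 64  # genesis placeholder
for data in ["tx-a", "tx-b", "tx-c"]:
    block = {"data": data, "prev": prev}
    prev = block_hash(block)
    chain.append(block)

# Tampering with the first block invalidates the link stored in the second.
chain[0]["data"] = "tx-evil"
print(block_hash(chain[0]) == chain[1]["prev"])  # False
```

Note what this does not protect: the code computing the hashes, the channel it travels over, and the keys that authorize transactions — exactly the layers the article says still need traditional security work.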



How to unlock the true value of data


Central to a hub architecture will be the technologies used to get data flowing into it from applications and other data sources, and then provisioning outward to consumers – internal just as much as external. These might include extract, transform and load (ETL) tools that support bulk or batch movement of data, data replication and data virtualisation. They can also include app-integration middleware, such as the enterprise service bus, and message-oriented technologies that move data around in the form of message constructs. Whatever tools are used, on-premise and cloud service versions are available to tap, and there are still other elements to consider, such as governance tools to help with data compliance and metadata management tools to tag and manage data flows better. One of the big headaches for those tasked with developing a business’s data architecture is control.
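The bulk/batch movement described above follows the classic extract-transform-load shape. A minimal sketch, with invented field names and records, of how rows from one application might be normalized on their way into a hub:

```python
# Minimal batch ETL sketch feeding a "data hub": extract rows from a source
# application, transform them into a common shape, and load them into the hub.
source = [
    {"id": 1, "amt": "10.50", "cur": "USD"},
    {"id": 2, "amt": "3.00",  "cur": "EUR"},
]

def extract():
    # In practice this would read from an application database or API.
    yield from source

def transform(row):
    # Normalize types and attach provenance metadata for governance tooling.
    return {"id": row["id"], "amount": float(row["amt"]),
            "currency": row["cur"], "origin": "billing-app"}

hub = [transform(r) for r in extract()]  # load step: append to the hub store
print(hub[0]["amount"])  # 10.5
```

Replication and virtualisation tools differ in mechanics (copying changes continuously versus querying in place), but they slot into the same extract/transform/provision flow.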


Texas Pummeled by Coordinated Ransomware Attack

In an updated statement released Saturday, DIR said the total victim count stood at 23 organizations. The Texas Military Department as well as Texas A&M University System's Cyberresponse and Security Operations Center teams "are deploying resources to the most critically impacted jurisdictions," it added. The U.S. Department of Homeland Security as well as FBI's cyber division, among others, have also been assisting with the response. "At this time, the evidence gathered indicates the attacks came from one single threat actor. Investigations into the origin of this attack are ongoing; however, response and recovery are the priority at this time," DIR said. "It appears all entities that were actually or potentially impacted have been identified and notified." Systems and networks run by the state of Texas have not been disrupted, DIR says. Officials in Austin said their systems were unaffected by the attack. "We are monitoring the situation," Bryce Bencivengo, a spokesman for Austin's Office of Homeland Security and Emergency Management, told local NPR member station KUT.


Value Engineering: The Secret Sauce for Data Science Success

If your organization seeks to exploit the potential of data science to power its business models, then the Data Science Value Engineering Framework provides the “how.” The framework starts with the identification of a key business initiative that not only determines the sources of value, but also provides the framework for a laser focus on delivering business value and relevance. A diverse set of stakeholders is beneficial because it provides more perspectives on the key decisions upon which the data science effort needs to focus. The heart of the Data Science Value Engineering Framework is collaboration with the different stakeholders to identify, validate, value and prioritize the key decisions (use cases) that they need to make in support of the targeted business initiative. After gaining a thorough understanding of the top-priority decisions (use cases), the analytics, data, architecture and technology conversations have a frame within which to work (by understanding what’s important AND what’s not important).


People will usually follow those who have the most positional authority and concomitant control of resources, but also will follow those with other forms of power, such as eloquence, passion, sincerity, commitment, and charisma. In the teams I work with, people tend to pay the most attention to and be most influenced by those with both types of power. But sometimes people will even choose to follow less senior individuals if they have inspiring ideas and energy. No matter how powerful they are, though, when people show themselves to be untrustworthy, through something they do inside or outside the team, their influence vanishes. Others might still pay attention to them, but now only for transactional purposes. Those “leaders” are no longer really leading. If you want to be a real leader, one with voluntary followers, remember that you must earn and keep your people’s trust. They will carefully assess your attitude and actions, in particular whether you look out for others in addition to yourself. If their assessment is that you are trustworthy, they’ll stick with you.


These robot snakes designed by AI could be the next big thing in the operating theatre


In order to create snakebots that work in the confines of each individual's anatomy, the QUT team generate tens of virtual versions of the snakebot and set evolutionary algorithms to work on them, in a survival-of-the-fittest contest designed to create the best available bot.  First, the patient's knee is scanned by CT or MRI and a model of its internal anatomy is built. Alongside the surgeon, the QUT system then delineates which parts of the knee the surgeon will need to reach during the operation, as well as the parts they need to avoid. ... Afterwards, they're ranked according to their performance. Then the evolutionary algorithm refines the better-performing designs according to the results of the simulation, running the simulation over and over again, tweaking the winning bots and rejecting those that aren't up to scratch. "This is copying what's been observed in nature and the process of evolution but you do it inside the computer... we kill off the ones that didn't do very well and we mate the ones that do well. Mating, in this case, means you combine half the characteristics of one and half of the other. You do random mutations on some of them - change a few little bits during the mating."
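The rank/cull/mate/mutate loop quoted above is a standard genetic algorithm. A toy sketch — the 8-gene encoding and fitness function are invented here, standing in for a simulated snakebot design and its simulation score:

```python
import random

random.seed(0)  # deterministic for the example

# Toy evolutionary loop: rank candidate "designs" by fitness, keep the best,
# mate survivors (half the genes from each parent), and randomly mutate.
TARGET = [1, 0, 1, 1, 0, 0, 1, 0]  # the hypothetical ideal design

def fitness(genes):
    return sum(g == t for g, t in zip(genes, TARGET))

def mate(a, b):
    mid = len(a) // 2
    child = a[:mid] + b[mid:]        # half of each parent's characteristics
    if random.random() < 0.3:        # occasional random mutation
        i = random.randrange(len(child))
        child[i] = 1 - child[i]
    return child

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
for _ in range(40):
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]      # "kill off the ones that didn't do well"
    population = survivors + [mate(random.choice(survivors),
                                   random.choice(survivors))
                              for _ in range(10)]

print(fitness(max(population, key=fitness)))  # best design's score (max is 8)
```

In the QUT work, evaluating fitness means running the surgical simulation against the patient's scanned anatomy, which is vastly more expensive than this toy scorer — hence the emphasis on refining only the better-performing designs.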


What Cybersecurity Trends Should We Expect From The Rest Of 2019?

Currently, the industry standard for security relies on two-factor authentication when users log into software. While many email services and social media sites only ask for one form of authentication, two-factor authentication is the future. However, by the time companies adopt this, multifactor will have taken off. Most data breaches are caused by attackers leveraging bad passwords. Weak, stolen or default passwords are usually the biggest culprits for a data leak. Single authentication allows this to happen, since passwords can be limited to just something you know. By giving out a dongle or integrating an app with temporary passwords that expire, you can ensure that only verified users get access. Since more people than ever are worried about stolen identities, we should see this kind of authentication process take off in coming years. ... Companies are even finding ways to deceive potential hackers. By imitating your company's more critical data and assets, this bait can act as a trap for anyone trying to get ahold of your data.
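The "temporary passwords that expire" mentioned above are typically time-based one-time codes. A simplified sketch in the spirit of TOTP (RFC 6238) — not a compliant implementation, and the secret here is purely illustrative:

```python
import hashlib
import hmac
import struct
import time

SECRET = b"demo-shared-secret"  # illustrative; real secrets are provisioned per user

def one_time_code(secret, at, step=30, digits=6):
    # The code is derived from the shared secret plus the current 30-second
    # window, so it expires automatically when the window rolls over.
    counter = struct.pack(">Q", int(at) // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation, as in HOTP (RFC 4226)
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

now = time.time()
code = one_time_code(SECRET, now)
print(len(code))  # 6
# The same window verifies; a window two minutes later almost certainly won't.
print(one_time_code(SECRET, now) == code)  # True
```

Because the code is "something you have" (the device holding the secret), it complements the password's "something you know" — exactly the second factor the article describes.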


Case Study: Improving ID and Access Management

A few years ago, Molina Healthcare was using a homegrown solution to onboard and offboard users daily in batches from the company's HR system into Active Directory, she says. But the company was growing quickly, so the mostly manual process of provisioning and de-provisioning access to Molina's systems was time-consuming, Sankepally says in an interview with Information Security Media Group. "With the increasing demands, we couldn't complete all the business processes involved, and there was a lack of standards," she says. "Our onboarding process was taking 10 to 20 days." As a result, the company made a move to standardize and automate its ID and access management platform, choosing to implement technology from SailPoint Technologies, she says. "Today we have more than 15,000 active identities supporting 15 different states with different lines of business ... including caregivers on the ground." For onboarding users, the company now has a "near real-time integration" with its cloud-based HR system that has automated the onboarding and offboarding process, she says.


GDPR faces growing pains across Europe


European countries have clearly demonstrated different strategies on penalties. Also, they have set up different structures for implementing the regulations. In Germany, for example, DPAs are organised on a German state level – but there is also a separate DPA at federal level, with jurisdiction over telecom and postal service companies. The result is that Germany has 17 data protection authorities, instead of just one. Another area where European countries disagree is in their interpretations of some of the finer points of GDPR. For example, Austria’s DPA ruled that all a data controller has to do in response to a request for data deletion is to remove individual references to that data. Nations have also demonstrated differences of opinion on how to calculate fines. For example, some local legal authorities in Germany have argued that the GDPR fines imposed in that country should be calculated according to German law, which would result in much lower fines than those imposed at the European level.


Visa Adds New Fraud Disruption Measures

Visa now is adding fraud disruption to supplement its transaction fraud detection and remediation efforts. The company today at the Visa US Security Summit 2019 in San Francisco outlined five new capabilities it now uses to prevent fraudulent transactions. "We're looking to identify and disrupt fraud before it happens," says David Capezza, senior director of payment fraud disruption at Visa. "We want to take a more proactive approach and identify these attacks and shut them down before they occur." Rivka Gewirtz Little, research director for global payment strategies at IDC, says Visa's new approach blends both its cyber and fraud units. "Typically, organizations are focused on the transaction," Gewirtz Little says. "What's interesting here is that Visa is creating a true cyber fraud system where the cyber team and fraud teams are integrated: the cyber team focuses on the attack against the enterprise and the fraud team looks at ways of preventing the attack. It's not always the same set of tools, the same team and objectives."



Quote for the day:


"Leadership offers an opportunity to make a difference in someone's life, no matter what the project." -- Bill Owens


Daily Tech Digest - August 19, 2019

Center for Data Innovation: U.S. leads AI race


“AI is the next wave of innovation, and overlooking this opportunity will pose a threat to a region’s economic and national security,” said Center for Data Innovation director Daniel Castro in a statement. “The EU has a strong talent pool and an active research community, but unless it supercharges its current AI initiatives, it will not keep pace with China and the United States.” The center chose to focus on six categories: talent, research, development, adoption, data, and hardware. Based on a 100-point scale, researchers found that the U.S. led overall with 44.2 points, China was second at 32.3, and the European Union placed third with 23.5. The study found that the U.S. shows clear leadership in four of the six categories: talent, research, development, and hardware. China leads in adoption and data. The findings would appear worrisome for the EU, which has placed great emphasis on its AI efforts in recent years. But the region does place second in four categories: talent, research, development, and adoption. Of those, it is particularly strong in the research category.



Modern Technology, Modern Mistakes

"It is common for attackers to find common utilities such as FTP clients or video conversion software, package or wrap malicious code into the installer, and then upload their packed installer to a free software download site, knowing that users may find their malicious version of the software installer before they find the legitimate original," Murphy says. Additionally, employees are increasingly feeling entitled to work from anywhere and have access to anything at any time, according to Nick Bennett, director of professional services at FireEye Mandiant. "Employees [also] feel entitled to use work assets for non-work activities, and they are bypassing protections that are in place, making themselves more susceptible to phishing attacks," Bennett says. The issue is twofold. Employees are using corporate-issued workstations for personal use, even if they are at home. When they bring that workstation back to the enterprise, they are also putting the business at risk, Bennett explains. In addition, "employees are also using non-corporate assets to access the corporate network on a device that is unmanaged by enterprise," Bennett says.


The Danger of Over-Valuing Machine Learning

While machine learning is a powerful tool, it is not a magical box. Bias in the data or the model can train machine learning systems in inappropriate ways. Strange butterflies may be lurking within, bringing up artifacts that may be due to factors that have nothing to do with the data. The lack of transparency in the process can make determining what exactly a model returns problematic, because emergent behaviors may be lurking in the background that are often difficult to ferret out. Finally, the deeper the learning, the more energy is required to maintain multiple levels of abstraction, and that energy can often be significant enough to make using such systems uneconomical. This is not to say that such tools are useless - most of the evolution of machine learning systems in the last decade has proven highly useful and effective for a wide variety of applications, and forms an integral part of the artificial intelligence toolkit. The danger comes in thinking that such systems are truly intelligent, rather than simply the clever application of high-speed, occasionally non-linear solutions.


How Can The Insurance Industry Face Its Challenges?

Technology – as in all industries – has an important role to play in the transformation of insurance. New opportunities are being created by the likes of the IoT, telematics, advanced data analytics, and technologies which support consumption-based offerings and insurance for the sharing economy. These technological advantages are particularly welcome in a sector which has faced increasing regulation. As Paton notes, many modern regulatory challenges began with the EU’s Solvency II Directive in 2009, and other decrees have followed. “The big impact of Solvency II from an innovation perspective was that it diverted the insurer’s ability to invest in new areas,” he says. “It was followed by the Insurance Distribution Directive and a regulatory scrutiny on conduct in areas such as personal lines. So now with things like point of renewal, it’s necessary to make people aware of how much their insurance premium has gone up.” “There will be more initiatives of this type,” Paton adds, “right through to changes in expectations around operational resilience, cybersecurity and third party supply management.”


Humans are the weak link: Security awareness & education still a challenge for UK companies

Although moving away from blame culture and the idea that people are the problem should be a goal of today’s security teams, most organizations still see employees as a chink in company defenses. A massive 98 percent of respondents agreed with the statement that: ‘The human employee is the weakest link when it comes to cybersecurity’. Over two-thirds agreed with this idea strongly. Social engineering, phishing, and business email compromise – all attacks which rely on people falling prey to manipulation and trickery – were listed amongst the top threats organizations are most concerned about. This dim view of the role people play in security likely feeds into why only 13 percent of organizations would rate internal cybersecurity awareness as very good. Also, 40 percent of respondents said awareness was merely adequate, suggesting there is still much work to be done around improving education, raising awareness and reducing people-based risks as a result. While organizations may still view humans as the problem, security teams also recognize that people-based problems require people-based solutions. 85 percent of the companies surveyed stated they were utilizing awareness training to reduce human error.


How to Use Data to Improve Your Sprint Retrospectives

Most agile teams do sprint retrospectives at least once a month, to iterate and improve on their software development process and workflow. However, a lot of those same teams rely only on their feelings to “know” if they have actually improved. But you need an unbiased reference system if you want to compare how two sprints went. ... Commit frequency and active days serve the same purposes. An active day is a day in which an engineer contributed code to the project, which includes specific tasks such as writing and reviewing code. Those two alternative metrics are interesting if you want to introduce a best practice to commit every day. It’s also a great way to see the hidden costs of interruptions. Non-coding tasks such as planning, meetings, and chasing down specs are inevitable. Teams often lose at least one day each week to these activities. Monitoring the commit frequency enables you to see which meetings have an impact on your team’s ability to push code. It’s important to keep in mind that pushing code is actually the primary way your team provides value to your company.
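The "active days" metric described above is simple to compute from commit history: a day counts as active if at least one commit landed on it. A minimal sketch with made-up timestamps:

```python
from datetime import datetime

# Commit timestamps as they might come from a repository log (illustrative).
commits = [
    "2019-08-12T09:14:00", "2019-08-12T16:40:00",
    "2019-08-13T11:02:00", "2019-08-15T10:21:00",
]

def active_days(timestamps):
    # A day is "active" if any commit was made on it.
    return {datetime.fromisoformat(ts).date() for ts in timestamps}

days = active_days(commits)
print(len(days))                  # 3 active days
print(len(commits) / len(days))   # commit frequency per active day
```

Tracked sprint over sprint, a dip in active days against a stable commit count can flag the meeting-heavy weeks the article mentions, giving the retrospective an unbiased reference point instead of gut feel.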


Blockchain: a friend to digital continuity and lightweight workflow tool

While even the experts can’t predict when widespread adoption of the technology will take place, they can suggest what it will take before widespread adoption is possible. The most important lesson is understanding: understanding that blockchain technology should work as part of an ecosystem of technologies. It doesn’t matter what industry you’re in; no matter what, it starts with customer experience. And the CX depends on a choreography of technology. In terms of display, there has to be a web and touch experience, or an AI component with chatbots, for example. Similarly, with blockchain, it’s not just one thing that’s going to impact the use case. “One obvious pitfall of blockchain is that people look at it as the only solution for realising anything and everything. But this is wrong. You have to really understand the end-to-end experience and see where blockchain technology fits in,” said Jitendra Thethi, AVP, Altran.


The Digital Leader's Guide to Lean and Agile

This focus on needs and outcomes turns out to be a great way to integrate the key frameworks and models of the Lean-Agile landscape. This is a worthwhile goal; it’s worth noting that Agile is often introduced not right-to-left, but left-to-right – not needs-first, but backlog-first or solution-first. When the focus is on ploughing through backlogs of requirements, the likely result is mediocrity (or worse), hardly a great advertisement for Agile. And the dissonance of imposing solutions on teams rather than seeking to meet their and the wider organisation’s needs is potentially fatal to Agile adoptions. Right to Left should not be understood as an attack on branded process frameworks; neither does it elevate any one framework over the others. However, as well as calling into question how they are often rolled out, I do voice the regret that they are so often described in left-to-right terms, leading me to wonder how Agile is then supposed to be understood as a departure from 20th century ways of thinking and working. I demonstrate that Scrum and even SAFe are readily described in right-to-left terms – "iterated self organisation around goals" is the five-word summary.


Privacy beyond HIPAA in voice technology


“When it comes to healthcare and voice design, we have several challenges we face every day,” Freddie Feldman, voice design director at Wolters Kluwer Health, said at The Voice of Healthcare Summit at Harvard Medical School last week. “HIPAA is a big topic on everyone’s mind nowadays, and it is one we take seriously. The first thing most people think about when they hear HIPAA is securing server platforms, but there is more to it. We have to consider things like the unintended audience for a call.” He said that due to the nature of voice, even leaks not expressly prohibited by HIPAA can be inappropriate. For example, if the voice technology is intended for home use and gives a message from the radiology department to the house, then it’s giving away too much information, he said.  Much of it comes down to appropriate use. For example, putting the speakers into a hospital room setting poses a different set of challenges.  “I think as far as smart speakers and virtual assistants [go], Amazon right now only has HIPAA-eligible environments, so basically turning on and off HIPAA for specific skills, enabling HIPAA for a particular voice app or voice skill.”


Artificial Intelligence Needs a Strong Data Foundation

The largest and most basic need in the data science hierarchy is data collection. While every bank and credit union collects data daily on transactions, product use, customer demographics, and even external insights from social media and other sources, an organization needs to determine what specific insight may be needed to get a complete picture. Are you collecting insight on channel use, geolocational data and consumer beliefs and behaviors? While you can build a plan for future collection, the success of any machine learning or AI initiative hinges on the scope and quality of data collected. As important as collecting the right data is, Rogati stresses that it is equally important to have an ongoing flow of real-time data that is easy to access, store and analyze. This can be a major challenge for financial services organizations, which are notorious for having data silos. Beyond internal data flows, it is important that any external or unstructured data can also be collected, stored and analyzed. While once a major problem, cloud technology has simplified some of the storage challenges.



Quote for the day:


"To be a good leader, you don't have to know what you're doing; you just have to act like you know what you're doing." -- Jordan Carl Curtis


Daily Tech Digest - August 18, 2019

Realities and myths for 5G’s impact on logistics

The disruptive potential of 5G in logistics is all about the Internet of Things. We already see 4G and Wi-Fi networks as the ‘connective tissue’ between every device we connect to the internet, including computers, phones, wearables, home appliances and major business infrastructure. Every business relies on data to function, and logistics companies handle even more data than most. The sheer confluence of various employee functions, delivery vehicles, material handling equipment and facility control systems has always required lightning-fast connections with low latency and high uptime. 5G can deliver on that promise once it’s up and running. Individual devices will be able to achieve their own internet connections, provided they bring their own power or have access to it. Because of the far lower latency than 4G — up to 10 times lower — companies will be able to distribute and exchange far larger quantities of data than ever.



Cloud security is too important to leave to cloud providers

The need to take control of security and not turn ultimate responsibility over to cloud providers is taking hold among many enterprises, an industry survey suggests. The Cloud Security Alliance, which released its survey of 241 industry experts, identified an "Egregious 11" list of cloud security issues.  The survey's authors point out that many of this year's most pressing issues put the onus of security on end user companies, versus relying on service providers. "We noticed a drop in ranking of traditional cloud security issues under the responsibility of cloud service providers. Concerns such as denial of service, shared technology vulnerabilities, and CSP data loss and system vulnerabilities -- which all featured in the previous 'Treacherous 12' -- were now rated so low they have been excluded in this report. These omissions suggest that traditional security issues under the responsibility of the CSP seem to be less of a concern. Instead, we're seeing more of a need to address security issues that are situated higher up the technology stack that are the result of senior management decisions."



Asbeck estimates that it will only take a few years before labor industries adopt their use, and it won’t be long before passive suits become an affordable and commonplace use by the able-bodied. Exoskeletons could add years of enjoying an active lifestyle, like hiking, for the elderly by providing greater endurance. ... “A lot of us wear a Bluetooth device in our ear now, and that was tech you saw the military wearing in a movie fifteen years ago,” Haas said, “It looks crazy and futuristic but now we all see that type of technology.” If wearable robotics make you think less of Iron Man and more of Wall-E — where futuristic humans rely on robotic lounge chairs for mobility, rendering them too bloated to walk — David Perry, an engineer at Harvard Biodesign Lab, doesn’t show concern. He says these suits will maintain the health of people who do incredible physical feats. His leg exoskeleton, which soldiers tested, won't alleviate all the stress on the wearer’s muscles. But it does make it easier to walk by about 15%, even while carrying a heavy load.



Managing compliance costs with quality data

Quality data should allow compliance teams to screen against a range of datasets, including sanctions, politically exposed persons (PEPs), law enforcement lists, regulatory lists, and adverse media records geared towards sanctions compliance, AML, countering the financing of terrorism, and countering proliferation finance. Data should also be comprehensive; de-duplicated; consistent; accurate; configurable; and up-to-date. Given this extensive list of requirements, when choosing a data provider, it is important to choose one that has global capabilities and is able to deliver reliable and trusted data. The dataset being considered should have a highly analytical set of inclusion criteria and be de-duplicated. A successful provider should allow extensive ‘slicing and dicing’ and configuration and, perhaps most importantly of all, must take the control of operational costs seriously. It is worth mentioning that overscreening – in other words, screening beyond regulatory requirements – can significantly contribute to excessive operational costs.


Can private online communities drive corporate cultural change?

speech balloons speech bubbles conversation talk social media network by comicsans getty
By creating a dedicated interactive workspace for your customers, partners and company staff, you create a collaborative space built on trust. Potential customers look for – and often rely on – current customers’ interactions for recommendations and support. Through customer engagement you can educate customers on new product enhancements and rollouts. Public online communities are typically full of distracting noise, or communication is one-way (through reviews, or isolated between the customer and the brand through FAQs or chatbots). Although subcultures formed by joining groups can limit that noise, building brand loyalty in a public online community is still not as safe or effective, and the data can’t be gathered to increase a company’s value, improve effectiveness or enhance product development. In a dedicated, private online community, interactive real-time networking and tribal problem-solving help create long-term partnerships and friendships.


Major breach found in biometrics system used by banks, UK police and defence firms

Facial recognition technology on woman
The researchers said the sheer scale of the breach was alarming because the service is in 1.5m locations across the world and because, unlike a leaked password, a leaked fingerprint cannot be changed. “Instead of saving a hash of the fingerprint (that can’t be reverse-engineered) they are saving people’s actual fingerprints that can be copied for malicious purposes,” the researchers said in the paper. The researchers made multiple attempts to contact Suprema before taking the paper to the Guardian late last week. Early Wednesday morning (Australian time) the vulnerability was closed, but they still have not heard back from the security firm. Suprema’s head of marketing, Andy Ahn, told the Guardian the company had conducted an “in-depth evaluation” of the information provided by vpnmentor and would inform customers if there was a threat. “If there has been any definite threat on our products and/or services, we will take immediate actions and make appropriate announcements to protect our customers’ valuable businesses and assets,” Ahn said.
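The researchers' point about hashing rather than storing raw values can be illustrated with ordinary credentials; a minimal sketch using Python's standard `hashlib` and `hmac` modules (the salting scheme here is illustrative, not anything Suprema uses):

```python
import hashlib
import hmac
import os

def store_hashed(secret: bytes) -> tuple[bytes, bytes]:
    """Keep only a salted, slow hash in the database; the original
    secret cannot be recovered from what is stored."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret, salt, 100_000)
    return salt, digest

def verify(secret: bytes, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash from a fresh submission and compare it
    in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", secret, salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = store_hashed(b"correct horse battery staple")
print(verify(b"correct horse battery staple", salt, digest))  # True
print(verify(b"wrong guess", salt, digest))                   # False
```

Note that this scheme works for passwords because the exact same value is re-entered each time; biometric scans vary between captures and cannot be matched against a simple hash, which is one reason vendors end up storing templates — and why a leak of raw fingerprints is so damaging.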


Here’s How Artificial Intelligence Is Fueling Climate Change

AI Apocalypse
At the moment, data centers—the enormous rooms full of stacks and stacks of servers that juggle dank memes, fire tweets, your vitally important Google docs and all the other data that is stored somewhere other than on your phone and in your home computer—use about 2% of the world’s electricity. ... According to The MIT Technology Review, Dickerson recently told a conference audience in San Francisco that—unless super-efficient semiconductors are developed in the next five years—data centers handling AI demands could account for 10% of the world’s electricity use by 2025, a hundred-fold increase in a half-decade. Dickerson’s forecast is a worst-case scenario. Other tech execs have given estimates that vary wildly. Some think data centers, period, will suck 10% of the global electricity load. Yet others think that usage will remain relatively flat, in part because of large companies’ ability to handle vast amounts of data in more efficient ways. Google, for example, is using AI technology to cool its data centers, reducing demand for power by 40%.


On Stocks And Machine Learning

Undoubtedly, the “cognitive biases” described by Kahneman and Tversky affect the decisions of even the most experienced and famed stock analysts and portfolio managers in the world. Specifically, the “Confirmation Bias” may lead analysts to purchase stocks that are well-known, popular and “juicy”. Analysts are usually swamped with information and data on the companies they follow, which can inflate their confidence in their analysis of those companies’ stocks. The “Anchoring Bias” makes it difficult for an analyst to sell a stock he purchased, even after discovering that his original analysis of the stock’s performance was wrong. The “Representational Bias” may also lead analysts into poor investments: when investigating a company’s history and profile, the analyst tends to assume that those parameters will repeat themselves in the future. This assumption ignores the “reversion to the mean” phenomenon, which is typical of financial markets and the economy in general.
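The “reversion to the mean” the author invokes can be made concrete with a toy simulation: a process whose next value is pulled back toward a long-run average, so an unusually good run tends to be followed by more ordinary results. This is a generic AR(1)-style sketch with illustrative parameters, not a model of any real market.

```python
import random

def simulate_mean_reverting_returns(mu=0.05, speed=0.5, sigma=0.10,
                                    start=0.30, years=10, seed=42):
    """Toy mean-reverting model: each year's return drifts back toward
    the long-run mean `mu` at rate `speed`, plus Gaussian noise."""
    random.seed(seed)
    r = start
    path = []
    for _ in range(years):
        r = r + speed * (mu - r) + random.gauss(0, sigma)
        path.append(r)
    return path

# A firm that just returned 30% tends, on average, to drift toward 5%,
# so extrapolating its past profile forward overstates future returns.
path = simulate_mean_reverting_returns()
print([round(r, 3) for r in path])
```

With the noise switched off (`sigma=0`), the path converges geometrically to `mu`, which is the pure reversion effect the excerpt says the “Representational Bias” ignores.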


Data management roles: Data architect vs. data engineer, others


How do these data management roles compare? Data architects design and help implement database systems and other repositories for corporate data, Bowers said. They're also responsible for ensuring that organizations comply with internal and external regulations on data, and for evaluating and recommending new technologies, he added. Bowers described a data architect as a "know-it-all" who has to be familiar with different databases and other data management tools, as well as use cases, technology costs and limitations, and industry trends. "I had to master a ton of technologies to become a data architect," he said. A data modeler identifies business rules and entities in data sets and designs data models for databases and other systems to help reduce data redundancy and improve data integration, according to Bowers. Data modelers make less money on average than many other IT workers, but you get what you pay for, he cautioned.


Data Management No Longer an IT Issue

The next-generation data management platform needs to treat data differently. It needs to see data as a liquid core asset – not a static one – that can be quickly ingested, stored in the most appropriate formats and locations, and easily accessed by any analytical processing engine. The data architecture should be flexible, scalable, high-performance, integrated, and secure. But this does not mean you need to create an entirely new enterprise data platform, according to Han. "The core components are still the same – applications, middleware, database, analytics, and systems. However, when we build the new data architecture on top of the existing framework, we must be aware that there are new access points, like mobile and IoT, for collecting data today, which did not exist 15 years ago. There is also a huge abundance of data that comes in a variety of formats today. So, the question is, how can we integrate them all?" Oracle’s Big Data SQL, an end-to-end big data and AI platform, looks at all this data in unison and integrates it to maximize its value.
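The idea of presenting data in varied formats through one query surface can be sketched generically — this is not Oracle's Big Data SQL, just a minimal illustration using Python's standard library, with hypothetical CSV and JSON feeds standing in for a legacy export and an IoT access point:

```python
import csv
import io
import json
import sqlite3

# Hypothetical feeds: a CSV export from a legacy application and a
# JSON stream from a mobile/IoT access point.
csv_feed = "device_id,reading\nA1,20.5\nB2,21.0\n"
json_feed = '[{"device_id": "C3", "reading": 19.8}]'

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (device_id TEXT, reading REAL, source TEXT)")

# Ingest both formats into one relational view.
for row in csv.DictReader(io.StringIO(csv_feed)):
    conn.execute("INSERT INTO readings VALUES (?, ?, 'legacy_csv')",
                 (row["device_id"], float(row["reading"])))
for rec in json.loads(json_feed):
    conn.execute("INSERT INTO readings VALUES (?, ?, 'iot_json')",
                 (rec["device_id"], rec["reading"]))

# A single SQL query now spans both sources.
avg = conn.execute("SELECT AVG(reading) FROM readings").fetchone()[0]
print(round(avg, 2))  # → 20.43
```

The point of the sketch is architectural: the sources keep their native formats at ingestion time, and the unifying layer — here a throwaway in-memory SQLite table — is what lets any analytical engine see them in unison.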



Quote for the day:


"Leadership is, among other things, the ability to inflict pain and get away with it - short-term pain for long-term gain." -- George Will