Daily Tech Digest - September 20, 2019

Digitalization: Welcome to the City 4.0

Applied to cities, digitalization can not only improve efficiency by minimizing wasted time and resources, but also simultaneously improve a city’s productivity, secure growth, and drive economic activity. The Finnish capital of Helsinki is currently in the process of proving this. An early adopter of smart city technology and modeling, it launched the Helsinki 3D+ project to create a three-dimensional representation of the city using reality capture technology provided by the software company Bentley Systems for geocoordination, evaluation of options, modeling, and visualization. The project’s aim is to improve the city’s internal services and processes and provide data for further smart city development. Upon completion, Helsinki’s 3-D city model will be shared as open data to encourage commercial and academic research and development. Thanks to the available data and analytics, the city will be able to drive its green agenda in a way that is much more focused on sustainable consumption of natural resources and a healthy environment.



How to decommission a data center

"They need to know what they have. That’s the most basic. What equipment do you have? What apps live on what device? And what data lives on each device?” says Ralph Schwarzbach, who worked as a security and decommissioning expert with Verisign and Symantec before retiring. All that information should be in a configuration management database (CMDB), which serves as a repository for configuration data pertaining to physical and virtual IT assets. A CMDB “is a popular tool, but having the tool and processes in place to maintain data accuracy are two distinct things," Schwarzbach says. A CMDB is a necessity for asset inventory, but “any good CMDB is only as good as the data you put in it,” says Al DeRose, a senior IT director responsible for infrastructure design, implementation and management at a large media firm. “If your asset management department is very good at entering data, your CMDB is great. [In] my experience, smaller companies will do a better job of assets. Larger companies, because of the breadth of their space, aren’t so good at knowing what their assets are, but they are getting better.”


The Problem With “Cloud Native”

The problem is thinking about and creating a common understanding around a change that big. Here the industry does itself no favors. For years, many people thought cloud technology was somehow part of the atmosphere itself. In reality, few things are so very physical: big public cloud computing vendors like Amazon Web Services, Microsoft Azure, and Google Cloud each operate globe-spanning systems, with millions of computer servers connected by hundreds of thousands of miles of fiber-optic cable. Most people now know the basics of cloud computing, but understanding it remains a problem. Take a currently popular term, “cloud native.” Information technologists use it to describe strategies, people, teams, and companies that “get” the cloud and use it to maximum effect. Others use it to describe an approach to building, deploying, and managing things in a cloud computing environment. Usage differs from person to person. Whether it’s referring to people or software, “cloud native” is shorthand for operating with the fullest power of the cloud.


Why You Need a Cyber Hygiene Program

Well-known campaigns and breaches either begin or are accelerated by breakdowns in the most mundane areas of security and system management. Unpatched systems, misconfigured protections, overprivileged accounts and pervasively interconnected internal networks all make the initial intrusion easier and make the lateral spread of an attack almost inevitable. I use the phrase “cyber hygiene” to describe the simple but overlooked security housekeeping that ensures visibility across the organization’s estate, that highlights latent vulnerability in unpatched systems and that encourages periodic review of network topologies and account or role permissions. These are not complex security tasks like threat hunting or forensic root cause analysis; they are simple, administrative functions that can provide value far in excess of more expensive and intrusive later-stage security investments. ... The execution of most cyber hygiene tasks falls squarely on the shoulders of the IT, network and support teams.
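
A sketch of the simple, administrative checks described above: flagging hosts that have gone unpatched for too long and accounts that have accumulated too many roles. The data layout, patch window and role threshold are assumptions for illustration; in practice this data would come from patch management and identity systems.

```python
from datetime import date, timedelta

# Illustrative inventory data (assumed format).
hosts = [
    {"name": "web-01", "last_patched": date(2019, 9, 1)},
    {"name": "db-01", "last_patched": date(2019, 3, 15)},
]
accounts = [
    {"user": "alice", "roles": ["developer"]},
    {"user": "svc-report", "roles": ["developer", "db-admin", "domain-admin"]},
]

PATCH_WINDOW = timedelta(days=90)   # assumed policy: patch at least quarterly
MAX_ROLES = 2                       # assumed threshold for "overprivileged"

stale = [h["name"] for h in hosts
         if date.today() - h["last_patched"] > PATCH_WINDOW]
overprivileged = [a["user"] for a in accounts if len(a["roles"]) > MAX_ROLES]

print("Hosts overdue for patching:", stale)
print("Accounts to review for excess privilege:", overprivileged)
```

Nothing here is sophisticated, which is the point: run periodically, checks like these surface the latent exposure the article describes before an attacker does.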


A Beginner's Guide to Microsegmentation

Image: knssr via Adobe Stock
Security experts overwhelmingly agree that visibility issues are the biggest obstacles that stand in the way of successful microsegmentation deployments. The more granular the segments, the better the IT organization needs to understand exactly how data flows and how systems, applications, and services communicate with one another. "You not only need to know what flows are going through your route gateways, but you also need to see down to the individual host, whether physical or virtualized," says Jarrod Stenberg, director and chief information security architect at Entrust Datacard. "You must have the infrastructure and tooling in place to get this information, or your implementation is likely to fail." This is why any successful microsegmentation effort needs to start with a thorough discovery and mapping process. As part of that, organizations should either dig up or develop thorough documentation of their applications, says Stenberg, who explains that documentation will be needed to support all future microsegmentation policy decisions and to ensure each app keeps working the way it is supposed to.
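
A sketch of that discovery and mapping step: aggregate host-level flow records into a map of which systems talk to which, then use the observed flows as the starting point for an allow-list policy. The flow-record format and host names are assumptions; real deployments pull this from flow logs or host agents.

```python
from collections import defaultdict

# Assumed flow records: (source host, destination host, destination port),
# e.g. exported from flow logs or host-based agents.
flows = [
    ("web-01", "app-01", 8080),
    ("app-01", "db-01", 5432),
    ("web-01", "db-01", 5432),   # unexpected direct path worth reviewing
]

talks_to = defaultdict(set)
for src, dst, port in flows:
    talks_to[src].add((dst, port))

# Candidate microsegmentation policy: allow only the observed flows,
# deny everything else by default.
for src, peers in sorted(talks_to.items()):
    for dst, port in sorted(peers):
        print(f"allow {src} -> {dst}:{port}")
```

Documentation of the applications then confirms which of the observed flows are legitimate (app-01 to the database) and which should be cut (the web tier reaching the database directly).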


Cryptomining Botnet Smominru Returns With a Vengeance

Smominru uses a number of methods to compromise devices. For example, in addition to exploiting the EternalBlue vulnerability found in certain versions of Windows, it uses brute-force attacks against MS-SQL, Remote Desktop Protocol and Telnet, according to the Guardicore report. Once the botnet compromises the system, a PowerShell script named blueps.txt is downloaded onto the machine to run a number of operations, including downloading and executing three binary files - a worm downloader, a Trojan and a Master Boot Record (MBR) rootkit, Guardicore researchers found. Malicious payloads move through the network via the worm module. The PcShare open-source Trojan has a number of jobs, including acting as the command-and-control channel, capturing screenshots and stealing information, and most likely downloading a Monero cryptominer, the report notes. The group behind the botnet uses almost 20 scripts and binary payloads in its attacks. Plus, it uses various backdoors in different parts of the attack, the researchers report. The botnet maintains its foothold through newly created users, scheduled tasks, Windows Management Instrumentation objects and services that run when the system boots, Guardicore reports.
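
As a rough illustration of how a defender might sweep a Windows host for the indicators described above, the sketch below looks for the dropped blueps.txt script and dumps scheduled tasks for review. The search directories are assumptions, not taken from the Guardicore report; a real hunt would use the full set of published indicators.

```python
import os
import subprocess

# Indicator filename from the Guardicore report; search locations are assumed.
INDICATOR_FILE = "blueps.txt"
SEARCH_ROOTS = [r"C:\Windows\Temp", r"C:\Users"]

def find_dropped_script():
    """Walk the assumed directories looking for the dropped PowerShell script."""
    hits = []
    for root in SEARCH_ROOTS:
        for dirpath, _dirs, files in os.walk(root):
            if INDICATOR_FILE in files:
                hits.append(os.path.join(dirpath, INDICATOR_FILE))
    return hits

def list_scheduled_tasks():
    """schtasks is the standard Windows utility; its output still needs human review."""
    out = subprocess.run(["schtasks", "/query", "/fo", "csv"],
                         capture_output=True, text=True)
    return out.stdout.splitlines()

if __name__ == "__main__":
    for path in find_dropped_script():
        print("possible Smominru artifact:", path)
    print(f"{len(list_scheduled_tasks())} scheduled task entries to review")
```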


How to prevent lingering software quality issues


To build in quality, he advocates that IT undertake systematic approaches to software testing. In manufacturing, building in quality entails designing a process that helps improve the final product, while in IT that approach is about producing a higher-quality application. Yet, software quality and usability issues are, in many ways, harder to diagnose than problems in physical goods manufacturing. "In manufacturing, we can watch a product coming together and see if there's going to be interference between different parts," Gruver writes in the book. "In software, it's hard to see quality issues. The primary way that we start to see the product quality in software is with testing. Even then, it is difficult to find the source of the problem." Gruver recommends that software teams put together a repeatable deployment pipeline, which enables them to have a "stable quality signal" that informs the relevant parties as to whether the amount of variation in performance and quality between software builds is acceptable.
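
A small sketch of the "stable quality signal" idea: compare the current build's test and performance metrics against a baseline and stop the pipeline stage if the variation exceeds an agreed tolerance. The metric names and thresholds are made up for illustration and are not from Gruver's book.

```python
# Assumed metrics gathered by the deployment pipeline for each build.
baseline = {"pass_rate": 0.99, "p95_latency_ms": 180}
current = {"pass_rate": 0.97, "p95_latency_ms": 205}

# Agreed tolerances; variation beyond these means the quality signal
# is not acceptable and the build should not be promoted.
tolerance = {"pass_rate": 0.01, "p95_latency_ms": 20}

def quality_gate(baseline, current, tolerance):
    """Return a list of regressions between the baseline and the current build."""
    failures = []
    if baseline["pass_rate"] - current["pass_rate"] > tolerance["pass_rate"]:
        failures.append("test pass rate regressed")
    if current["p95_latency_ms"] - baseline["p95_latency_ms"] > tolerance["p95_latency_ms"]:
        failures.append("p95 latency regressed")
    return failures

problems = quality_gate(baseline, current, tolerance)
if problems:
    raise SystemExit("quality gate failed: " + "; ".join(problems))
print("quality signal acceptable; promote the build")
```

Because the same gate runs on every build in the repeatable pipeline, drifts in quality become visible as soon as they appear instead of surfacing late, which is the testing discipline the excerpt argues for.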


The arrival of 'multicloud 2.0'

What’s helpful about the federated Kubernetes approach is that this architecture makes it easy to deal with multiple clusters running on multiple clouds. It does so using two major building blocks. First is the capability to sync resources across clusters. As you may expect, this is the core challenge for those deploying multicloud Kubernetes: mechanisms within Kubernetes can automatically sync deployments across multiple clusters running on many public clouds. Second is intercluster discovery, meaning the capability to automatically configure DNS servers and load balancers with backends supporting all clusters running across many public clouds. The benefits of leveraging multicloud/federated Kubernetes include high availability, since you can replicate active/active clusters across multiple public clouds; if one has an outage, the other can pick up the processing without missing a beat. You also avoid the dreaded provider lock-in, since Kubernetes is the abstraction layer that removes you from the complexities and native details of each public cloud provider.
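
A hedged sketch of the first building block, syncing a resource across clusters, using the official Python Kubernetes client to apply the same Deployment to every configured cluster context. The kubeconfig context names are placeholders; real federation tooling (KubeFed, for example) automates this plus the cross-cluster DNS and load-balancer wiring described above.

```python
from kubernetes import client, config

# Placeholder kubeconfig contexts, one per public cloud.
CLUSTER_CONTEXTS = ["aws-prod", "azure-prod", "gcp-prod"]

def make_deployment(name="hello", image="nginx:1.17", replicas=2):
    """Build a minimal Deployment object to replicate everywhere."""
    container = client.V1Container(name=name, image=image)
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": name}),
        spec=client.V1PodSpec(containers=[container]))
    spec = client.V1DeploymentSpec(
        replicas=replicas,
        selector=client.V1LabelSelector(match_labels={"app": name}),
        template=template)
    return client.V1Deployment(metadata=client.V1ObjectMeta(name=name), spec=spec)

def sync_everywhere(deployment, namespace="default"):
    """Apply the same Deployment to each cluster context in turn."""
    for ctx in CLUSTER_CONTEXTS:
        config.load_kube_config(context=ctx)   # switch to the next cluster
        apps = client.AppsV1Api()
        apps.create_namespaced_deployment(namespace=namespace, body=deployment)
        print(f"applied {deployment.metadata.name} to {ctx}")

if __name__ == "__main__":
    sync_everywhere(make_deployment())
```

With active/active copies of the workload in every cloud, an outage in one provider leaves the others serving traffic, which is the availability argument made above.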


Microservices With Node.js: Scalable, Superior, and Secure Apps

Node.js is designed to make building highly scalable apps easier through its non-blocking I/O and event-driven model, which makes it well suited to data-centric and real-time apps such as real-time collaboration tools, streaming and networking apps, and data-intensive applications. Microservices, on the other hand, make it easy for developers to create smaller services that are scalable, independent, and loosely coupled, and are very suitable for complex, large enterprise applications. The nature and goal of the two concepts are aligned at the core, making them a good fit for each other. Used together, they can power highly scalable applications and handle thousands of concurrent requests without slowing down the system. Microservices and Node.js have also fed a culture like DevOps, where frequent, faster deliveries are of more value than the traditional long development cycle. Microservices are closely associated with container orchestration — in other words, microservices are typically managed by a container platform — offering a modern way to design, develop, and deploy software.
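
To keep the code samples in this digest in one language, here is the same event-driven, non-blocking pattern sketched with Python's asyncio rather than Node.js: a single event loop serves many concurrent connections, and while one request waits on I/O the loop handles others. In Node.js the equivalent is an http server whose handlers use asynchronous I/O natively; this is only an analogue of that model, not Node.js code.

```python
import asyncio

async def handle(reader, writer):
    await reader.readline()          # read the request line, non-blocking
    # Simulate a non-blocking call to a database or downstream service;
    # while this request waits, the event loop serves other connections.
    await asyncio.sleep(0.05)
    writer.write(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")
    await writer.drain()
    writer.close()

async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 8080)
    async with server:
        await server.serve_forever()

if __name__ == "__main__":
    asyncio.run(main())
```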


Supply Chain Attacks: Hackers Hit IT Providers

Symantec says the group has hit at least 11 organizations, mostly in Saudi Arabia, and appears to have gained admin-level access to at least two organizations as part of its efforts to parlay hacks of IT providers into the ability to hack their many customers. In those two networks, it notes, attackers had managed to infect several hundred PCs with malware called Backdoor.Syskit. "This is an unusually large number of computers to be compromised in a targeted attack," Symantec's security researchers say in a report. "It is possible that the attackers were forced to infect many machines before finding those that were of most interest to them." Backdoor.Syskit is a Trojan, written in Delphi and .NET, that's designed to phone home to a command-and-control server and give attackers remote access to the infected system so they can push and execute additional malware on the endpoint, according to Symantec. The security firm first rolled out an anti-virus signature for the malware on Aug. 21. Symantec says attackers have in some cases also used PowerShell backdoors - also known as a living off the land attack, since it's tough to spot attackers' use of legitimate tools.



Quote for the day:


"A culture of discipline is not a principle of business; it is a principle of greatness." -- Jim Collins


Daily Tech Digest - September 19, 2019

Space internet service closer to becoming reality

Interestingly, though, a SpaceX filing made with the U.S. Federal Communications Commission (FCC) at the end of August seeks to modify its original FCC application because of results it discovered in its initial satellite deployment. SpaceX is now asking for permission to “re-space” previously authorized, yet unlaunched satellites. The company says it can optimize its constellation better by spreading the satellites out more. “This adjustment will accelerate coverage to southern states and U.S. territories, potentially expediting coverage to the southern continental United States by the end of the next hurricane season and reaching other U.S. territories by the following hurricane season,” the document says. Satellite internet is used extensively in disaster recovery. Should SpaceX's request be approved, it will speed up service deployment for the continental U.S. because fewer satellites will be needed. Because we are currently in a hurricane season (Atlantic basin hurricane seasons last from June 1 to Nov. 30 each year), one can assume they are talking about services at the end of 2020 and the end of 2021, respectively.



Windows Defender malware scans are failing after a few seconds

The issue has been widely reported over the past two days on the Microsoft tech support forums, Reddit, and tech support sites like AskWoody, DeskModder, BornCity, and Bleeping Computer. The bug impacts Windows Defender version 4.18.1908.7 and later, released earlier this week. The bug was introduced while Microsoft tried to fix another bug introduced with the July 2019 Patch Tuesday. Per reports, the original bug broke "sfc /scannow," a command that is part of the Windows System File Checker utility that lets Windows users scan and fix corrupted files. After the July Patch Tuesday, this utility started flagging some of Windows Defender's internal modules as corrupted, resulting in incorrect error messages that fooled admins into believing there was something wrong with their Windows Defender installation and its updates. Microsoft announced a fix for the System File Checker bug in August, but the actual patch was delayed. When the fix arrived earlier this week, it didn't yield the expected results.


What does upstream and downstream development even mean?


If the flow of data goes toward the original source, that flow is upstream. If the flow of data goes away from the original source, that flow is downstream. ... The idea that either upstream or downstream could be superior depends on the commit. Say, for example, the developer of Application B makes a change to the application that adds a new feature unique to B. If this feature has no bearing on Application A, but does have a use in Application D, the only logical flow is downstream. If, on the other hand, the developer of Application D submits a change that would affect all other applications, then the flow should be upstream to the source (otherwise, the change wouldn't make it to applications B or C). ... An upstream flow of data has one major benefit (besides all forks gaining access to the commit). Let's say you're the developer of Application B and you've made a change to the core of the software. If you send that change downstream, you and the developer of D will benefit. However, when the developer of Application A makes a different change to the core of the software, and that change is sent downstream, it could overwrite the commit in Application B.


Soft Skills: Controlling your career

Projecting positivity is also a soft skill. The reality is that a busy IT department will achieve a lot and there is much to focus on. Of the technical people I know, most are passionate about what they do. Passion drives excellence, but it also has a dark side that we see manifest in various IT "religious wars". It narrows the focus, closes the mind and prevents us from acknowledging any evidence that contradicts our beliefs. Passion is also a big turn-off for senior executives, who tend to prefer calmness. It is difficult to get the balance right between passion and dispassion. The best advice I have been given is that it is OK to hold strong opinions but important to hold them loosely. By all means be passionate and use it to drive you to put forward the best possible case for your chosen subject, but accept that others will have equally passionate views and either, or both, of you may be wrong. If you are not passionate then you won't put forward convincing arguments or test hypotheses with sufficient rigour.


Creating ASP.NET Core Application with Docker Support

A Docker container packages the operating system, source code, environment variables (if any) and dependent components needed to run the software. So, if anyone wants to run your software, they can simply take the container and get started, without spending effort setting up the machine to make things work. ... You have probably heard developers say, “It works fine on my machine – I don’t know what is missing on yours, or why the same software won’t run there.” Such discussions usually pop up during the testing phase, and in my experience it sometimes takes hours to identify that one small missed dependency. Here, Docker comes to the rescue. With containerization, each and every dependency is packed into the container, which is available for both Linux and Windows, so everyone using the software has the same environment. The concept of Docker has essentially eliminated the problem of mismatched environments. Isn’t it amazing?


Why businesses would rather lose revenue than data


A big reason for cybersecurity issues is the lack of IT talent in SMBs, the report found. Half of businesses said they only provide a one-time security awareness IT training to staff. To solve for the skills gap, a third of companies (33%) said they currently outsource some of their IT activities, and another 40% said they plan to do so.  Regardless, SMBs need a plan. "With regards to addressing security concerns, it's important to have several layers of security so that there's no way an outside 'silver bullet' can penetrate a system," Claudio said. "Making sure staff are aware of potential security threats, like phishing scams, is also crucial as they will usually be your first line of defense. Patch management and vulnerability assessment are also mission critical." ...  "To support business continuity, it's important to have a great backup and disaster recovery program including off-site data copy in the event of an emergency," Claudio noted. "Again, making sure you have access to the right IT resources and skill sets by utilizing a trusted outsourced service provider is essential."


Oracle goes all in on cloud automation

Talk to the cloud: Oracle rolls out more conversational interfaces at OpenWorld 2019
“Digital assistants and conversational UI are going to transform the way we interact with these applications, and just make things a lot easier to deal with,” Miranda says. They will also enable supply chain managers to check on delivery status, track deviations and report incidents, Oracle’s goal being to enable root-cause analysis of supply chain problems via the chat interface. In HR, Oracle HCM Cloud will chat with employees about onboarding and accessing their performance evaluations, while sales staff will be able to configure quotes using voice commands, Oracle says. Oracle and Amazon are famously combative, but Oracle is starting to adopt the same terminology Amazon uses for its Alexa virtual assistant, referring to extended dialogs to accomplish a goal as “conversations” and tasks that its digital assistants can help with as “skills.” R. “Ray” Wang, founder and principal analyst at Constellation Research, says Oracle’s effort to weave AI into all its apps is paying off. ... “It’s the long-term performance improvement of feedback loops. The next best actions are more than rudimentary. Think of the Digital Assistants plus Intelligent Document Recognition, and predictive planning as all tools to help drive more automation and augmented decisions in enterprise apps.”


Strengthen Distributed Teams with Social Conversations

"Cognitive trust is based on the confidence you feel in another person’s accomplishments, skills, and reliability while affective trust, arises from feelings of emotional closeness, empathy, or friendship." In your team, trust might be developed and sustained between individuals in different ways. Some of you will be looking out for how much others fulfill their offer of help, whether they deliver their work on time, and if their work is of high quality. Meanwhile, others will be looking for a more personal or social connection, looking for things they have in common with others—which is easier to find out during real-time conversations. Getting to know each other well requires having a mental image of the person, hearing their voice, seeing their facial expressions, and online meetings can help us achieve this. In this article, I suggest two ways to use meetings to strengthen your team relationships—incorporate social conversations into your scheduled meetings and hold online meetings for the specific purpose of reconnecting as colleagues.


DevSecOps veterans share security strategy, lessons learned


Once DevOps and IT security teams are aligned, the most important groundwork for improved DevOps security is to gather accurate data on IT assets and the IT environment, and give IT teams access to relevant data in context, practitioners said. "What you really want from [DevSecOps] models is to avoid making assumptions and to test those assumptions, because assumptions lead to vulnerability," Vehent said, recalling an incident at Mozilla where an assumption about SSL certificate expiration data brought down Mozilla's add-ons service at launch. ... Once a strategy is in place, it's time to evaluate tools for security automation and visibility. Context is key in security monitoring, said Erkang Zheng, chief information security officer at LifeOmic Security, a healthcare software company, which also markets its internally developed security visibility tools as JupiterOne. "Attackers think in graphs, defenders think in lists, and that's how attackers win," Zheng said during a presentation here. "Stop thinking in lists and tables, and start thinking in entities and relationships."
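
A toy illustration of Zheng's "think in graphs" point: represent assets as entities and their relationships as edges, then ask path questions ("can the internet reach the database?") instead of scanning flat lists. The entity names are invented for the sketch; products like JupiterOne model this far more richly.

```python
from collections import defaultdict, deque

# Entities and their relationships (edges), invented for illustration.
edges = [
    ("internet", "load-balancer"),
    ("load-balancer", "web-app"),
    ("web-app", "user-db"),
    ("ci-server", "prod-cluster"),
]

graph = defaultdict(list)
for src, dst in edges:
    graph[src].append(dst)

def reachable(start, target):
    """Breadth-first search: is there any path from start to target?"""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == target:
            return True
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(reachable("internet", "user-db"))    # True: an exposure path exists
print(reachable("internet", "ci-server"))  # False: no path in this model
```

A flat list of assets would show the database as "internal"; the graph shows the chain of relationships that actually exposes it, which is the attacker's view.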


Cisco spreads ACI to Microsoft Azure, multicloud and SD-WAN environments

Key new pieces of ACI Anywhere include the ability to integrate Microsoft Azure clouds and a cloud-only implementation of ACI. Cisco has been working closely with Microsoft, and while previewing the Azure cloud support earlier this year it has also added Azure Kubernetes Service (AKS) to the managed services that natively integrate with the Cisco Container Platform. With the Azure cloud extension, the service uses the Cisco Cloud APIC, which runs natively in the Azure public cloud to provide automated connectivity, policy translation and enhanced visibility of workloads in the public cloud, Cisco said. With new Azure extensions, customers can tap into cloud workloads through ACI integrations with Azure technologies like Azure Monitor, Azure Resource Health and Azure Resource Manager to fine-tune their network operations for speed, flexibility and cost, Cisco stated. As part of the Azure package, the Cisco Cloud Services Router (CSR) 1000V brings connectivity between on-premises and Azure cloud environments.




Quote for the day:

"The leadership team is the most important asset of the company and can be its worst liability" -- Med Jones


Daily Tech Digest - September 18, 2019

The Seven Patterns Of AI

From autonomous vehicles, predictive analytics applications, facial recognition, to chatbots, virtual assistants, cognitive automation, and fraud detection, the use cases for AI are many. However, regardless of the application of AI, there is commonality to all these applications. Those who have implemented hundreds or even thousands of AI projects realize that despite all this diversity in application, AI use cases fall into one or more of seven common patterns. The seven patterns are: hyperpersonalization, autonomous systems, predictive analytics and decision support, conversational/human interactions, patterns and anomalies, recognition systems, and goal-driven systems. Any customized approach to AI is going to require its own programming and pattern, but no matter what combination these trends are used in, they all follow their own pretty standard set of rules. ... While these might seem like discrete patterns that are implemented individually in typical AI projects, in reality, we have seen organizations combine one or more of these seven patterns to realize their goals. Thinking of AI projects in terms of these patterns will help companies better approach, plan, and execute them. In fact, emerging methodologies are focusing on the use of these seven patterns as a way to expedite AI project planning.



Aliro aims to make quantum computers usable by traditional programmers


Stages of quantum computing progress are generally described in terms of quantum supremacy and quantum volume. Quantum supremacy—the threshold at which quantum computers are theorized to be capable of solving problems that traditional computers would not (practically) be able to solve—is likely decades away. Quantum volume, a metric that "enables the comparison of hardware with widely different performance characteristics and quantifies the complexity of algorithms that can be run," according to IBM, has gained acceptance from NIST and analyst firm Gartner as a useful metric. Aliro proposes the idea of "quantum value" as the point at which organizations using high performance computing today can achieve results from using quantum computers to accelerate their workload. "We're dealing with enterprises that want to get business value from these machines. ... We're not ready for many levels of abstraction above the quantum hardware, but we're ready for a little bit. When you get down to the equivalent of the machine language, these things are very, very different, and it's not just what kind of qubits they are. It's noise characteristics, it's connectivity," Ricotta said. "Rigetti and IBM Q machines both use superconducting Josephson junctions around the same number—approximately the same order of magnitude of qubits—but they are connected in different ways ..."


New hacking group targets IT companies in first stage of supply chain attacks


In two of the attacks, researchers found that hundreds of computers were compromised with malware, indicating that the attackers were simply infecting all the machines they could throughout the organisations in order to find key targets. The most recently recorded activity from Tortoiseshell was in July 2019, with attacks by the group identified by a unique custom payload: Backdoor.Syskit. This malware is built in both Delphi and .NET programming languages and secretly opens an initial backdoor onto compromised computers, allowing attackers to collect information including the IP address, the operating system version and the computer name. Syskit can also download and execute additional tools and commands, and Tortoiseshell attacks also deploy several publicly available tools as information stealers to gather data on user activity. While it remains uncertain how the malware is delivered, researchers suggest that it could potentially be distributed via a compromised web server, because in one instance the first indication of malware on the network was a compromised web shell – something that can provide an easy way into a targeted network.


How Ransomware Criminals Turn Friends into Enemies

As someone whose job it is to learn as much as possible about the online criminal ecosystem, I often spot trends before they make mainstream headlines. This type of attack was high on my list of attacks likely to increase. Supply chain attacks aren't new. They've been increasing in frequency, however, and gaining more attention. While there are many types of supply chain attacks, this particular type — compromising a service provider to gain access to its customers — is becoming more popular among skilled ransomware crews. ... Managing IT can be hard, especially for small and midsize businesses lacking the necessary resources. It probably seemed like a great idea for these small dental practices to outsource IT to Digital Dental Record. They're not alone. The managed services industry is growing extremely fast with businesses struggling to manage the technology required to run a modern establishment. With attacks on MSPs on the rise, MSPs need to step up their security game, regardless of the kind of specialized services they provide.


AI in cyber security: a necessity or too early to introduce?

Dr Leila Powell, lead security data scientist from Panaseer, agrees that “the key challenge for most security teams right now is getting hold of the data they need in order to get even a basic level of visibility on the fundamentals of how their security program is performing and how they measure up against regulatory frameworks like GDPR. This is not a trivial task! “With access to security relevant data controlled by multiple stakeholders from IT to MSSPs and tool vendors there can be a lot of red tape on top of the technical challenges of bringing together multiple siloed data sources. Then there’s data cleaning, standardisation, correlation and understanding — which often require a detailed knowledge of the idiosyncrasies of all the unique datasets. “As it stands, once all that work has gone in to data collection, the benefits of applying simple statistics cannot be underestimated. These provide plenty of new insights for teams to work through — most won’t even have the resources to deal with all of these, let alone additional alerting from ML solutions.
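
As the quote suggests, simple statistics on cleaned and correlated data already yield actionable insight without any machine learning. A minimal sketch of that idea, assuming asset records have already been standardised and joined with patch data; the field names and figures are invented.

```python
from collections import defaultdict

# Correlated records after cleaning/standardisation: one row per asset.
assets = [
    {"unit": "retail", "patched": True},
    {"unit": "retail", "patched": False},
    {"unit": "payments", "patched": True},
    {"unit": "payments", "patched": True},
    {"unit": "payments", "patched": False},
]

totals = defaultdict(lambda: [0, 0])   # business unit -> [patched, total]
for a in assets:
    totals[a["unit"]][1] += 1
    if a["patched"]:
        totals[a["unit"]][0] += 1

for unit, (patched, total) in sorted(totals.items()):
    print(f"{unit}: {patched / total:.0%} of assets patched ({patched}/{total})")
```

Even a per-unit coverage percentage like this gives a team a backlog to work through, which is the "plenty of new insights" point made above.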


2019 Digital operations study for energy

Looking ahead to the next five years, the picture improves somewhat and offers more hope for the utilities sector. For instance, of the EMEA utilities surveyed by Strategy&, 5 percent said they had already implemented AI applications and another 9 percent said they had piloted such programs. That compares with 20 percent and 6 percent, respectively, for chemicals companies. But through 2024, including planned technologies, AI adoption in the utilities sector may increase by another 15 percent, according to the survey, and that would be on par with chemicals companies and just below oil and gas AI implementation. ... Many utilities make the mistake of trying to implement too many ambitious digital strategies at the same time and end up spreading their financial and staff resources, as well as their capabilities, too thin. A better approach is to define the three to five critical digitization efforts that are strategically essential to defending and expanding competitive advantage among startups and established power companies.


Microsoft brings IBM iron to Azure for on-premises migrations

Under the deal, Microsoft will deploy Power S922 servers from IBM in an undeclared Azure region. These machines can run the PowerVM hypervisor, which supports legacy IBM operating systems, as well as Linux. "Migrating to the cloud by first replacing older technologies is time consuming and risky," said Brad Schick, CEO of Skytap, in a statement. "Skytap’s goal has always been to provide businesses with a path to get these systems into the cloud with little change and less risk. Working with Microsoft, we will bring Skytap’s native support for a wide range of legacy applications to Microsoft Azure, including those dependent on IBM i, AIX, and Linux on Power. This will give businesses the ability to extend the life of traditional systems and increase their value by modernizing with Azure services." As Power-based applications are modernized, Skytap will then bring in DevOps CI/CD toolchains to accelerate software delivery. After moving to Skytap on Azure, customers will be able to integrate Azure DevOps, in addition to CI/CD toolchains for Power, such as Eradani and UrbanCode.


Prepare for cloud security and shared responsibility


IT infrastructure teams typically control the platform from the ground up and through the OS layer. Admins work with security teams to ensure platforms are hardened and adhere to compliance needs. After the platform is built, infrastructure and security teams turn it over to the dev or application owners for final installations and deployments. Application owners still work with an infrastructure team to ensure security and compliance measures are maintained through the deployment process. Ideally, the platform gets a final verification from the security team. The same parties will still be involved and maintain that level of ownership and responsibility even if an organization uses automation. But this process gets upended when a cloud provider gets involved. AWS manages the hypervisor, hardware and, in some cases, the OS. This means the deployment process starts in the middle of the traditional application lifecycle rather than at the beginning. Admins have to find a way to contribute in an ecosystem where the infrastructure is run by another party.


Digital dexterity: What it is, why your organization needs it, and how CIOs can lead the charge


If you're not sure what digital dexterity is, you aren't alone. Craig Roth, Gartner Research vice president, explained it as "the ability and ambition to use technology for better business outcomes."  That definition can still seem a bit fuzzy if you aren't sure where ability and ambition come in to the successful use of tech in business, but digging down just a bit helps make the whole thing more understandable. Helen Poitevin, vice president and analyst at Gartner, expands the definition of digital dexterity by adding that it's less about tech skills and more about "a specific set of mindsets, beliefs and behaviors." ... So, where does the CIO fit into all of this? They're basically the cornerstone of the entire concept, said Daniel Sanchez Reina, senior director and analyst at Gartner. "The CIO will play a key role in supporting desired behaviors and changing the processes, procedures, policies and management practices that shape how work gets done to encourage desired behaviors." It can be tough to transform an entire organization from one that resists, or at the very least grudgingly accepts, new technology. CIOs have a tough road ahead of them, but that doesn't mean it's impossible.


New ransomware strain uses ‘overkill’ encryption to lock down your PC


FortiGuard Labs says that 2048- and 4096-bit keys are generally more than adequate to encrypt and secure messages, and so the use of an 8192-bit key is "overkill and inefficient for its purpose." "Using the longer key size adds a large overhead due to significantly longer key generation and encryption times [...] RSA-8192 can only encrypt 1024 bytes at a time, even less if we consider the reserved size for padding," the researchers note. "Since the configuration's size will surely be more than that due to the fact that it contains the encoded private key, the malware cuts the information into chunks of 1000 (0x3e8) bytes and performs multiple operations of the RSA-8192 until the entire information is encrypted." The heavy use of encryption means that it is "not practically possible" to decrypt a compromised system, according to the cybersecurity firm. This is unfortunate, as decryption programs offered by cybersecurity firms can sometimes be the only way to recover files lost to ransomware infections without paying up.
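
A sketch of the chunking behaviour described, using the Python cryptography library: an 8192-bit RSA key has a 1024-byte modulus, padding reserves part of that, so the plaintext is split into chunks (1000 bytes in the reported malware configuration) and encrypted piece by piece. Key generation at this size is noticeably slow, which is the "overkill" overhead the researchers mention. This is a benign illustration of the arithmetic, not the malware's code.

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Generating an 8192-bit key takes far longer than 2048- or 4096-bit keys.
key = rsa.generate_private_key(public_exponent=65537, key_size=8192)
public_key = key.public_key()

CHUNK = 1000               # bytes per RSA operation, as in the reported config
data = b"x" * 2500         # stand-in for the blob being protected

ciphertext_chunks = []
for i in range(0, len(data), CHUNK):
    # PKCS#1 v1.5 padding allows at most 1024 - 11 = 1013 plaintext bytes
    # per operation with an 8192-bit key, so 1000-byte chunks fit.
    ciphertext_chunks.append(
        public_key.encrypt(data[i:i + CHUNK], padding.PKCS1v15()))

print(len(ciphertext_chunks), "chunks of", len(ciphertext_chunks[0]), "bytes each")
```

Each chunk expands to a full 1024-byte ciphertext, so the scheme is wasteful in both time and space compared with the usual approach of encrypting data with a symmetric key and wrapping only that key with RSA.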



Quote for the day:


"Don't measure yourself by what you have accomplished. But by what you should have accomplished with your ability." -- John Wooden


Daily Tech Digest - September 17, 2019

Doing Digital Transformation Right

Image: WriteStudio - stock.adobe.com
If you are one of the organizations asking these questions, chances are that you are what McKinsey refers to as a "digital incumbent," defined as an incumbent business competing substantially in new ways through digitization (more than 20% of your business is digital and you are launching new digital businesses while transforming the core). If you fit into this category, you are already on the road to success. McKinsey said that digital incumbents are twice as likely as traditional incumbents to experience organic revenue growth of 25% or higher. McKinsey defines traditional incumbents as those that compete primarily in traditional, non-digital ways -- more than 80% of their business isn't digital. Top performers are making three "bold moves," ... "When companies digitize the core business, our research shows, strong IT capabilities help enormously," the report says. "According to respondents in a survey on the IT function's effectiveness, companies with top performance on core IT tasks have made more progress than other companies in becoming fully digital and mastering key digital activities."



APIs are not new. What has changed is how they are being adopted to facilitate open banking. Today, banks, their clients and their partners can share data and integrate ecosystems securely – and often in real time. In the corporate-to-bank space, the use of APIs is becoming mainstream. This is in part due to well-publicized success in cases such as the insurance industry (for new policies, renewals and claims handling) and in facilitating ride sharing, food delivery and other new consumer models. However, solutions are now emerging across industries – from the immediate release of a car at a showroom through to subscription models for consumer products. Ultimately, companies are seeking to create integrated, frictionless client-centric experiences. To date, APIs have been used mainly to enrich corporate-to-bank integration. However, there is now an opportunity to improve banks’ engagement with their financial institution clients. Today, much of this interaction takes place via Swift, which prescribes how often and what type of data can be exchanged.


Employees say companies use IT for pure profit, not worker empowerment

Upliftingly, 52.7% said they believed companies only introduced new technology if they could see it turning a profit, rather than if it would empower their employees to become more productive. Some might gaze at this and consider that, if the employees became more productive, their companies might be more profitable. It seems, though, that the employees aren't merely saying that they'd do more work better. In this study, you see, almost one-third said having better IT tools makes them happier. It's rarely wise to underestimate the value of a happy employee. In these times of relatively full employment, retaining cheerful employees would seem at least as important as squeezing out an additional 40 shillings of profit. Even employees with golden collars -- and, for all I know, golden cuffs -- know that they are threatened by the inevitable Dance of the Apocalypso, when artificial intelligence will be routinely preferred to its human counterpart. Especially because it'll be robots making those decisions. Amusingly, however, a recent survey showed that most people would rather be replaced by humans, just as they'd prefer their co-workers to be replaced by robots.


3 strategies to simplify complex networks

Networks generate massive amounts of data that can be useful for operating the environment. The problem is that people can’t analyze the data fast enough to understand what it means – but machines can. This is where network professionals must be willing to cede some control to the computers. The purpose of machine learning isn’t to replace people, but to be a tool to let them work smarter and faster. Juniper acquired Mist Systems earlier this year to provide machine learning based operations to Wi-Fi, which is a great starting point because Wi-Fi troubleshooting is very difficult. ... The long-term goal of network operations is akin to a self-driving car where the network runs and secures itself. However, like with a self-driving car, the technology isn’t quite there yet. In the auto industry, there are many automation features, such as parallel park assist and lane change alerts, that make drivers better. Similarly, network engineers can benefit by automating many of the mundane tasks associated with running a network, such as firmware upgrades, OS patching, and other things that need to be done but offer no strategic benefits.


Why Fintech is the Next Ad Industry Disruptor

FinTech and Digital Marketing
With in-store sales still accounting for 90% of all retail sales and U.S. consumers spending more than $3.1 trillion in offline sales last year, the real need for ad innovation is centered on better capturing in-store activity. Marketers are in desperate need of the same level of disruption e-commerce has seen, but applied to the physical world: a means to bridge their omni-channel marketing efforts and tie both online and offline tactics to trackable in-store purchases. Simultaneously, credit and debit cards have increasingly become consumers’ purchase method of choice. Even for consumers who have a complete distrust of banks, new options like prepaid debit cards have lowered the barrier to plastic. In fact, the total dollar value loaded on prepaid cards was expected to be $112 billion in 2018. These payment options are also impacting retailers. With more consumers using credit or debit cards, either by swipe or tied to their mobile wallets, cashless stores are taking root and becoming the norm. And new payment entrants – like Square, Stripe and Shopify – have streamlined and modernized POS systems, allowing retailers and merchants an easy rip-and-replace solution for their dated cash registers.


The Gap Between Strong Cybersecurity & Demands For Connectivity Is Getting Massive

Although outbreaks related to these two superbugs of the cyberworld originally happened over two years ago, a recent VxWorks advisory was issued to serve as a warning that millions of devices could fall victim to a similar outbreak soon. The reason is that the devices run on older versions of the Windows OS, which makes them unpatchable. Therefore, those traditional security solutions will not stand up for the fight. This latest announcement demonstrates how urgency is the course of action in these times, not complacency. Responsible IT leaders need to step away from their dependency on outmatched, overutilized technologies from yesteryear and beyond. 1995 was a simpler time, where firewalls and VPNs were state of the art and served our needs pretty well. But networking in 2019 is a whole new ball of wax. The operational technology and information technology that used to be separate are now converging. Achieving a successful hybrid IT/OT environment clearly includes a strong cybersecurity play, but that isn’t the only major area of concern.


How a PIA Can CYA

Image: adiruch na chiangmai via Adobe Stock
More than a diagnostic tool or compliance checklist, PIAs are essentially templated questionnaires that help organizations identify what their privacy risks are with the information they collect, use, or store, says Rebecca Herold, CEO of the Privacy Professor, a security consultancy. PIA templates typically have some combination of multiple choice and open-ended questions. While often administered quarterly, PIAs can be done more frequently or after a breach or suspicious incident. But mostly, PIAs help expose potential privacy issues that may get overlooked in the rush to market. Herold recalls an organization she worked with that developed a saliva test to detect concussions. Unlike doctors and hospitals that are subject to federal privacy protections, this organization was HIPAA-exempt and hadn't really thought through the ramifications of the data it wanted to collect. Not surprisingly, consumers became concerned about who'd be able to access the saliva test results.


How to handle anxiety as a tech professional


Anxiety is the most common form of mental health issues in the US, impacting 40 million adults every year, according to the Anxiety and Depression Association of America. While anxiety disorders are very treatable, that doesn't make them any easier to handle, especially in the workplace.  Handling anxiety in daily life is difficult enough, but is even more challenging when attempting to be productive in a formal working environment. Work-induced stress is one of the leading causes of anxiety disorders, the Mayo Clinic found. This anxiety can have a major impact on an employee's performance, oftentimes resulting in burnout. More than half of US employees (55%) experience burnout at work, a recent University of Phoenix study reported, and anxiety (67%) was cited as the no.1 cause. Negative work environments and task overload can make completing assignments unbearable, causing many employees to quit.  These stressors are sometimes even deeper for tech professionals in non-technical organizations, said Nina LaRosa, marketing director of workplace safety, health and HR online training company Moxie Media.


10 things you need to know about MU-MIMO Wi-Fi

Like with 11ac, wireless devices aren't required to have multiple antennas to receive MU-MIMO streams from wireless routers and APs. If the wireless device has only one antenna, it still can receive one MU-MIMO data stream from an AP. However, with uplink MU-MIMO, wireless devices are required to have a minimum of two antennas to transmit with MU-MIMO back to the AP or wireless router, even for one stream connections. More antennas would allow a device to support more simultaneous data streams (typically one stream per antenna), which would be good for the device's Wi-Fi performance. However, including multiple antennas in a device requires more power and space, and adds to its cost. It would take eight antennas to take full advantage of the 11ax features. ... Although legacy 11n and 11ac Wi-Fi devices won’t directly see any range or performance improvement of their connections to 11ax APs or wireless routers, they can see an indirect benefit. Remember, Wi-Fi is all about airtime: the faster any device is served, the more time there is for other devices.


Life After Snowden: US Still Lacks Whistleblowing Rules

Debate over the mass surveillance programs being run by the U.S. government and its Five Eyes partners continues. Snowden also revealed that the U.S. government - together with allied agencies, including Britain's GCHQ - was intercepting en masse worldwide data flowing to technology giants' data centers. Those revelations led Apple, Facebook, Google and Microsoft to begin encrypting user data by default as well as to design messaging services that are end-to-end encrypted (see: Crypto Wars Continue, as Feds Seek Messenger Backdoor). Debate over Snowden and the actions he took also continues. But some of those with intelligence experience say that for someone who witnesses wrongdoing at the NSA or another intelligence agency, there is no way to bring such wrongdoing to the attention of someone with oversight without opening themselves up to prosecution. "Did I break the law? Again, what's the question that's more important here? Was the law broken or was that the right thing to do?" Snowden tells CBS in a Monday interview.



Quote for the day:


"The key to successful leadership today is influence, not authority." -- @KenBlanchard


Daily Tech Digest - September 16, 2019

What is Computer Vision And The Amazing Ways It’s Used In Business

Many car manufacturers from Ford to Tesla are scrambling to get their version of the autonomous vehicle into mass production. Computer vision is a critical technology that makes autonomous vehicles possible. The systems on autonomous vehicles continuously process visual data, from road signs to vehicles and pedestrians on the road, and then determine what action to take. Computer vision in medicine helps in diagnosing disease and other ailments and extends the sight of surgeons during operations. There are now smartphone apps that allow you to diagnose skin conditions using the phone's camera. In fact, 90 percent of all medical data is image-based—X-rays, scans, etc.—and a lot of this data can now be analyzed using algorithms. Digital marketing: by using computers to sort and analyze millions of online images, marketers can bypass traditional demographic research and still target marketing to the right online audience, and do this work dramatically quicker than humans could. Marketers even use computer vision to ensure ads are not placed near content that is contradictory or problematic for its audience.



Research explores economic benefits of full-fibre and 5G at local level


“Knowledge-intensive sectors are shown to benefit most,” said the report. “Education and health sectors have also been shown to experience larger-than-average productivity impacts of increased connectivity… [and] there is also a likelihood for full-fibre and 5G in particular to lead to productivity improvements in industrial and manufacturing settings.” Therefore, an area with a particularly high density of knowledge workers will benefit more than an area with a relatively low density. Likewise, an area with a high concentration of manufacturing businesses, such as the West Midlands, where the UK’s first regional testbed for 5G is taking place, will benefit more than an area with a low concentration. “Many reports already estimate the benefits that full-fibre and 5G can bring to the UK economy,” said BSG CEO Matthew Evans. “But what does it mean for Manchester, Merthyr Tydfil or the Midlothian hills?


Brain hack devices must be scrutinised, say top scientists

 In future, "people could become telepathic to some degree" and being able to read someone else's thoughts raises ethical issues, experts said. This could become especially worrying if those thoughts were shared with corporations. ... Among the risks highlighted by the report was the idea of thoughts or moods being accessed by big corporations as well as the bigger question about whether such devices fundamentally change what it means to be human. Dr Tim Constandinou, director of the next generation neural Interfaces (NGNI) Lab, at Imperial College London and co-chair of the report, said: "By 2040 neural interfaces are likely to be an established option to enable people to walk after paralysis and tackle treatment-resistant depression, they may even have made treating Alzheimer's disease a reality. "While advances like seamless brain-to-computer communication seem a much more distant possibility, we should act now to ensure our ethical and regulatory safeguards are flexible enough for any future development. "In this way we can guarantee these emerging technologies are implemented safely and for the benefit of humanity."


Microservices Migration Use Cases

By migrating to microservices, IT will enable your teams to become more innovative, freeing them from the daily mundane tasks of supporting and developing on a legacy system that simply cannot compete in today's competitive world. The other primary benefit customers see is scale — an elastic environment that allows your business to auto-scale takes the worry out of slow performance during critical events or peak traffic seasons. This could be a retail outlet during Black Friday/Cyber Monday, or an insurance company during a natural disaster or macro-economic changes that cause a flurry of activity on Wall Street. We create value on mobile apps, with external development providing an entry point into the data center to consume our APIs. We enable hundreds to thousands of microservices with a self-service platform for developers to publish new services and new versions as needed. All of this is automated, allowing the platform team to set boundaries on what teams can do.


It’s not easy going green – but the Internet of Things can help


Only through cross-system communication are compliance and energy efficiency possible. Much of the IoT’s value lies in its ability to integrate the various, complex components and IT systems that make up any modern building or facility. When building systems can ‘talk’ with each other, the resilience of the infrastructure is strengthened. This provides access to a greater volume of intelligence, leading to more robust compliance and better use of resources. An IoT-connected system enhances an organisation’s pursuit of greater energy efficiency, where the rapid collection of, and reaction to, massive amounts of information is essential. For example, having IoT devices and sensors integrated with a heating, ventilation and air conditioning system means that organisations can collect real-time data on energy consumption and device health. Armed with this information, organisations are empowered to take a fresh look at their current practices, generate business change and create efficiencies that cut costs and emissions. From an energy management perspective, Schneider Electric’s PowerLogic ION9000 is the ideal connected solution.


Open source and open data

First and foremost, our primary mission is “to organize the world’s information and make it universally accessible and useful.” Certainly one obvious way to make information universally accessible and useful is to give it away! Second, making these materials available stimulates scientific research outside of Google. We know we can’t do it all, and we spend a lot of time reading, understanding and often extending work done by others, some of which has been developed using tools and data we have provided to the research community. This mix of competition and cooperation among groups of researchers is what pushes science forward. Third, when we hire new employees, it’s great if they can hit the ground running and already know and use the tools we have developed. Familiarity with our software and data makes engineers productive from their first day at work. There are many more reasons to share research data, but these three alone justify the practice. We aren’t the only internet company to appreciate the power of open data, code, and open research.


New NetCAT CPU side-channel vulnerability exploitable over the network

The culprit is Intel’s Data Direct I/O (DDIO) technology, which gives peripheral devices such as network cards direct access to the processor’s internal cache to achieve better performance, less power consumption, and higher data throughput. Before DDIO, these devices exchanged data with the CPU through RAM, whose latency can be a bottleneck. DDIO was designed with ethernet controllers and fast datacenter networks in mind to allow servers to handle 10-gigabit ethernet (10 GbE) connections and higher. The technology was first introduced in 2011 in the Intel Xeon E5 and Intel Xeon E7 v2 enterprise-level processor families. CPU attacks like Spectre and Meltdown and their many variants have used the CPU cache as a side-channel to infer sensitive data. Researchers from the VUSec group at Vrije Universiteit Amsterdam have now shown that DDIO’s cache access can be exploited in a similar manner. In a new paper released today, the researchers described an attack dubbed NetCAT which abuses DDIO over the network to monitor access times in the CPU cache triggered by other clients connected to the same server over SSH (Secure Shell).
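
NetCAT itself requires RDMA access and careful cache manipulation; as a much simpler illustration of the underlying idea of "monitoring access times over the network," the sketch below just records request round-trip times to a server and summarises their distribution. Distinguishable clusters of latencies, rather than one tight distribution, are what a timing side-channel attacker looks for. The host, port and payload are placeholders, and this is not the attack from the paper.

```python
import socket
import statistics
import time

HOST, PORT = "192.0.2.10", 22   # placeholder target (TEST-NET address)
SAMPLES = 200

def one_round_trip():
    """Time a single connect-and-read against the target service."""
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=2) as s:
        s.recv(64)  # e.g. read the SSH banner
    return time.perf_counter() - start

timings = [one_round_trip() for _ in range(SAMPLES)]
print(f"mean {statistics.mean(timings) * 1e6:.0f} us, "
      f"stdev {statistics.stdev(timings) * 1e6:.0f} us")
# An attacker looks for separable latency clusters in such measurements;
# NetCAT does this at cache-line granularity via DDIO and RDMA.
```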


NHSX emphasises need for ethical patient data access


“NHS and care organisations have an obligation to protect patient data, but in my view, they also have the obligation to make best use of it,” she said. “Collaborations need to benefit everyone involved – patient lives are at stake.” Donnelly also mentioned that “citizen juries” are currently taking place to debate how patient data should be used and what constitutes a fair partnership between the NHS and researchers, charities and industry on uses of patient and operational data from the NHS. “By testing different commercial models against the principles on which our citizens are not prepared to compromise, we hope to reach a consensus on what good looks like and how best we achieve the promised benefits.” In July, a programme was launched by Public Health England and NHSX with the aim to usher in a “new era of evidence-based self-care”, with patients increasingly expected to allow access to their personal data.


Gartner sees blockchain as ‘transformational’ across industries – in 5 to 10 years

Chains of binary data.
"Once it has been combined with the Internet of Things (IoT) and artificial intelligence (AI), blockchain has the potential to change retail business models forever, impacting both data and monetary flows and avoiding centralization of market power," Gartner said. As a result, Gartner believes that blockchain has the potential to transform business models across all industries — but the opportunities demand that enterprises adopt complete blockchain ecosystems. Without tokenization and decentralization, most industries will not see real business value. The journey to create a multi-company blockchain consortium is inherently awkward, Garter said. "Making wholesale changes to decades-old enterprise methodologies is hard to achieve in any situation. However, the transformative nature of blockchain works across multiple levels simultaneously (process, operating model, business strategy and industry structure), and depends on coordinated action across multiple companies."


Rethinking Flink’s APIs for a Unified Data Processing Framework


Flink’s existing API stack consists of the Runtime as the lowest-level abstraction of the system, responsible for deploying jobs and running Tasks on distributed machines. It provides fault tolerance and network interconnection between the different Tasks in the JobGraph. On top of Flink’s Runtime sit two separate APIs, the DataSet and DataStream APIs. The DataSet API has its own DAG (directed acyclic graph) representation for tying together the operators of a job, as well as operator implementations for different types of user-defined functions. The DataStream API has a different DAG representation as well as its own set of operator implementations. The two kinds of operators are implemented on disjoint sets of Tasks, which are handed to the lower-level Runtime for execution. Finally, there is the Table API / SQL, which supports declarative-style programming and comes with its own representation of logical operations and with two different translation paths for converting Table API programs to either the DataSet or DataStream API, depending on the use case and/or the type of sources the program works with.
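
For context, the Table API / SQL layer mentioned above is the declarative entry point: the user states what to compute and the planner decides how to translate it into the operator DAG that the Runtime executes. The sketch below shows that style using a recent PyFlink release (which postdates the 1.9-era split described here); the table definition relies on the built-in datagen connector and is purely illustrative.

    from pyflink.table import EnvironmentSettings, TableEnvironment

    # Declarative layer: the planner, not the user, chooses the physical plan.
    t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

    t_env.execute_sql("""
        CREATE TABLE clicks (
            user_name STRING,
            url STRING
        ) WITH (
            'connector' = 'datagen',
            'number-of-rows' = '20'
        )
    """)

    result = t_env.sql_query(
        "SELECT user_name, COUNT(url) AS cnt FROM clicks GROUP BY user_name"
    )
    result.execute().print()  # translated to a streaming job under the hood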



Quote for the day:


"Courage is the ability to execute tasks and assignments without fear or intimidation." -- Jaachynma N.E. Agu


Daily Tech Digest - September 15, 2019

Gartner: Get ready for more AI in the workplace

automation iot machine learning process ai artificial intelligence by zapp2photo getty
AI will help out with the more mundane tasks managers already do. “Let's think about what managers do every day: they set schedules, assign work, do performance reviews, offer career guidance, help you access training, they do approvals, they cascade information and they enforce directives,” Cain said. “We can have AI doing a lot of that. Your manager won't be replaced by an algorithm, but your manager will be using a lot of AI constructs to help improve and to make more efficient a lot of the routine work that they do. We think that is going to be the combination.” There will also be more intelligence embedded in the workplace as smart office technologies become more common, said Cain. “First of all, we are going to see workplaces have huge amounts of beacon and sensor networks woven throughout the physical workspace,” he said. “This can be used for space optimization, heating and cooling, energy use, supply replenishment [and] contextual data displays as you navigate the workplace.”



Intelligent Field Instruments: The Smart Way to Industry 4.0

A key aspect of realizing a smart factory is the use of field instruments possessing intelligence—so-called smart transmitters. They support factory monitoring and diagnostics as well as networking with additional new field instruments. These transmitters can be distributed over the entire plant, different sensors can be connected, and previously unconnected parts can be monitored. The field instruments form the universal, intelligent basic unit of Industry 4.0. These units will be considered in more detail using the example of an instrument that can be employed with various sensors, such as resistance thermometers, thermocouples, and pressure sensors. Developed from the field instruments commonly in use today, smart transmitters are intelligent field instruments that are either purely loop-fed or supplied with auxiliary energy. Besides its other components, a smart transmitter contains a microprocessor running the software that makes it smart.
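
As a small, hypothetical illustration of the kind of logic such a microprocessor might run, the sketch below converts a Pt100 resistance reading into a temperature using the common linear approximation (R0 = 100 Ω, α ≈ 0.00385 per °C) and attaches a simple range diagnostic. The coefficients and limits are generic assumptions, not any vendor's firmware.

    PT100_R0 = 100.0        # ohms at 0 °C
    PT100_ALPHA = 0.00385   # mean temperature coefficient, 1/°C (IEC 60751)

    def pt100_to_celsius(resistance_ohms: float) -> float:
        """Linear approximation T = (R/R0 - 1) / alpha; real transmitters use
        the full Callendar-Van Dusen curve for better accuracy."""
        return (resistance_ohms / PT100_R0 - 1.0) / PT100_ALPHA

    def check_range(temp_c: float, low=-40.0, high=150.0) -> str:
        """A simple diagnostic a smart transmitter might report with the value."""
        return "out_of_range" if temp_c < low or temp_c > high else "ok"

    reading = pt100_to_celsius(119.4)          # roughly 50.4 °C
    print(round(reading, 1), check_range(reading))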


How AI Is Changing Cyber Security Landscape and Preventing Cyber Attacks

Organizations have to be able to detect a cyber-attack in advance in order to thwart whatever the adversaries are attempting to achieve. Machine learning is the part of artificial intelligence that has proven extremely useful for detecting cyber threats: it analyses data and identifies a threat before it can exploit a vulnerability in your information systems. Machine learning enables computers to use and adapt algorithms based on the data they receive, learning from it and understanding what improvements are required. In a cybersecurity context, this means machine learning allows computers to predict threats and spot anomalies with far more accuracy than any human can. Traditional technology relies too heavily on past data and cannot improvise the way AI can, nor can it keep up with hackers' new mechanisms and tricks. Additionally, the volume of cyber threats organizations have to deal with daily is too much for humans and is best handled by AI.
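
As a minimal sketch of the idea, the example below trains an unsupervised anomaly detector (scikit-learn's IsolationForest) on made-up "normal" login features and asks it to score a suspicious event. The feature choices and numbers are assumptions for illustration only, not a representation of any production security product.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)

    # Hypothetical per-login features: [hour of day, MB transferred, failed attempts]
    normal = np.column_stack([
        rng.normal(13, 3, 500),    # mostly business hours
        rng.normal(20, 8, 500),    # modest data transfer
        rng.poisson(0.2, 500),     # rare failed attempts
    ])
    suspicious = np.array([[3.0, 900.0, 12.0]])  # 3 a.m., huge transfer, many failures

    model = IsolationForest(contamination=0.01, random_state=0).fit(normal)
    print(model.predict(suspicious))  # -1 marks the event as anomalous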


7 key relationships for the transformational CIO

handshake deal vendor management hands business relationship agreement
This last relationship is one of the hardest for the CIO. Board members are often not technologically savvy and are business and/or financially minded. CIOs, on the other hand, typically are not, nor do they usually have exposure to the board of directors. Hence the challenge with this relationship. Even so, it is key for two reasons: a) differentiated company strategies rely heavily on technology, and b) cybersecurity and risk. Like any relationship, these do not happen overnight; they take time to build. Remember that relationships are one-to-one, not one-to-many. The combination of respect and trust becomes the foundation for each relationship. As the CIO, consider going to where the other person is. Do not expect or ask them to come to you. This is not a statement of physical location but rather of current state: consider where the other person is and approach the relationship from their perspective. With time, the work put into developing and nurturing these relationships will pay dividends for a long time. The effort also sets a good example for your teams to follow.


Does Education For Entrepreneurs Miss The Mark?


Particular areas of interest for entrepreneurs looking for this kind of just-in-time learning include identifying their customers and understanding their needs, developing and testing prototypes, creating value propositions, defining go-to-market strategies, determining the right profit model and learning from other entrepreneurs how they addressed these issues. In the two to four years it typically takes to launch a venture, it’s likely that founders will struggle with all of these challenges multiple times. It is not unusual for an entrepreneur to revisit these issues every two to three months and seek guidance from other entrepreneurs. This is why I joined forces with my Stanford GSB colleagues Jim Lattin and Baba Shiv to develop Stanford’s latest offering, Embark, a subscription-based offering that combines frameworks and insights from our unique position in Silicon Valley with the tactical steps necessary to launch or validate a sustainable business. The platform provides video advice from dozens of entrepreneurs on how to use these frameworks and is designed to support thousands of members.


What is incident response management and why do you need it?

The longer it takes an organisation to detect a vulnerability, the more likely it is that it will lead to a serious security incident. For example, perhaps you have an unpatched system that’s waiting to be exploited by a cyber criminal, or your anti-malware software isn’t up to scratch and is letting infected attachments pass into employees’ inboxes. Criminals sometimes exploit vulnerabilities as soon as they discover them, causing problems that organisations must react to immediately. However, they’re just as likely to exploit them surreptitiously, with the organisation only discovering the breach weeks or months later – often after being made aware by a third party. It takes 175 days on average to identify a breach, giving criminals plenty of time to access sensitive information and launch further attacks. As Ponemon Institute’s 2019 Cost of a Data Breach Study found, the damages associated with undetected security incidents can quickly add up, with the average cost of recovery being £3.17 million.


How Artificial Intelligence Will Transform Marketing in 2020

Leveraging AI to empower marketing helps foster relevant and compelling interactions with customers, boost ROI and positively affect revenue figures. AI-driven marketing can work through far more data, far faster, than any marketing team run by humans. Finding the hidden insights and critical data points that affect consumer behavior, and recognizing purchaser trends, gives any marketing team valuable touchpoints from which to develop creative content and shape strategy. Though a lot has been said about AI and the future of marketing, it is important to understand why and how organizations are implementing AI solutions for their marketing functions. Reportedly, brands that have recently adopted AI in their marketing strategy predict, on average, a 37 percent reduction in costs and a 39 percent increase in revenue by the end of 2020. AI gives traditional marketing tools that deliver personalized, relevant content at the right time, improving conversion rates for any business.


What Makes A Data Visualisation Elegant?


Perhaps a more sophisticated and flexible modern approach than the somewhat blunt notion of minimalism is that of “refinement”. What’s important is editing and, at times, being courageous or restrained about what you should not include or attempt to do. It’s about finding that moment — perhaps only through experience — where something just ‘feels right’. That leads me to one of my favourite German words, Fingerspitzengefühl, which means having an intuitive flair or instinct — a ‘fingertip feeling’ where you just know. Moritz Stefaner mentions another key German word for this discussion, Prägnanz, meaning “concise and on point, but also memorable and assertive… so, not minimal for minimalism’s sake, but maximally effective with minimal effort”. Refinement is about being decisive, possessing clarity of vision and caring for the little details. This conveys to your viewer that your work has been thought through and thought about.


Anomaly detection methods unleash microservices performance


Traditional single or simple n-tier applications require platform and performance monitoring, but microservices add several logical layers to the equation. Along with more tiers come the y and z axes of the scale cube, including Kubernetes or another cluster manager for containers; a service layer and associated tools, such as the fabric and API gateways; and data and service partitioning across multiple clients. To detect and analyze performance problems, begin with the basics of problem identification and cause analysis. The techniques described here are relevant to microservices deployments. Each aims to identify and fix the internal source of application problems based on observable behavior. A symptom-manifestation-cause approach involves working back from external signs of poor performance to internal manifestations of a problem to then investigate likely root causes.
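
As one small, hypothetical example of working back from a symptom, the sketch below computes each service's current 95th-percentile latency from a request log and reports the services that exceed an assumed historical baseline by a tolerance factor. Service names, numbers and thresholds are illustrative, not tied to any particular monitoring stack.

    from collections import defaultdict
    from statistics import quantiles

    # Hypothetical request log: (service, latency_ms), including a latency spike.
    requests = [("orders", 40 + i % 7) for i in range(200)]
    requests += [("payments", 55 + i % 9) for i in range(200)]
    requests += [("payments", 400 + i) for i in range(20)]   # the external symptom

    baseline_p95 = {"orders": 50.0, "payments": 70.0}        # assumed historical values

    def services_breaching_p95(requests, baseline, tolerance=1.5):
        """Work back from slow requests to the services whose current p95
        latency exceeds their baseline by the given tolerance factor."""
        by_service = defaultdict(list)
        for service, latency in requests:
            by_service[service].append(latency)
        breaches = {}
        for service, latencies in by_service.items():
            p95 = quantiles(latencies, n=20)[-1]   # 95th percentile
            if p95 > tolerance * baseline.get(service, float("inf")):
                breaches[service] = round(p95, 1)
        return breaches

    print(services_breaching_p95(requests, baseline_p95))   # expect only 'payments'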



Why the founder of Apache is all-in on blockchain

Data container block with hexagons
As a result, "blockchain technology seemed urgent to get involved in [and] that lined up with these idealistic and pragmatic impulses that I've had—and I think other people in open source have had," he adds. Specifically, it was the emergence of a set of use cases beyond programmable money that drew in Behlendorf. "I think the one that pulled me in was land titles and emerging markets," he recalls. It wasn't just about having a distributed database. It was about having a distributed ledger that "actually supported consensus, one that actually had the network enforcing rules about valid transactions versus invalid transactions. One that was programmable, with smart contracts on top. This started to make sense to me, and [it] was something that was appealing to me in a way that financial instruments and proof-of-work was not." Behlendorf makes the point that for blockchain technology to have a purpose, the network has to be decentralized. For example, you probably want "nodes that are being run by different technology partners or … nodes being run by end-user organizations themselves because otherwise, why not just use a central database run by a single vendor or a single technology partner?" he argues.



Quote for the day:


"If you can't handle others' disapproval, then leadership isn't for you." -- Miles Anthony Smith