Daily Tech Digest - April 30, 2019

Microsoft tells IT admins to nix 'obsolete' password reset practice

Two years ago, the National Institute of Standards and Technology (NIST), an arm of the U.S. Department of Commerce, made similar arguments as it downgraded regular password replacement. "Verifiers SHOULD NOT require memorized secrets to be changed arbitrarily (e.g., periodically)," NIST said in a FAQ that accompanied the June 2017 version of SP 800-63, "Digital Identity Guidelines," using the term "memorized secrets" in place of "passwords." The institute explained why mandated password changes were a bad idea this way: "Users tend to choose weaker memorized secrets when they know that they will have to change them in the near future. When those changes do occur, they often select a secret that is similar to their old memorized secret by applying a set of common transformations such as increasing a number in the password." Both NIST and Microsoft urged organizations to require password resets when there is evidence that passwords have been stolen or otherwise compromised. And if they haven't been touched? "If a password is never stolen, there's no need to expire it," Microsoft's Margosis said.
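
To make NIST's point concrete, here is a minimal sketch of the kind of "common transformation" check a verifier could run. It is an illustration only, not NIST's or Microsoft's actual logic, and the regex-based stemming is an assumption:

```python
import re

def is_trivial_tweak(old: str, new: str) -> bool:
    """Return True when `new` looks like a common transformation of `old`
    (a case change, or the same stem with a different number bolted on)."""
    if old.lower() == new.lower():
        return True  # only the capitalization changed

    def stem(s: str) -> str:
        # Drop digits and punctuation, keep the memorable "word" part.
        return re.sub(r"[\d\W_]+", "", s).lower()

    return stem(old) != "" and stem(old) == stem(new)

print(is_trivial_tweak("Spring2019!", "Spring2020!"))            # True
print(is_trivial_tweak("Spring2019!", "correct-horse-battery"))  # False
```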


4 tips for agile testing in a waterfall world


Begin with the understanding that agile is not about Scrum or Kanban processes in and of themselves; it is a set of values. Even in a non-agile environment, you can apply agile values to daily work. Beyond that, when working in an organization that is undergoing an agile transformation, you as an agile practitioner can introduce specific best practices to help the agile transformation go more smoothly. Finally, when you're working in a truly waterfall environment, adapt your process with an understanding that groups will be resistant to Scrum processes for the sake of Scrum. Instead, bring the advantages of agile to the team by making agile values relevant to the team. Think about the principles of agile and how to achieve them within current organizational processes, or how you might tweak current processes to meet those principles. Here are four tips garnered from what I've found to be successful when adapting agile principles to waterfall environments.


Venerable Cisco Catalyst 6000 switches ousted by new Catalyst 9600

The 9600 series runs Cisco’s IOS XE software, which now runs across all Catalyst 9000 family members. The software brings with it support for other key products such as Cisco’s DNA Center, which controls automation capabilities, assurance settings, fabric provisioning and policy-based segmentation for enterprise networks. What that means is that with one user interface, DNA Center, customers can automate, set policy, provide security and gain assurance across the entire wired and wireless network fabric, Gupta said. “The 9600 is a big deal for Cisco and customers as it brings together the campus core and lets users establish standard access and usage policies across their wired and wireless environments,” said Brandon Butler, a senior research analyst with IDC. “It was important that Cisco add a powerful switch to handle the increasing amounts of traffic wireless and cloud applications are bringing to the network.” ... The software also supports hot patching, which provides fixes for critical bugs and security vulnerabilities between regular maintenance releases. This lets customers apply patches without having to wait for the next maintenance release, Cisco says.


Everything done in enterprise information management should drive ROI

The goal here will always be to have the minimal amount of "stuff" doing the maximum amount of "value added things" at the "least cost." This has been a compelling argument for the big data and AI crowd in recent years, but the expense of these solutions in infrastructure and specialized skills, along with poor implementations, has in many ways tainted the message of how to achieve return on investment in the EIM and data insights marketplace... the perception in the business is that sorting data is expensive and needs huge justification. This creates a very challenging environment for enterprise information management innovators committed to the less-is-more paradigm of business value... so such innovations need to get better at making their case stand out to business leaders... or the money munching will continue unabated and businesses will have no choice but to spend tens of millions of dollars on questionable results.


Seven use cases of IoT for sustainability


A key piece of a smart grid infrastructure, smart meters gather real-time energy data, as well as water and gas data. Rather than waiting for monthly manual readings, businesses and homes with smart meters get real-time data that enables them to make smarter decisions about their energy, water and gas consumption and to modify habits to save money and reduce their carbon footprint. Utility companies also benefit, as systems can be remotely monitored, allowing for better response to problems and efficient maintenance. ... In agricultural scenarios, be it on a farm or an orchard or a building's or resident's lawn, smart irrigation systems monitor soil saturation to prevent over- and under-watering. Water sensors are also instrumental in monitoring water quality, a critical task after floods, hurricanes and other natural disasters to ensure wastewater and chemicals have not tainted potable water supplies. Likewise, IoT sensors embedded into water management infrastructures can monitor local weather forecasts and control drainage to minimize flooding, stormwater runoff or property damage.
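
A minimal sketch of the control logic such a smart irrigation node might run; the sensor and valve functions are hypothetical placeholders for real device drivers, and the thresholds are invented for illustration:

```python
TARGET_RANGE = (0.25, 0.40)  # volumetric soil moisture: avoid under- and over-watering

def read_soil_moisture() -> float:
    """Placeholder: would sample the soil-moisture probe."""
    return 0.22

def set_valve(open_valve: bool) -> None:
    """Placeholder: would actuate the irrigation solenoid."""
    print("valve", "open" if open_valve else "closed")

def control_step(rain_forecast_mm: float) -> None:
    low, high = TARGET_RANGE
    moisture = read_soil_moisture()
    if moisture > high or rain_forecast_mm > 5.0:
        set_valve(False)  # saturated, or rain will do the job: prevent runoff
    elif moisture < low:
        set_valve(True)   # too dry: irrigate
    # in between: leave the valve alone (hysteresis avoids rapid cycling)

control_step(rain_forecast_mm=0.0)  # -> valve open
```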


On The Future of Tesla and Full Self Driving Cars

The key to moving fast for carmakers is making complex trade-offs between backward compatibility and future optionality. And Tesla is the only one that has already demonstrated it can do that masterfully. Tesla is amassing massive amounts of learning today by training on real-world data in shadow mode. It’s at a scale that makes simulation data look obviously weak in comparison. Do you want to ride in a car with no steering wheel that has been trained in a simulated environment, or one that learned in the real world? Let’s be honest: It’s hard to tell whether Tesla will emerge the winner in this market. That’s a complex calculus, and the industry they play in today is a massively difficult one to succeed in. There are a few ways of looking at this. One is: how can they possibly succeed? But another is: how can anyone else? Others don’t have cars on the road and are relying on some future technology that may or may not see the light of day (solid state LiDAR), and will most certainly be obsolete by the time it does.


Intel's Interconnected Future: Combining Chiplets, EMIB, and Foveros


Intel also uses full interposers in its FPGA products, as an easier and quicker way to connect its large FPGA dies to high bandwidth memory. Intel has stated that while large interposers are a catch-all solution, the company believes that EMIB designs are a lot cheaper than large interposers, and provide better signal integrity to allow for higher bandwidth. In discussions with Intel, it was stated that large interposers likely work best for powerful chips that could take advantage of active networking; HBM, however, is overkill on an interposer and best used via EMIB. Akin to an interposer-like technology, Foveros is a silicon stacking technique that allows different chips to be connected by TSVs (through-silicon vias, a via being a vertical chip-to-chip connection), such that Intel can manufacture the IO, the cores, and the onboard LLC/DRAM as separate dies and connect them together. In this instance, Intel considers the IO die, the die at the bottom of the stack, as a sort of ‘active interposer’ that can deal with routing data between the dies on top.


Huawei's Role in 5G Networks: A Matter of Trust

Security experts are questioning whether restricting high-risk vendors to nonsensitive parts of the network might be a viable security strategy - and whether one nation's choices might have security repercussions for allies. The U.S. has been spearheading a push to ban Chinese telecommunications equipment manufacturing giants, including Huawei, from allies' 5G networks entirely, with one National Security Agency official saying it doesn't want to put a "loaded gun" in Beijing's hands. So far, Australia, New Zealand and Japan have agreed with the U.S. position and barred Chinese telecommunications gear from at least part of their 5G network rollouts. ... On Tuesday, news leaked that the U.K.'s National Security Council voted to allow Huawei to supply equipment for some "noncore" parts of the U.K.'s 5G network, such as antennas, although the government wasn't yet prepared to publicly make that declaration.


How to use Google Drive for collaboration

Many people think of Google Drive as a cloud storage and sync service, and it is that — but it also encompasses a suite of online office apps that are comparable with Microsoft Office. Google Docs (the word processor), Google Sheets (the spreadsheet app) and Google Slides (the presentation app) can import, export, or natively edit Microsoft Office files, and you can use them to work together with colleagues on a document, spreadsheet or presentation, in real time if you wish. With a Google Account, individuals get free use of Docs, Sheets and Slides and up to 15GB of free Google Drive storage. Those who need more storage can upgrade to a Google One plan starting at $2 per month. Businesses can opt for Drive Enterprise, which also includes Docs, Sheets and Slides, as well as business-friendly features including shared drives, enterprise-grade security, and integration with third-party tools like Slack and Salesforce. Drive Enterprise costs $8 per active user per month, plus $0.04 per GB used.
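
Using the figures quoted above, a quick back-of-the-envelope check of what a Drive Enterprise deployment would cost per month:

```python
def drive_enterprise_monthly_cost(active_users: int, gb_used: float) -> float:
    """Pricing as quoted in the article: $8 per active user plus $0.04 per GB."""
    return active_users * 8.00 + gb_used * 0.04

# 50 active users storing 2 TB collectively: 50*$8 + 2000*$0.04 = $480/month
print(drive_enterprise_monthly_cost(50, 2000.0))
```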


Robots extend the scope of IoT applications

Pepper, a humanoid robot by Softbank Robotics
Robots, like humans, improve their motor skills with practice. Robots need a test bed where their instructions can be tested and debugged. Simulated test beds are better than physical ones, as it is impossible to create a physical representation of every environment where a robot might operate. Isaac Sim is a virtual robotics laboratory and a high-fidelity 3D world simulator. Developers train and test their robots in a detailed, realistic simulation, reducing costs and development time. Robots improve as their decision models are revised to cover new situations that they encounter. Robots operate based on the models they were programmed with, but they also send details of unexpected situations back to the cloud for review. This enables developers to refine the robot’s decision-making model to deal with the new conditions. The amount of feedback increases as more robots are deployed, increasing the speed at which all the robots collectively get “smarter.” NVIDIA Jetson Nano-based robots can report new conditions they encounter to AWS IoT Greengrass, which lets them act locally on the data they generate while still using the cloud for management, analytics, and storage.
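
The edge/cloud split described here (act locally, report the unexpected back for model refinement) fits in a few lines. This is a generic sketch; the `local_model` and `report_to_cloud` callables are hypothetical stand-ins, not the Isaac or Greengrass APIs:

```python
CONFIDENCE_FLOOR = 0.6  # below this, treat the situation as "unexpected"

def handle_observation(local_model, observation, report_to_cloud):
    label, confidence = local_model(observation)
    if confidence < CONFIDENCE_FLOOR:
        # Unexpected situation: act conservatively now, and queue the raw
        # observation for developers to fold into the next model revision.
        report_to_cloud({"observation": observation, "label": label,
                         "confidence": confidence})
        return "stop_and_wait"
    return label  # act locally on the data the robot generates

# Toy usage: a "model" that is unsure about everything it sees.
unsure_model = lambda obs: ("obstacle", 0.4)
print(handle_observation(unsure_model, {"lidar": [0.8, 1.2]}, report_to_cloud=print))
```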



Quote for the day:


"Being responsible sometimes means pissing people off." -- Colin Powell


Daily Tech Digest - April 29, 2019

10 Ways Technology is Transforming Warehouses


Sustainability is a hot-button topic these days, and this focus is changing nearly every industry in the world. Technology can help companies reduce energy consumption, cut down on product waste and lessen emissions while aligning with federal and local rules and regulations. Replacing traditional fluorescent lighting with LED alternatives can reduce power usage while saving the facility money. Smart warehouse designs rely on monitors to regulate power usage, becoming more energy efficient over time by preventing phantom loads from drawing energy when equipment is not in use. Technology is helping warehouses become more sustainable, both in house and in their dealings with other facilities. ... Handheld devices, such as barcode scanners, have always been a part of the logistics and distribution industry, but recent advances have made these devices more efficient and useful than ever before. Warehouses that still rely on manual counts and physical paperwork should consider transitioning to digital inventories and handheld devices equipped with RFID scanners and GPS to increase efficiency and reduce theft and inventory loss.


How to write a good data governance policy

I find that getting principles agreed is a lot easier than asking a group of people what they want included in a data governance policy. Plus, the conversation around the principles will give you a really good idea about what they want covered in their policy. Once you've drafted and circulated those principles for feedback, you should be able to make amendments and agree a list of principles. With the principles agreed, drafting your policy in accordance with them is fairly straightforward. However, don't make the mistake of believing that once it is drafted, everyone will immediately approve it because they already agreed the principles. Seeing the detail in black and white often gives rise to more questions, suggestions or changes from your key stakeholders. At this point, I really have to emphasize that for data governance to be successful, you need the senior stakeholders engaged. So the answer is not to tell them they're wrong, or to railroad them into accepting what you want to have in the data governance policy.


Small business cybersecurity: The case for MSSPs

Clearly, the industry favors using AI and automated tools, which requires qualified personnel—something small businesses often lack. The good news is most MSSPs enhance their managed approach by using automated-security technology. This likely gives MSSPs the edge with small-business owners, according to Canner. "By hiring a managed security provider, your enterprise could save money in the long term. Not only will you save on the costs of finding, hiring, and training new cybersecurity personnel, your enterprise can also reduce the number of cybersecurity members on staff." Venkatesh Sundar, founder and CMO at the MSSP Indusface, suggests in this Trak.in article that small businesses with web applications (most nowadays) may especially benefit from MSSPs that employ Managed Web Application Firewalls (MWAFs) as the first line of defense against malicious actors. "A MWAF ... supports custom and complex rules based on the needs of your business," writes Sundar. "An intelligent, managed WAF gives decision-making power to you or the security analyst to either block, flag or challenge requests."
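
A toy illustration of the "block, flag or challenge" decision Sundar describes, with custom rules evaluated per request. This is a sketch of the idea only, not Indusface's actual rule engine, and both rules are invented:

```python
# Each rule inspects a request and may vote to block, challenge, or flag it.
def rate_rule(req):
    if req["requests_last_minute"] > 300:
        return "challenge"  # likely a bot: present a CAPTCHA

def payload_rule(req):
    if "' OR 1=1" in req["body"]:
        return "block"      # crude SQL-injection signature

RULES = [rate_rule, payload_rule]

def evaluate(req) -> str:
    verdicts = {rule(req) for rule in RULES} - {None}
    for action in ("block", "challenge", "flag"):  # most severe verdict wins
        if action in verdicts:
            return action
    return "allow"

print(evaluate({"requests_last_minute": 12, "body": "name=' OR 1=1 --"}))  # block
```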


Froid and the relational database query quandary with Dr. Karthik Ramachandra

If you look at relational databases today, the primary way to interact with the database is through this language called SQL, or structured query language, which falls under this declarative paradigm of programming, which basically says the user needs to tell the system what they need in this declarative high-level language, and the system figures out an efficient way to do what the user has asked. So that’s sort of one main paradigm, or the primary way we interact with databases today. That comes with the advantage that, you know, the users can stay at a higher level of abstraction, not having to go to the detailed implementation of how things are done. And it also allows the system to optimize and come up with efficient algorithms to solve the query or the question that the user is trying to ask. That is one paradigm, and on the other side, we have this imperative program style which is a slightly lower level of abstraction in the sense you are basically telling the system how to go about doing what you want it to do. And, as a result, you’re sort of binding the system to implement it in the way you are telling it to do.
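
The contrast can be made concrete with a toy example: the same question asked declaratively (the engine picks the plan) and imperatively (we dictate the steps). A minimal sketch using Python's built-in sqlite3 module:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("ada", 10.0), ("bob", 4.0), ("ada", 6.0)])

# Declarative: state WHAT we want; the system figures out an efficient way.
total = con.execute(
    "SELECT SUM(amount) FROM orders WHERE customer = 'ada'").fetchone()[0]

# Imperative: spell out HOW, step by step, binding the system to our plan.
total_imp = 0.0
for customer, amount in con.execute("SELECT customer, amount FROM orders"):
    if customer == "ada":
        total_imp += amount

print(total, total_imp)  # 16.0 16.0
```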


Forget about artificial intelligence, extended intelligence is the future


While one of the key drivers of science is to elegantly explain the complex and increase our ability to understand, we must also remember what Albert Einstein said: “Everything should be made as simple as possible, but no simpler.” We need to embrace the unknowability – the irreducibility – of the real world that artists, biologists and those who work in the messy world of liberal arts and humanities are familiar and comfortable with. ... In order to effectively respond to the significant scientific challenges of our times, I believe we must respect the many interconnected, complex, self-adaptive systems across scales and dimensions that cannot be fully known by or separated from observer and designer. In other words, we are all participants in multiple evolutionary systems with different fitness landscapes at different scales, from our microbes to our individual identities to society and our species. Individuals themselves are systems composed of systems of systems, such as the cells in our bodies that behave more like system-level designers than we do.


These are the industries most likely to be taken over by robots

A humanoid robot works side by side with employees on the assembly line at a factory of Glory Ltd., a manufacturer of automatic change dispensers, in Kazo, north of Tokyo, Japan. (Reuters/Issei Kato)
Workers in industry sectors like food service and manufacturing spend much of their time doing physical tasks in a predictable environment, and so are susceptible to automation. Meanwhile, industries like education and health care involve much more interpersonal work and application of deep expertise, competencies which current robots and software lack. McKinsey pointed out that their analysis focused on what tasks could potentially be automated using current technology, which doesn't necessarily mean that these jobs actually will end up being more heavily done by robots and software. Other economic and social concerns, like the cost of labor relative to new investment in advanced machines and the public's willingness to have robots do things like serve them food, are likely to be big factors in whether or not various jobs and tasks actually do become automated, according to the report.


The growing demand for managed detection and response (MDR)

According to ESG research, 82% of cybersecurity professionals agree that improving threat detection and response (i.e. mean-time to detect (MTTD), mean-time to respond (MTTR), etc.) is a high priority at their organization. Furthermore, 77% of cybersecurity professionals surveyed say business managers are pressuring the cybersecurity team to improve threat detection and response. So, what’s the problem? Threat detection and response ain’t easy. In fact, 76% of those surveyed claim that threat detection and response is either much more difficult or somewhat more difficult than it was two years ago. Why? Cybersecurity professionals point to issues such as an upsurge in the volume and sophistication of threats, an increasing cybersecurity workload, and a growing attack surface. Oh, and let’s not forget the impact of the cybersecurity skills shortage. Many firms lack the right staff and skills to make a significant dent in this area. Rather than deploying yet another point tool or muddling through, many CISOs are turning to third-party service providers for help, making managed detection and response (MDR) one of the fastest-growing segments in the cybersecurity market.
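
The two metrics named above are straightforward to compute once incident timestamps are recorded; a minimal sketch with made-up incidents:

```python
from datetime import datetime
from statistics import mean

# Each incident: when it began, when it was detected, when it was resolved.
incidents = [
    (datetime(2019, 4, 1, 9, 0), datetime(2019, 4, 1, 13, 0), datetime(2019, 4, 2, 9, 0)),
    (datetime(2019, 4, 8, 2, 0), datetime(2019, 4, 8, 2, 30), datetime(2019, 4, 8, 6, 0)),
]

mttd = mean((det - start).total_seconds() / 3600 for start, det, _ in incidents)
mttr = mean((res - det).total_seconds() / 3600 for _, det, res in incidents)

print(f"MTTD: {mttd:.1f}h, MTTR: {mttr:.1f}h")  # MTTD: 2.2h, MTTR: 11.8h
```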


How cloud services can empower the future of work

More often than not, businesses are stuck with legacy applications and tools. As a result, employees rely on email and word processing for business communication. They depend on shared network drives and content management systems to store, organize, secure and access files. Users connect first to one application and then another to schedule appointments, develop plans, allocate resources, track results, make payments, update images and accomplish a whole host of business activities that are integral parts of their everyday jobs. Today's employees work in an application-centric -- not a task-centric -- environment.  Cloud-powered connectivity promises to transform the future of work. Modest additions to existing personal productivity tools and enterprise applications can go a long way toward modernizing the workplace. Employees and workgroups can focus on their immediate tasks at hand, save time and enhance productivity.


Why wearables, health records and clinical trials need a blockchain injection

"Patients can become owners of data and, with their consent, share data with practitioners and allow them to sell anonymous data to buyers," said Mehta, who took part in the blockchain-and-healthcare panel. By enabling patients to add their own details around lifestyle – what they eat, how much they exercise and sleep, a personal health record would offer physicians greater personal insights for more targeted clinical decision making. In order to securely record, share and crunch vast amounts of sensitive data coming from external sources such as wearable medical devices and fitness trackers, a standardized database with artificial intelligence capabilities is needed. ... Blockchain uses hashing, the creation of a unique digital signature for each encrypted block of data added to an electronic distributed ledger. The hashes map back to encrypted patient data as it's added sequentially to a blockchain ledger – and because it's immutable, it creates an audit trail for government oversight. Smart contracts – self-executing business automation apps – can also be used atop blockchain to automatically ingest and process new data.


Millennials, changing meeting priorities drive huddle room trends


Huddle rooms equipped with conferencing technology enable small meetings to happen without taking up an entire boardroom. Nearly 65% of people believe that at least half of huddle rooms within an organization need video conferencing tools, according to a Cisco-sponsored report from market research firm Dimensional Research. "We've been talking about getting video conferencing out of boardrooms and out to the masses for a while now," said David Maldow, founder of market research firm Let's Do Video. "Huddle rooms make video technology accessible to everyone." Meeting culture has become far less formal and scheduled. Teams now are focusing on increased productivity, requiring spaces that they can access quickly for impromptu meetings or last-minute brainstorming sessions. Unlike boardrooms, which are typically designed for larger-scale planned meetings, the trend for huddle rooms and ad hoc spaces is to design them to fit into smaller team workflows. According to the Dimensional Research study, 55% of respondents said that meetings held in huddle rooms helped increase productivity.



Quote for the day:


"Uncertainty is not an indication of poor leadership; it underscores the need for leadership." -- Andy Stanley


Daily Tech Digest - April 28, 2019

It’s all about people: Dispelling the five myths of process automation

In a memorable scene from the movie “The Founder” about the origin of McDonald’s, the McDonald brothers plot the layout of their restaurant in a life-sized mockup drawn in chalk on a parking lot. This example of process optimization was certainly “lean,” but it involved no software whatsoever. Today, in contrast, optimizing business processes almost always means automating them – at least in part. And when we say automation, we mean with software. Just what software, however, is an open question, as today’s frothy software marketplace has spawned several contenders. From the business process automation or BPA of the last decade to today’s robotic process automation or RPA to the latest entrant, digital process automation or DPA, information technology decision makers have a plethora of options to choose from. Be warned: This is a clear-cut case of caveat emptor. With the help of the big IT analyst firms, the providers in these overlapping categories have stirred up massive confusion. Let’s clear up the biggest misconceptions.


Confronting the risks of artificial intelligence
Because AI is a relatively new force in business, few leaders have had the opportunity to hone their intuition about the full scope of societal, organizational, and individual risks, or to develop a working knowledge of their associated drivers, which range from the data fed into AI systems to the operation of algorithmic models and the interactions between humans and machines. As a result, executives often overlook potential perils or overestimate an organization’s risk-mitigation capabilities. It’s also common for leaders to lump in AI risks with others owned by specialists in the IT and analytics organizations. Leaders hoping to avoid, or at least mitigate, unintended consequences need both to build their pattern-recognition skills with respect to AI risks and to engage the entire organization so that it is ready to embrace the power and the responsibility associated with AI. The level of effort required to identify and control for all key risks dramatically exceeds prevailing norms in most organizations. Making real progress demands a multidisciplinary approach involving leaders in the C-suite and across the company; experts in areas ranging from legal and risk to IT, security, and analytics; and managers who can ensure vigilance at the front lines.



In a joint session at the NCSC's CYBERUK 19 conference in Glasgow, the NCSC and the ICO outlined how the two organisations work together and create a better understanding for cyberattack victims who need to contact them, with the aim of making it easier to deal with the right one at the right time. "It's important organisations understand what to expect if they suffer a cybersecurity breach. The NCSC has an important role to play in keeping UK organisations safe online, while our role reflects the impact cyber incidents have on the people whose personal data is lost, stolen or compromised," said ICO deputy commissioner for operations, James Dipple-Johnstone. "Organisations need to be clear on the legal requirements of when to report these breaches to the ICO, and the potential implications, including sizeable fines, if these requirements aren't followed." In the event of a cyberattack, the NCSC will engage directly with victims to understand the nature of the incident and provide free and confidential advice to help mitigate its impact in the immediate aftermath.


5 Ways AI Is Already Being Used to Transform Business Operations


As machine learning and AI tools are allowed to digest bigger troves of data, an endless swarm of insights is being made available. Many of them can be used to improve existing operations, but there’s more, too. Some of that information can be used to identify and explore all-new opportunities. For example, data related to a particular product might reveal how customers are using the item, particularly in ways that were not originally intended. Extracted insights might also reveal desirable features and functions, which price points are most desirable, or even which additional products and services can be delivered to augment the experience. It’s about a whole lot more than just conventional business operations, however. AI technologies are being deployed in new ways, too. iCertis, for example, is leveraging AI to build smarter static contracts. More specifically, its AI solutions are designed to overcome enterprise contract management challenges through the power of enhanced data capabilities.


Statistically speaking, here’s how your SaaS company can succeed


The good news is that although there are higher expectations, the quality of customer engagement becomes stronger because you have a wealth of data to personalize experiences and customer interactions. And because there are more interaction points with your customers, you can make more informed decisions on how to drive loyalty and where to find more of those loyal customers to drive overall lifetime value. The upfront spend that subscription businesses invest to acquire customers is paid back over time. In order for a subscription business to sustainably grow, it’s essential to increase that lifetime value. And over the lifetime of the customer, you’re paying back that customer acquisition cost until you reach ‘economic loyalty,’ earning back a multiple return on the cost of acquiring and serving your customers. “That’s the ability to optimize how you monetize through pricing and plan structures,” Clark explains. “Price optimization is one of the few ways where you can increase revenue, increasing lifetime value from your subscribers without also correspondingly increasing your cost of acquisition or your cost of goods and services.”
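
The payback arithmetic Clark describes can be sketched in a few lines, with all the figures below invented for illustration:

```python
def months_to_payback(cac: float, monthly_price: float, gross_margin: float) -> float:
    """Months of gross margin needed to earn back the acquisition cost."""
    return cac / (monthly_price * gross_margin)

def ltv_to_cac(monthly_price: float, gross_margin: float,
               lifetime_months: float, cac: float) -> float:
    ltv = monthly_price * gross_margin * lifetime_months
    return ltv / cac  # 'economic loyalty': this multiple sits well above 1

print(months_to_payback(cac=300, monthly_price=50, gross_margin=0.8))  # 7.5 months
print(ltv_to_cac(monthly_price=50, gross_margin=0.8,
                 lifetime_months=36, cac=300))                          # 4.8x
```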


Uncovering the hidden talent on your staff


Tech companies have led the way in fostering creativity by using an agile approach. Teams break projects down into sprints, focusing intensely on solving one specific problem at a time in a short, set period. This can lead to quick, outside-the-box thinking and risk-taking, and gives staff the ability to pivot to new ideas as short-term findings become clear. And, as with hackathons, these methods can be adapted to work in any industry. Experimenting with an agile approach — and not just in your IT department — can reveal which individuals on your staff have the right mind-set for innovation and are most likely to be able to learn and adapt with your company’s needs.... Another way to find the innovators hiding in plain sight at your company is to create teams tasked specifically with coming up with new ideas. This responsibility isn’t in most employees’ job descriptions, so they might not be prioritizing it. But you can create formal innovation programs, or even tie a pilot to an existing project, to give employees the time and space they need to show what they’re capable of.


Bringing cloud services to the edge

Businessman monitoring through telescope stands on arrow above clouds © alphaspirit - shutterstock
What cloud users haven’t been able to overcome are the physical limitations imposed by centralized infrastructure, particularly the delays imposed by transporting data hundreds or thousands of miles between an application user and the application infrastructure provider. While these seemed minute when people were already accustomed to waiting seconds on database transactions in a Web form, they become significant in an era of 5G wireless connectivity delivering streaming media and interactive multiplayer games. The speed of light quickens for no one, not even Einstein, meaning that data cannot travel 1,000 miles in less than about 5ms; add in the latency of the network equipment along the way, and the round-trip time even to the nearest regional cloud location becomes noticeable, if not intolerable, for many applications. The curse of latency is a primary argument for edge computing, but it’s not the only one, as companies have long known the benefits of offloading popular content and workloads to geographically distributed locations closer to users.
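
The 5ms figure follows directly from the speed of light; a quick check, adding fiber's refractive-index penalty and the round trip:

```python
C_VACUUM_MILES_PER_MS = 186.282  # speed of light: ~186,282 miles per second
FIBER_FACTOR = 0.67              # light in glass travels at roughly 2/3 c

def one_way_ms(miles: float, in_fiber: bool = True) -> float:
    speed = C_VACUUM_MILES_PER_MS * (FIBER_FACTOR if in_fiber else 1.0)
    return miles / speed

print(f"{one_way_ms(1000, in_fiber=False):.1f} ms one-way at c")  # ~5.4 ms
print(f"{2 * one_way_ms(1000):.1f} ms round trip in fiber")       # ~16.0 ms,
# before any switching or queuing delay in the equipment along the way
```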



CIOs can make the most of artificial intelligence by applying it to strategic digital business objectives. Artificial intelligence (AI) can augment or automate decisions and tasks today performed by humans, making it indispensable for digital business transformation. With AI, organizations can reduce labor costs, generate new business models, and improve processes or customer service. However, most AI technologies remain immature. “To overcome this hurdle, CIOs must ensure that applications intended to serve a strategic business purpose, such as increasing revenue or scaling services, are designed for strategic plans,” says Jorge Lopez, Distinguished Vice President Analyst, Gartner. Lopez outlines six design principles that will help CIOs and organizations evaluate all proposed AI applications with strategic intent — that is, applications intended to help achieve business results, not just operational improvements. Applications do not have to follow all six principles; however, designs that show two or fewer principles should be reconsidered.


Three out of five IT workers share sensitive information by email


The report shows that the current digital landscape within the companies surveyed is not meeting the needs of employees. And compared to last year, some problems are even getting worse. Over a quarter (28 percent) use instant messaging to share sensitive or private information, creating a major security risk for enterprises. Two thirds of tech workers (66 percent) use non-approved communication apps because they are less likely to be monitored or tracked. Secure methods that track user access and support the use of watermarks are rarely used. Unsanctioned apps and software, referred to as shadow IT, can lead to information being shared on unsecured systems, and cause communication to be fractured and siloed in the enterprise. These communication challenges go beyond security risks — the report also highlighted restrictions in knowledge-sharing. Almost three-quarters (72 percent) of IT professionals said that they work remotely at least once per week. Over two thirds (68 percent) say remote work presents challenges that could be solved by better technology solutions and a digitally centric work culture.


Continuous architecture combats dictatorial EA practices

To encourage an Agile enterprise architecture, software teams must devise a method to get bottom-up input and enforce consistency. Apply tenets of continuous integration and continuous delivery all the way to planning and architecture. With a dynamic roadmap, an organization can change its planning from an annual endeavor to a practically nonstop effort. Lufthansa Systems, a software and IT service provider for the airline industry under parent company Lufthansa, devised a layered approach to push customer demand into product architecture planning. Now, the company can continuously update and improve products, said George Lewe, who manages the company's roster of Atlassian tools that underpin the multi-team collaboration. "We get much more input from the customers -- really cool ideas," Lewe said. "Some requests might not fit into our product strategy or, for technical reasons, it's not possible, but we can look at all of them." Lufthansa Systems moved its support agents, product managers and software developers onto Atlassian Jira, a project tracking tool, with a tiered concept.



Quote for the day:


"Be willing to make decisions. That's the most important quality in a good leader." -- General George S. Patton, Jr.


Daily Tech Digest - April 27, 2019

Google Sensorvault Database Draws Congressional Scrutiny

In a letter to Google CEO Sundar Pichai, the Democratic and Republican leaders of the House Energy and Commerce Committee have posed 10 questions about Sensorvault and what information Google has collected about users over the past decade. The committee's leaders want to know what Google does with all this data, who can access it and whether consumers are protected in cases of mistaken identification that may result from police investigations. Because Google uses geolocation data to sell targeted advertising to consumers, Sensorvault is also raising questions about privacy and the personal information that companies collect. "The potential ramifications for consumer privacy are far reaching and concerning when examining the purposes for the Sensorvault database and how precise location information could be shared with third parties," according to the letter. "We would like to know the purposes for which Google maintains the Sensorvault database and the extent to which Google shares precise location information from this database with third parties."



Designing Bulletproof Code

There is no doubt about the benefits good coding practices bring, such as clean code, easy maintenance, and a fluent API. However, do best practices help with data integrity? This discussion came up mainly with new storage technologies, such as NoSQL databases, which lack the native validation a developer gets when working against a SQL schema. A key tenet of clean code is that objects expose behavior and hide data, which differs from structured programming. The goal of this post is to explain the benefits of using a rich model over an anemic model to get data integrity and bulletproof code. As mentioned above, a good code design also has a performance advantage, since it saves requests to a database whose schema would otherwise check the validation. The code is thus agnostic of the database, so it will work with any database. However, following these principles requires both solid OOP concepts and business rules expressed in a ubiquitous language.
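
The difference between the two models in a minimal sketch: the rich version keeps the invariant with the data, so integrity holds even on schemaless NoSQL storage (the account domain is invented for illustration):

```python
# Anemic: a bag of data; nothing stops a negative balance from being stored.
class AnemicAccount:
    def __init__(self):
        self.balance = 0.0

# Rich: the object exposes behavior and hides data, so the invariant travels
# with the model instead of relying on a database schema to enforce it.
class Account:
    def __init__(self, balance: float = 0.0):
        if balance < 0:
            raise ValueError("balance cannot be negative")
        self._balance = balance

    def withdraw(self, amount: float) -> None:
        if amount <= 0 or amount > self._balance:
            raise ValueError("invalid withdrawal")
        self._balance -= amount

acct = Account(100.0)
acct.withdraw(30.0)        # fine
# acct.withdraw(200.0)     # raises ValueError before bad data is persisted
```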


AI Is Destroying Traditional Business Thinking

Automobiles – products from an asset-intensive industry – awaiting export
The trend is absolutely clear, and the economics behind it indicate that this isn’t just a short-term trend. Today’s AI and platform-driven economics have a clear advantage over the economies of scale of the prior age, just as the industrial age was faster, better and cheaper in creating value than the agricultural economy. In fact, the sheer market dominance of platform and AI powered organizations is fast becoming a threat to competitive capitalism, so much so that President Donald Trump, long known for advocating for more coal mining, finally agreed that AI needs to be a pillar of his economic policy to keep pace with China’s commitment to this reality. The shifts in technology capabilities and capital allocation are taking a bite out of the building blocks that made the industrial revolution so powerful and resolute. However, the time has come for every company to move beyond the old thinking, acting, measuring and investing that underpins yesterday’s economies. The health of the global market depends on updating our underlying measurement systems, business models, and technologies. It’s time to overhaul our business and management approaches to reflect today’s realities.


Test Automation: Prevention or Cure?

At first, the automation helped a lot, as we could now quickly and reliably run through simple scenarios and get the fast feedback we wanted. But as time went on, and after the first initial set of bugs were caught, it started to find fewer and fewer issues unless we actually encoded the automated test cases to look for them. We also noticed issues were still getting through, because some scenarios we just couldn’t automate; for example, anything related to usability had to be tested manually. So we ended up with a hybrid solution where the automation would run some of the key scenarios quickly, e.g. letting the team know they hadn’t broken anything obvious, plus exploratory testing for any new functionality, which in turn could be automated if suitable. Some functionality was, as such, difficult to test: we were prone to making mistakes while attempting to test it, or it simply took too long to do manually. An unexpected benefit indirectly linked to our automation journey was that as we started to release faster, it created a stronger focus on what we were trying to achieve.



The tool, known as Exercise in a Box, has been tested by government, small businesses and the emergency services and aims to help organisations in the public sector and beyond to prepare and to defend against hacking threats. "This new free, online tool will be critical in toughening the cyber defences of small businesses, local government, and other public and private sector organisations," said Cabinet Office Minister David Lidington, who revealed the tool in a speech in Glasgow, Scotland at CYBERUK 19, the NCSC's cybersecurity conference. ... Exercise in a Box provides a number of scenarios based on common threats to the UK that organisations can practice in a safe environment. It comes with two different areas of exercise – technical simulation and table-top discussion. It's hoped that this tool will provide a stepping stone towards the world of cyber exercises. "The NCSC considers exercising to be one of the most cost-effective ways an organisation can test how it responds to cyber incidents," said Ciaran Martin, CEO of the NCSC.


Who hires the CDO — or does the chief data officer hire themselves?

In some circumstances, those working in the data domain, or even other C-level executives, will put themselves forward as potential CDOs. As it is still a relatively new role, this means candidates are either making a case for themselves when a CDO position has opened up, or are making the case for the company to invest in a CDO position and putting themselves forward for it. For PATH, a global not-for-profit health equity organisation, the CDO position was something that was thought up in a conversation between the CEO and the CDO-to-be. “It was a discussion with the CEO which wasn’t a difficult one – he was also reading the same Gartner reports that I was reading and recognising that as an organisation we really needed to treat our data as an asset and recognised that we had no one at an institutional level championing that effort,” says PATH CDO Jeff Bernson. This meant that Bernson’s role had to be reframed into what is now known as a chief data officer role – meaning he would put more of a focus on data.


Fine line between AI becoming a buzzword and working in healthcare explored

Dr Mark Davies, IBM Watson’s chief medical officer for Europe, said the healthcare sector was “behind the curve” in using AI but we need to use it “appropriately” for it to work. He told Digital Health News the sector needed to overcome certain barriers before it can take full advantage of the benefits of AI. “Number one is access to data. AI is best when it is fed with large amounts of good quality, contemporaneous data,” he said. “We all know that getting access to good quality data globally can be challenging. “The second challenge is culture. In healthcare we can be quite slow adopters of innovation, for all sorts of reasons. “Some of that has to do with professional confidence, some of that has to do with our ability to change work practices. We tend to be quite conservative as an industry, for really good reasons… clinical safety and not doing harm is at the core of what we do. “The third is around demonstrating the impact that it has and building the evidence base in a way that it is scientifically robust so it can go through the regulatory steps.”


Intelligence Agencies Seek Fast Cyber Threat Dissemination

Intelligence officials said getting the right information into the right hands as quickly as possible is mandatory for battling online attacks. "One of the focus areas for NSA is not just the speed but the classification," Rob Joyce, senior adviser for cybersecurity strategy to the director of the U.S. National Security Agency, said during the panel, gesturing to the NCSC's Martin. "I can give Ciaran some very valuable information at the classified level, very fast and very easy. But if it turns out that's needed in the critical infrastructure of a commercial company in the U.K., I haven't helped him a lot by handing it to him at that highest classification level." Less classification - or declassifying information altogether - can make it more useful. "Getting it ... unclassified at actionable levels and down to actionable levels is really the area that's going to pay the most dividends," Joyce said. "Exquisite intelligence that's not used is completely worthless."


How data storage will shift to blockchain

“Blockchain will disrupt data storage,” says BlockApps separately. The blockchain backend platform provider says the advantages of this new generation of storage include that decentralizing data provides more security and privacy. That's due in part to it being harder to hack than traditional centralized storage. That the files are spread piecemeal among nodes, conceivably all over the world, makes it impossible for even a participating node to view the contents of the complete file, it says. Sharding, which is the term for the breaking apart and node-spreading of the actual data, is secured through keys. Markets can award token coins for mining, and coins can be spent to gain storage. Excess storage can even be sold. And cryptocurrencies have been started to “incentivize usage and to create a market for buying and selling decentralized storage,” BlockApps explains. The final parts of this new storage mix are that lost files are minimized because data can be duplicated simply — the data sets, for example, can be stored multiple times for error correction — and costs are reduced due to efficiencies.
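
A toy version of the sharding and duplication described above: split the data, address each shard by its hash, replicate it to several nodes. Real systems add encryption and erasure coding on top:

```python
import hashlib

def shard(data: bytes, shard_size: int = 4) -> list:
    return [data[i:i + shard_size] for i in range(0, len(data), shard_size)]

def place(shards: list, nodes: list, copies: int = 2) -> dict:
    """Spread each shard across `copies` nodes; no node holds the whole file."""
    placement = {}
    for i, piece in enumerate(shards):
        key = hashlib.sha256(piece).hexdigest()[:8]  # shard addressed by hash
        placement[key] = [nodes[(i + r) % len(nodes)] for r in range(copies)]
    return placement

nodes = ["node-a", "node-b", "node-c", "node-d"]
for key, where in place(shard(b"patient file contents"), nodes).items():
    print(key, "->", where)
```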


Load Balancing Search Traffic at Algolia With NGINX and OpenResty

An Algolia "app" has a 3-server cluster and Distributed Search Network (DSN) servers that serve search queries. DSNs are similar in function to Content Delivery Network Points-of-Presence (POPs) in that they serve data from a location closest to the user. Each app has a DNS record. Algolia's DNS configuration uses multiple top-level domains (TLDs) and two DNS providers for resiliency. Also, each app's DNS record is configured to return the IP addresses of the 3 cluster servers in a round-robin fashion. This was an attempt to distribute the load across all servers in a cluster. The common use case for the search cluster is through a frontend or mobile application. However, some customers have backend applications that hit the search APIs. The latter case creates an uneven load, as all requests will arrive at the same server until a particular server's DNS time-to-live (TTL) expires. One of Algolia's apps suffered slow search queries during a Black Friday period of heavy search load; this unequal distribution of queries had concentrated the load on a single server.
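
One way to avoid pinning a backend application to a single resolved IP until the TTL expires is to rotate across the cluster hosts in the client itself. A minimal sketch of the idea (an illustration, not Algolia's client code; hostnames and URL path are invented):

```python
import itertools

class RoundRobinClient:
    """Rotate requests across all cluster hosts instead of letting one
    DNS answer pin every request to the same server until the TTL expires."""

    def __init__(self, hosts):
        self._hosts = itertools.cycle(hosts)

    def send(self, query: str) -> str:
        host = next(self._hosts)
        return f"GET https://{host}/search?q={query}"  # stands in for a real request

client = RoundRobinClient(["srv1.example.net", "srv2.example.net", "srv3.example.net"])
for q in ["shoes", "shoes", "shoes"]:
    print(client.send(q))  # each query lands on the next server in turn
```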



Quote for the day:


"Managers maintain an efficient status quo while leaders attack the status quo to create something new." -- Orrin Woodward


Daily Tech Digest - April 24, 2019

Edge computing is in most industries’ future

The growth of edge computing is about to take a huge leap. Right now, companies are generating about 10% of their data outside a traditional data center or cloud. But within the next six years, that will increase to 75%, according to Gartner. That’s largely down to the need to process data emanating from devices, such as Internet of Things (IoT) sensors. Early adopters include: Manufacturers: Devices and sensors seem endemic to this industry, so it’s no surprise to see the need to find faster processing methods for the data produced. A recent Automation World survey found that 43% of manufacturers have deployed edge projects. Most popular use cases have included production/manufacturing data analysis and equipment data analytics; Retailers: Like most industries deeply affected by the need to digitize operations, retailers are being forced to innovate their customer experiences. To that end, these organizations are “investing aggressively in compute power located closer to the buyer,” writes Dave Johnson, executive vice president of the IT division at Schneider Electric. 


Why fintech is the sharing economy’s final frontier

Fintech, through sharing economy applications, threatens to break up the banking complex as we know it – all through social capital and economic sharing. Evidently, this is where investors feel the sharing economy has the most potential. Fintech is already attracting huge amounts of investment and is the biggest sector in terms of venture capital investments. Peer-to-peer lending, equity crowdfunding, and payment possibilities are proving to be the three areas with the biggest potential as fintech through the sharing economy starts to come into full focus. Firstly, equity crowdfunding helps to create a two-sided marketplace between investors and startups. This means a slice of a private company is sold for capital, typically through the sale of securities like shares, a convertible note, debt, or a revenue share. This process is similar to crowdfunding or Kickstarter campaigns, but with possible payouts for those willing to put their money where their mouth is.


For now, Intel’s 9th-gen mobile chips are targeting the sort of dazzling, high-end gaming notebooks most of us unfortunately can’t afford or can’t bear to lug around. If that’s the case, be patient: Separately, Intel announced a metric ton of new 9th-gen desktop processors, about six months after it announced its own Core i9-9900K. Many of the more mainstream mobile 9th-gen Core chips will likely debut in late summer or fall. Until then, read on to get an idea of what to expect. ... The new 9th-gen Core chips are designed with Intel’s “300-series” mobile chipsets in mind: the Intel CM246, Intel QM370, or Intel HM370 chipsets. A key performance advantage is the x16 lanes of PCI Express 3.0 directly off of the CPU, which provide enough bandwidth for an upcoming generation of third-party discrete GPUs. A full 128GB of DDR4 memory isn’t anything to sneeze at, either. The new 9th-gen mobile Core chips also ship with integrated graphics, part of what Intel calls “Generation 9.5.”


Boosting data strategies by combining cloud and infrastructure management

By receiving insights into IT operations including hardware, software, and network environments, these tools support daily operations through real-time monitoring, reduce downtime, and maintain business productivity. The current increase in integrating next generation technologies such as machine learning and artificial intelligence is positioning infrastructure management as an attractive choice for IT teams. One major benefit infrastructure management gives data center managers is the ability to monitor all aspects of IT operations. Through an intuitive platform that allows you to tap into the full value of existing infrastructure, these platforms fuel modernization through intelligent software by providing complete visibility and control over the environment. For example, companies using infrastructure management tools see up to a 40 percent increase in operating efficiencies. From these insights, operators can take control of power consumption, real-time data center health, and preventative analytics.


The FBI's RAT: Blocking Fraudulent Wire Transfers

As much as it might seem like fighting internet crime is like pushing the tide with a broom, there is a bright spot in the gloom. In February 2018, the IC3 created what it terms the RAT, or Recovery Asset Team. Its goal is to contact financial institutions quickly to freeze suspicious pending wire transfers before they're final. Much internet-enabled crime eventually intersects with banking systems. So while it may be difficult to prevent scams, there is a touch point where, with industrywide cooperation, stolen funds can be recovered. But time is tight, and swiftly contacting financial institutions is key to stopping stolen funds from being withdrawn. IC3 reports that the bureau's RAT group - working with what's termed the Domestic Financial Fraud Kill Chain - handled 1,061 incidents between its launch and the end of last year, covering an 11-month period. Those incidents caused losses of more than $257 million. Of that, the RAT achieved a laudable 75 percent recovery rate, or more than $192 million.


GraphQL: Core Features, Architecture, Pros, and Cons


A GraphQL server provides a client with a predefined schema — a model of the data that can be requested from the server. In other words, the schema serves as a middle ground between the client and the server while defining how to access the data. Written down in Schema Definition Language (SDL), the basic components of a GraphQL schema — types — describe the kinds of objects that can be queried on that server and the fields they have. The schema defines what queries are allowed to be made, what types of data can be fetched, and the relationships between these types. You can create a GraphQL schema and build an interface around it with any programming language. Having the schema before querying, a client can validate their query against it to make sure the server will be able to respond to the query. Because the shape of a GraphQL query closely matches the result, you can predict what will be returned. This eliminates such unwelcome surprises as unavailable data or a wrong structure.
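
A minimal runnable illustration using the graphql-core package for Python: define the schema in SDL, then let the library validate and execute a query against it. The Book type and the in-memory data are invented for the example:

```python
from graphql import build_schema, graphql_sync

# The schema is the contract: which types exist and which fields they expose.
schema = build_schema("""
    type Book {
        title: String
        pages: Int
    }
    type Query {
        book(title: String): Book
    }
""")

books = {"Dune": {"title": "Dune", "pages": 412}}

def book(info, title):  # resolver for the top-level `book` field
    return books.get(title)

result = graphql_sync(schema, '{ book(title: "Dune") { pages } }', {"book": book})
print(result.data)    # {'book': {'pages': 412}}, shaped exactly like the query
print(result.errors)  # None; a query for an undeclared field fails validation instead
```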


Why composable infrastructure goes hand in hand with private cloud


"With composable infrastructure, I can have a physical server dynamically provisioned in the size and shape I need and have it all stitched together. I'm actually getting those physical server assets to use," said Mike Matchett, storage analyst and founder of Small World Big Data, an IT consultancy near Boston. The composable model differs from converged and hyper-converged systems. Converged infrastructure is sold as racks of prequalified hardware from multiple vendors and includes systems such as NetApp FlexPod and Dell EMC VxBlock. Hyper-converged systems ship all the necessary components -- servers, hypervisor software, network connectivity and storage -- delivered as an integrated appliance. While swapping in composable modules sounds appealing, Matchett said enterprises should methodically estimate the financial cost before taking the plunge. "Without a significant churn rate for the resources, I'm not sure composable makes much fiscal sense," he said. The cost made sense for Clearsense, based in Jacksonville, Fla.


Dark Side of Offshore Software Development

Time is money, especially if you are working against the clock to get your product on the market before the competition, as most startups do. A week’s delay may not kill your business, but few projects can afford to lose a month or more. However, offshore development is not the reason behind missed deadlines. In-house teams can fail to finish projects on time, and the expenses will be even higher. ... One of the problems of outsourcing is that even if the project progresses on schedule, it is a hostage to an offshore vendor. Agile development may provide you with interim results and functionality, but you won’t receive the full package until the vendor delivers it. If you are in a hurry to get the product on the market, you are ready to give the shirt off your back. Unscrupulous companies are willing to risk reputation and future business to squeeze you dry.


What Is Explainable Artificial Intelligence and Is It Needed?

It aims to explain the reasoning of new machine/deep learning systems, to determine their strengths and weaknesses, and to understand how they will behave in the future. The strategy to achieve this goal is to develop new or modified machine learning techniques that will produce more explainable models. These models are intended to be combined with state-of-the-art human-computer interactive interface techniques, which can convert models into understandable and useful explanation dialogs for the end user. ... “XAI is one of a handful of current DARPA programs expected to enable ‘third-wave AI systems’ where machines understand the context and environment in which they operate, and over time build underlying explanatory models that allow them to characterize real-world phenomena.” Taking medical practice as an example: after examining the patient data, the physician should both understand the reasoning and be able to explain to the patient why the decision support system flagged a heart attack risk.
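
One simple flavor of what is being asked for: with a linear risk model, each input's contribution to the score can be shown to the physician directly. A toy sketch, with all coefficients invented for illustration:

```python
import math

# Invented logistic model for heart-attack risk; the per-feature
# contributions can be read off and turned into an explanation dialog.
COEF = {"age": 0.04, "systolic_bp": 0.02, "smoker": 0.9}
INTERCEPT = -7.0

def risk_with_explanation(patient: dict):
    contributions = {k: COEF[k] * patient[k] for k in COEF}
    score = INTERCEPT + sum(contributions.values())
    prob = 1 / (1 + math.exp(-score))
    return prob, sorted(contributions.items(), key=lambda kv: -kv[1])

prob, why = risk_with_explanation({"age": 61, "systolic_bp": 150, "smoker": 1})
print(f"risk: {prob:.0%}")
for feature, contrib in why:  # raw material for the explanation shown to the user
    print(f"  {feature}: +{contrib:.2f} to the score")
```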


How AI could save the environment

Google has also used its own AI expertise to improve its energy efficiency as a company—leveraging DeepMind's machine learning capabilities, it reduced the amount of energy needed to cool its data centers by 40%. "AI is most helpful when the possible solution to a problem resides in large, highly dimensional datasets," Pucell said. "If you think about climate data, there's a wealth of traditional structured data about temperature, sea levels, emissions levels, etc. But there's also a lot of unstructured data in the form of images, video, audio, and text. When it comes to analyzing massive amounts of unstructured data, deep learning is really the only game in town." At USC, Dilkina's research group has used AI to develop optimization methods for wildlife conservation planning—an area where highly limited budgets need to be allocated to protect the most ecologically effective land, she said. Her team has also used machine learning and game theory to help protected areas fight the poaching of endangered animals, including elephants and rhinos.



Quote for the day:


"Nobody in your organization will be able to sustain a level of motivation higher than you have as their leader." -- Danny Cox


Daily Tech Digest - April 23, 2019

How and where to use serverless computing

Serverless has a great attraction not just for application developers, but also for systems operations personnel, says Ken Corless, principal in the cloud practice at Deloitte Consulting. Whether offered by hyperscale cloud providers or implemented on-premises with various solutions on the market, the goal of serverless computing is the same: “ruthless automation and self-service to speed the software development lifecycle,” Corless says. For IT administrators, serverless reduces the “request-response” cycle of ticket-based workloads and allows administrators to focus on higher-level tasks such as infrastructure design or creating more automation, Corless says. There are two main use cases that Corless is seeing. One is in application development for creating modern, loosely coupled services-based applications. Both function-as-a-service (FaaS) and backend-as-a-service (BaaS)—two cloud-enabled services that achieve serverless computing—can dramatically improve the productivity of a software delivery team by keeping teams small, he says.
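As a minimal illustration of the FaaS half of that pairing, here is a sketch of a Python handler in the style of AWS Lambda behind an HTTP gateway; the handler signature follows the Lambda Python convention, while the event fields and response shape assume a typical HTTP-proxy integration, and the "name" parameter is a hypothetical example.

```python
import json

def handler(event, context):
    """Entry point invoked by the FaaS platform (AWS Lambda convention).

    The platform provisions, scales and tears down the execution
    environment; the developer ships only this function.
    """
    # "name" is a hypothetical query parameter in the incoming request.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Because the unit of deployment is a single function, a small team can own many of these independently, which is where the productivity gain Corless describes comes from.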


Container Orchestration Service
Having your application in well-packaged, stand-alone containers is the first step in managing multiple microservices. The next is managing all of the containers, allowing them to share resources and setting the configuration for scaling. This is where a container orchestration platform comes in. When you use a container orchestration tool, you typically describe the configuration in a YAML or JSON file. This file specifies where to download the container image, how networking between containers should be handled, how to handle storage, where to push logs, and any other necessary configuration. Since this configuration lives in a text file, you can add it to source control to track changes and easily include it in your CI/CD pipeline. Once containers are configured and deployed, the orchestration tool manages their lifecycle. This includes starting and stopping a service, scaling it up or down by launching replicas, and restarting containers after failure. This greatly reduces the amount of management needed.
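To make that concrete, here is a sketch of the kind of declarative configuration the passage describes, built as a Python dict and rendered to YAML with PyYAML. The structure follows the Kubernetes Deployment schema; the service name, image and replica count are hypothetical.

```python
import yaml  # pip install pyyaml

# Declarative description of a service: which image to pull, how many
# replicas to keep running, and which port each container exposes.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "web-api"},  # hypothetical service name
    "spec": {
        "replicas": 3,  # the orchestrator keeps three copies alive
        "selector": {"matchLabels": {"app": "web-api"}},
        "template": {
            "metadata": {"labels": {"app": "web-api"}},
            "spec": {
                "containers": [{
                    "name": "web-api",
                    "image": "registry.example.com/web-api:1.0",  # hypothetical
                    "ports": [{"containerPort": 8080}],
                }]
            },
        },
    },
}

# Rendered to YAML, this file lives in source control and feeds CI/CD.
print(yaml.safe_dump(deployment, sort_keys=False))
```

Because the file states the desired end state rather than the steps to reach it, the orchestrator can continuously reconcile reality against it, restarting failed containers or launching replicas without operator intervention.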


5 Ways to Get the Most Out of Your Design Team

Assembling a team of solid design practitioners is just the first step toward building a successful design practice. Leaders who can create and share a vision, inspire their team and the rest of the organization, and advocate for design at the executive level are essential. Nearly two-thirds of the Level 5 companies in our study have teams led by design leaders who are directors and above, who likely have greater influence with executives, more accountability, and who are better positioned to develop strong partnerships with leaders in other functions. Ultimately, this means that the more senior the design leadership, the greater the impact on the bottom line. When compared to the Level 1 companies in our study, design leaders at the Level 5 organizations were nearly three times more likely to be involved in critical business decisions and to be peers with their counterparts in engineering and product management. They were four times more likely to own and develop key products and features with key partners, and nearly twice as likely to report directly to the CEO — underscoring the importance of empowering one’s design team within the context of the larger product team.


Processing the application in a cloud environment would require transferring all the data readings across the network. Processing at the edge, however, would eliminate the need to transfer those readings. The link between the edge and the cloud would carry only periodic reports, so it would cost less than a link that carried a constant flow of high-volume data. The tradeoff is the continuing cost of a communication link versus the cost of locating and maintaining a processor at the edge. Processors have continued to fall in price, so edge computing will be less expensive than cloud computing in many cases. But each application is different, and organizations must study the tradeoff carefully before choosing an option. ... An edge computing facility can be located within a warehouse, refinery or retail store, but other options are also available. In the past few years, micro modular data centers (MMDCs) have grown in popularity. An MMDC is a complete computing facility in a box, and it contains everything found in a data center: processors, network, power, cooling and fire suppression, as well as protection from electromagnetic interference, shock and vibration.
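To make the tradeoff concrete, here is a small break-even sketch in Python. Every figure below is a hypothetical placeholder, not pricing from the article or any vendor; the point is only the shape of the comparison.

```python
# Hypothetical monthly figures for the edge-vs-cloud tradeoff.
# Replace every constant with real quotes before drawing conclusions.
SENSOR_DATA_GB_PER_MONTH = 5_000   # raw readings if shipped to the cloud
REPORT_DATA_GB_PER_MONTH = 5       # periodic summaries from an edge node
COST_PER_GB_TRANSFER = 0.08        # network transfer cost, $/GB
EDGE_NODE_COST_PER_MONTH = 250.0   # amortized edge hardware + maintenance

cloud_only = SENSOR_DATA_GB_PER_MONTH * COST_PER_GB_TRANSFER
edge = REPORT_DATA_GB_PER_MONTH * COST_PER_GB_TRANSFER + EDGE_NODE_COST_PER_MONTH

print(f"Cloud-only transfer cost: ${cloud_only:,.2f}/month")
print(f"Edge processing cost:     ${edge:,.2f}/month")
print("Edge wins" if edge < cloud_only else "Cloud wins")
```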



“The way we work is changing,” said Jonathan Christensen, chief experience officer at Symphony. “Collaboration platforms and other innovations bring positive improvements that enable more flexibility and better work-life balance. But a more casual approach to workplace communications, and digital habits in general, presents major security risks. “Employees won’t keep secure practices on their own, and employers must consider how they will secure workforce communication over messaging and collaboration tools, just like they did with email.” The research also uncovered other trends that put employers at risk, including the fact that 51% of respondents are using collaboration platforms to discuss social plans, 44% are sharing memes and photos, and 18% admitted using them to ask a co-worker out on a date. This ease of communication poses a danger of creating a casual attitude to workplace communications, said the study report, because 29% admitted talking badly about a client or customer, 19% had shared a password


WannaCry Stopper Pleads Guilty to Writing Banking Malware

Hutchins, a British national, was arrested by the FBI in the U.S. and charged on Aug. 2, 2017, just before he was set to fly back to the U.K. after attending the Black Hat and Def Con security conferences. He has remained in the U.S. since then, continuing to work for Los Angeles-based Kryptos Logic, a security consultancy, where he specializes in reversing malware. Hutchins' guilty plea was filed Friday in federal court in Wisconsin. Hutchins pleaded guilty to two counts of developing and distributing malicious software aimed at collecting data that would aid in fraudulently compromising bank accounts. Each count carries a maximum penalty of five years in prison, a $250,000 fine and one year of supervised release. Hutchins could also be subject to a restitution order. As a result of his guilty plea, prosecutors have agreed to drop eight other counts against him that were lodged in a superseding indictment.


Open architecture and open source – The new wave for SD-WAN?

Cloud providers and enterprises have discovered that 90% of user experience and security problems arise in the network: between where the cloud provider resides and where the end user consumes the application. Therefore, both cloud providers and large enterprises with digital strategies are focusing on building their solutions on open source stacks. Having a viable open source SD-WAN solution is the next step in the SD-WAN evolution, where the community gets involved in the solution, much as it did with containers and their surrounding tooling. Now that we're in 2019, are we going to witness a new era of SD-WAN? Are we moving to an open architecture with an open source SD-WAN solution? An open architecture should be the core of the SD-WAN infrastructure, where additional technologies are integrated inside the SD-WAN solution rather than bolted on as complementary VNFs. An interface and native APIs allow you to integrate logic into the router, so the router can intercept traffic and act on it.
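The article names no concrete API, so the sketch below is entirely hypothetical: it only illustrates what "integrating logic into the router" might look like if an open SD-WAN platform let operators register traffic-steering hooks.

```python
from dataclasses import dataclass

# Hypothetical illustration only -- no real SD-WAN product exposes exactly
# this interface. It sketches operator-supplied logic on an open router.

@dataclass
class Flow:
    app: str        # application identified by the router (hypothetical)
    dest_port: int  # destination port of the flow

def steer_traffic(flow: Flow) -> str:
    """Custom policy hook: pick an uplink based on inspected traffic."""
    if flow.app in ("voip", "video-conf"):  # latency-sensitive apps
        return "low-latency-link"
    if flow.dest_port == 443:               # ordinary HTTPS
        return "broadband-link"
    return "default-link"

# An open router with native APIs could invoke such a hook per new flow.
for f in (Flow("voip", 5060), Flow("web", 443)):
    print(f.app, "->", steer_traffic(f))
```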


How 5G could shape the future of banking

“5G will begin to unlock some interesting things that we've dreamt about or imagined for some time,” said Venturo, the chief innovation officer at U.S. Bank. “5G is exponentially more powerful than 4G; it has such low latency and such high bandwidth that for a lot of applications, it will make a lot of sense to use 5G instead of Wi-Fi.” Jeremy K. Balkin, head of innovation, retail banking and wealth management at HSBC USA, has a similarly enthusiastic take. “Banking today is an experience that our customers want when, where and how they choose," he said. "The benefits of 5G networks offer next-generation mobile internet connectivity, faster speeds and more reliable connections on smartphones and other devices, which we believe will benefit all consumers.” 5G may help Venturo revive projects in his innovation lab he has had to set aside because network technology could not support them.


Developer Skills for AI

AI allows developers to do things in code that cannot be done algorithmically. And that is the magic. Certain problems in the AI space, like looking at a photo and determining whether it contains a cat or a dog, have been solved. Five years ago, this was next to impossible; today it's trivial. The beauty of AI/ML is the ability to look at a massive data set and find patterns that a human would never find by looking at it. As developers, we are accustomed to writing code that takes an input, applies an algorithm, and produces a known output. That's what programming is all about, to a large degree. With ML, we turn that on its head a bit: we don't know the algorithm; we don't even have an algorithm. Instead, we build an ML model and give it the inputs and the outputs. The inputs might be a hundred million credit card transactions that have actually taken place; the outputs are the ones that were fraudulent.
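Here is a minimal scikit-learn sketch of that inputs-plus-outputs workflow. The data is synthetic stand-in data generated on the fly, not real transactions, and the model choice is illustrative rather than anything the author specifies.

```python
# "Give the model inputs and outputs" -- no hand-written fraud algorithm.
# Synthetic stand-in data; real pipelines would use actual transactions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Inputs: feature vectors standing in for past transactions.
# Outputs: labels marking which of them were fraudulent (~2% here).
X, y = make_classification(n_samples=10_000, n_features=12,
                           weights=[0.98], random_state=0)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The model infers the fraud pattern from the labeled examples.
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Note: accuracy is a naive metric for such imbalanced data; precision
# and recall on the fraud class matter more in practice.
print(f"Held-out accuracy: {model.score(X_test, y_test):.3f}")
```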


Debunking the Discourse Around Cloud Security

The apprehension with which cloud security is met is nothing short of ironic given that in many ways, the cloud is actually more secure than on-premises environments, largely because cloud providers collectively invest more in security controls than businesses can on their own. Take AWS, for example. Its infrastructure hasn't been hacked in years, which is remarkable considering attackers constantly target it. So what’s the key to cloud security? First things first, choose established public cloud players that you can rely on. They often have the resources and expertise to protect their own infrastructure. In terms of education, however, what we need is a complete paradigm shift, the key to which lies in understanding the shared responsibility model. The shared responsibility model is universal among cloud providers: they ensure their infrastructure will be secure, but customers need to ensure whatever they do in the cloud is secure.



Quote for the day:


"You may be good. You may even be better than everyone esle. But without a coach you will never be as good as you could be." -- Andy Stanley