Daily Tech Digest - September 11, 2018

Protecting PHI on Devices: Essential Steps

"Anyone with physical access to electronic computing devices and media, including malicious actors, potentially has the ability to change configurations, install malicious programs, change information or access sensitive information," the Department of Health and Human Services notes in the September edition of its monthly e-newsletter advisory. While HIPAA requires covered entities and business associates to limit physical access to their electronic information systems and the facilities in which they are housed, organizations are also required to implement policies and procedures that govern the receipt and removal of hardware and electronic media containing electronic PHI into and out of an organization's facilities, as well as their movement within a facility, the HHS Office for Civil Rights notes. Implementing processes to govern the movement of electronic devices and media may vary depending on the type of device and media, the agency states. ... Organizations can use various methods to govern and track the movement of electronic devices and media, the agency explains.


5 top strategies to make development cycles more efficient

Transparency and project visibility are important. Both are required for teams to learn, research and implement changes. Changes can only be responded to if everyone on the team, from development to administration, is on the same page. Daily and weekly meetings are structured within larger time frames, sometimes referred to as sprints. These time frames define overarching goals. If implemented at regular intervals, these time periods, or sprints, can be a great time to reframe, reassess, and retune the development process. It is important in modern software development environments to keep a steady, methodical pace. Only through continual communication and review is this possible. The needs of the client, customer or user need to come before implementing tools. Often we fall in love with certain tools and processes within software development, but it is important to understand that these tools and processes are only useful insofar as they help empower our clients, customers and users. Anything that hinders customer satisfaction must be cut out.


The use cases, challenges and benefits behind retail AI


Decades on, retailers now collect so much customer data from so many different people that it is impossible to offer this personalised service without the help of technology. As pointed out by Brian Kalms, partner and retail lead at consultancy Elixirr, some retailers have so much data there is no longer the option of analysing it by hand, especially when adding new online ventures into the mix. “Historically, when you went into a store, you didn’t identify yourself,” he says. “Online brands know who you are, so retailers are going to have to learn to be data-savvy, and that’s one of the first applications of AI – it’s been in the form of bots and communications, and it’s moving into data analysis.” Where retailers used to categorise their customers in a “simplistic way”, now data can be used to better understand customers individually. For example, under old customer demographics based on socio-economic background, earnings and gender, a consumer who buys high-end food but “value” tissues should not exist, yet we know this isn’t the case.


5 Tips for Integrating Security Best Practices into Your Cloud Strategy

Agility, resilience, and speed are baked into the development of every cloud implementation; they are why organizations adopt cloud-first strategies. But without the proper tools, sys admins can't effectively manage and protect their evolving cloud landscape, negating these benefits. As you plan your cloud strategy, the right tools and a detailed road map are essential for supporting a successful transition. Start by assuming that at some point, if not already, some of your workload will move to the public cloud, so you'll really be managing a hybrid environment. Next, it's highly likely that the people supporting your data center will also support your cloud, so to avoid misconfigurations and minimize complexity, adopt management and security solutions that support hybrid cloud scenarios. It's also likely your environment will evolve to include more than one cloud service. Whether through a merger or acquisition, adopted in a development lab or acquired elsewhere, you may be faced with a combination of Microsoft Azure, Amazon Web Services, and/or Google cloud environments.


Why your company needs an open source program office

We seem to be very confused about what constitutes an "open source company." Tobie Langel has asked if Mozilla and Microsoft are open source companies. The majority (78%) think Mozilla is, and an almost equivalent percentage (67%) think Microsoft is not. Yet, Microsoft contributes orders of magnitude more open source code than Mozilla. The reality is that both organizations qualify as "open source companies." Hopefully yours does, too. ... It would be tempting to think that an open source program office is a lagging indicator of open source activity, but in my experience it's actually a leading indicator, and a causal factor. As my colleagues Fil Maj and Steve Gill presented at the recent Open Source Summit in Vancouver, a good open source program can remove roadblocks to participation. We've seen Adobe go from a top-32 contributor (based on active GitHub contributors) to top-16 in a year, with a refactored contribution process a major driver of that change.


Standard to protect against BGP hijack attacks gets first official draft

Back in October 2017, two US government agencies, the aforementioned NIST and the Department of Homeland Security (DHS) Science and Technology Directorate, started a joint project named Secure Inter-Domain Routing (SIDR) with the explicit purpose of securing the BGP protocol from such attacks. "The overall defensive effort will use cryptographic methods to ensure routing data travels along an authorized path between networks," the NCCoE at NIST said in a press release at the time. "There are three essential components of the IETF SIDR effort: The first, Resource Public Key Infrastructure (RPKI), provides a way for a holder of a block of internet addresses--typically a company or cloud service provider--to stipulate which networks can announce a direct connection to their address block; the second, BGP Origin Validation, allows routers to use RPKI information to filter out unauthorized BGP route announcements, eliminating the ability of malicious parties to easily hijack routes to specific destinations.
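The RPKI and Origin Validation components described above can be sketched in a few lines. The ROA representation and helper function below are simplified illustrations of the validation logic (cover the prefix, match the authorised AS, respect the maximum length), not code from any real RPKI library:

```python
import ipaddress

def validate_origin(roas, prefix, origin_asn):
    """Classify an announcement as 'valid', 'invalid' or 'not-found'
    against a list of (roa_prefix, max_length, authorised_asn) entries."""
    net = ipaddress.ip_network(prefix)
    covered = False
    for roa_prefix, max_length, asn in roas:
        if net.subnet_of(ipaddress.ip_network(roa_prefix)):
            covered = True
            if asn == origin_asn and net.prefixlen <= max_length:
                return "valid"
    return "invalid" if covered else "not-found"

# The address holder authorises AS 64500 to announce 192.0.2.0/24 (up to /25)
roas = [("192.0.2.0/24", 25, 64500)]
print(validate_origin(roas, "192.0.2.0/24", 64500))    # valid
print(validate_origin(roas, "192.0.2.0/24", 64501))    # invalid: a hijack
print(validate_origin(roas, "198.51.100.0/24", 64500)) # not-found: no ROA
```

A router applying this filter would drop or deprioritise the "invalid" announcement, which is exactly how unauthorised route hijacks are squeezed out.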


How the TOGAF Standard Serves Enterprise Architecture


Businesses thrive off change to deliver new products and services to earn revenue and stay relevant. But throughout a business’s lifetime, new systems are created, mergers require system integration or consolidation, new technologies are adopted for a competitive edge, and more systems need integration to share information. A well-defined and governed EA practice is critically important at an organizational level to confront, handle and manage these technological and computing complexities. Without an EA practice, there could be disconnects between systems, inconsistencies in solutions, miscommunications among product and engineering teams, duplication of engineering efforts and erosion of an organization’s architecture and solution quality. Let us use a product startup company as an example. This startup could experience surging growth, rapidly advancing from nascent to emergent. The business and technology are experiencing brisk change. Without an EA practice, or at least some level of architecture guidance, the startup may quickly find its systems fragmented and unable to share information.


How the industry expects to secure information in a quantum world

QLabs is focusing in particular on applications in cybersecurity and communications and scooping up funding from the Australian government to help it do that at a Defence-grade level. Today, commercial exchange of information is protected primarily via public key infrastructure (PKI), with the security of PKI reliant on the computational complexity of certain mathematical operations. Sharma said that essentially, the system is reliant on mathematical problems that are easy to do one way, but difficult to reverse in order to decrypt -- and that's what cybersecurity currently relies on. One such system used for PKI exchange is the RSA algorithm. "The mathematics of the RSA key exchange will be broken once we have a quantum computer because it will be able to do the reverse calculation much faster than we can with conventional computers, even supercomputers," Sharma explained. "That's where the threat arises ... when we look forward we need to recognise that certainly within the next decade, most people would contend, that we'd have a quantum computer available at a useful scale."
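The easy-one-way, hard-to-reverse asymmetry Sharma describes can be seen in a toy RSA example. These are tiny textbook numbers; real keys use primes of a thousand bits or more, which is what makes the reverse calculation (factoring) infeasible on classical machines:

```python
# Toy RSA: modular exponentiation is fast in the forward direction, but
# recovering the private exponent requires the secret factors of n.
p, q = 61, 53              # secret primes (real keys use ~1024-bit primes)
n = p * q                  # 3233, the public modulus
e = 17                     # public exponent
phi = (p - 1) * (q - 1)    # 3120, computable only if you know p and q
d = pow(e, -1, phi)        # private exponent, derived from the factors

message = 42
ciphertext = pow(message, e, n)    # easy direction: anyone can encrypt
recovered = pow(ciphertext, d, n)  # decryption needs d, hence the factors
assert recovered == message
# Shor's algorithm on a sufficiently large quantum computer factors n
# quickly, which is why RSA breaks once such machines reach useful scale.
```

The whole scheme rests on the gap between the cost of `pow(message, e, n)` and the cost of factoring `n`; a quantum computer collapses that gap.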


The deep-learning revolution: How understanding the brain will let us supercharge AI

Sejnowski compares the neural networks of today to the early steam engines developed by the engineer James Watt at the dawn of the Industrial Age - remarkable tools that we know work, even if we are uncertain how. "This is exactly what happened in the steam engines. 'My god, we've got this artifact that works. There must be some explanation for why it works and some way to understand it'. "There's a tremendous amount of theoretical mathematical exploration occurring to really try to understand and build a theory for deep learning." If research into deep learning follows the same trajectory as that spurred by the steam engine, Sejnowski predicts society is at the start of a journey of discovery that will prove transformative, citing how the first steam engines "attracted the attention of the physicists and mathematicians who developed a theory called thermodynamics, which then allowed them to improve the performance of the steam engine, and led to many innovative improvements that continued over the next hundred years, that led to these massive steam engines that pulled trains across the continent."


Digital Payments Security: Lessons From Canada

Canada, which has a head start on the adoption of digital payments, has learned some valuable security lessons that could be beneficial to the U.S., says Gord Jamieson of Visa. "If we look at Canada itself as a market, we're probably one of the leading countries when it comes to the adoption and usage of digital payments," says Jamieson, head of payment system risk for Visa Canada. "Seventy percent of our Canadian personal consumption expenditure is conducted using digital payments." ... "We see tokenization as basically the key to addressing fraud within this space," he says. "Tokenization is going to take that account data out of the mix. It's going to be replaced by a token - a proxy value - that today would go through the [payment] rails the same way as a normal transaction for authorization. ... And the beauty of a token is that token is unique to that environment and if it gets compromised, then you simply replace the token."
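The token-as-proxy idea Jamieson describes can be sketched in a few lines. The vault class and names below are purely illustrative, not Visa's actual tokenization service; the point is that the real account number never travels the payment rails and a leaked token is simply swapped out:

```python
import secrets

class TokenVault:
    """Illustrative vault mapping random tokens back to account numbers."""
    def __init__(self):
        self._token_to_pan = {}

    def tokenize(self, pan):
        token = "tok_" + secrets.token_hex(8)   # random proxy, no PAN inside
        self._token_to_pan[token] = pan
        return token

    def detokenize(self, token):
        return self._token_to_pan[token]        # only the vault can map back

    def revoke(self, token):
        """A compromised token is simply replaced; the PAN never changes."""
        pan = self._token_to_pan.pop(token)
        return self.tokenize(pan)

vault = TokenVault()
token = vault.tokenize("4111111111111111")      # test PAN, not a real card
assert vault.detokenize(token) == "4111111111111111"
new_token = vault.revoke(token)                 # leaked token swapped out
assert new_token != token
```

Because the token is unique to its environment and carries no account data, an attacker who captures it from the rails gains nothing durable.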



Quote for the day:


"All leadership takes place through the communication of ideas to the minds of others." -- Charles Cooley


Daily Tech Digest - September 10, 2018

blockchain telecom
Mobile communications are increasingly vital to individuals, to communities, to businesses, and to the wider economy. Research analysis shows that there are more than 2,500 telecom operators across the globe eliminating distances between people through data or voice connectivity. More and more, and for good reasons, the industry is adopting blockchain technology to take advantage of fascinating applications, new features and capabilities. It represents a concrete leap in the way transactions are managed compared to old-fashioned traditional systems. In addition to other carrier members of the ITW Global Leaders’ Forum (GLF) that are involved in blockchain initiatives, blockchain trials (PoCs) are successfully carried out by companies such as BT, Telefónica, PCCW, Colt, Telstra and HGC Global Communications, just to name a few. Other operators, such as South Korean telecoms giant KT, have built their own blockchain. The PoCs (with extensive features) demonstrated that blockchain technology drastically reduces operational costs and eliminates many unnecessary outdated systems while uncovering new business streams with an unprecedented quality in monitoring transactions in real time.



Education in the Age of Automation

Human-to-human interaction as a basis for a meaningful educational experience becomes optional. Today, it’s entirely possible to learn all you need to know to make a good living in the modern world without ever needing to sit in a classroom or interact with a human directly. Let’s face it: You can do a lot of learning from YouTube. As a result, more people are beginning to question the conventional wisdom of spending four years and thousands of dollars to get a bachelor’s degree—particularly now that companies including IBM, Google and Apple no longer require one to get hired. Coincidentally, as companies eliminate the college degree requirement for new hires, we’re seeing significant growth in technical bootcamps. One study reports that this year, 20,000 students will complete a course of study at a coding boot camp and be “job ready.” Considering that the number of students who graduated with a computer science degree from a typical college in 2017 numbered about 93,000, there’s a good case to be made that boot camps are siphoning off a number of students that otherwise would be headed to academia.


5 Artificial Intelligence and Machine Learning Use-Cases for Cybersecurity

Building resiliency in a growing digital environment complicates the development and advancement of strategic initiatives, given the propensity of many enterprises to underestimate the external abilities of hackers, viruses and weak structural configuration. Reducing the occurrence of incidents and breaches in a technological environment requires strong organizational design, which includes the necessary capabilities and precautions within the infrastructure build. Where risk is prevalent, extrapolation into the potential positive and negative opportunities that affect cross-functional departments can only be properly assessed by running many different scenarios with the data an organization already has. This is where AI can play a critical role in augmenting and empowering the existing security resources. Implementing ML to actively monitor and review API calls, external access points and other logs within enterprise systems enables advanced data monitoring and filtering that can be performed around the clock, thereby eliminating some of the human resourcing and security issues.
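As a hedged illustration of round-the-clock log monitoring, a simple statistical baseline can stand in for a trained model: score each new interval's API call count against the historical distribution and flag outliers. The function and traffic figures below are invented for the sketch:

```python
from statistics import mean, stdev

def is_anomalous(history, new_count, threshold=3.0):
    """Flag a new per-interval API call count sitting more than
    `threshold` standard deviations above the historical mean."""
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and (new_count - mu) / sigma > threshold

# Baseline traffic, then one burst such as a scripted credential-stuffing run
history = [102, 98, 105, 99, 101, 97, 103, 100]
print(is_anomalous(history, 101))  # False: within normal variation
print(is_anomalous(history, 640))  # True: flag for investigation
```

A production deployment would replace the z-score with a trained anomaly model and feed it continuously from the log pipeline, but the monitoring loop has the same shape.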


How to match microservices middleware to your mission


It's also possible to build state-controlled microservices applications by using specific state control tools. Cloud tools, like AWS Step Functions and Azure Logic Apps, provide sequencing and state control within their respective clouds. Middleware tools, like the Redis data/state caching tool, provide similar capabilities for on-premises applications and general cloud applications. Any of these options provide an assembly framework for step-wise microservices processes and maintain the data and state context of transactions through locally stateless processes. Microservices scaling is related to state control. When multiple instances of a microservice might be used, it's necessary to distribute the work first. It's also important to maintain the transaction context over multiple messages in some way. If your business applies state control logic to its overall application, it's not difficult to extend that to multiple instances of the same microservice, but you'll still have to provide load-balancing capabilities.
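The pattern of keeping transaction context in an external store, so that individual service instances stay stateless and interchangeable, can be sketched as follows. A plain dict stands in for a Redis-style cache here; names are illustrative, and a real deployment would use a client library's get/set calls against a shared server:

```python
class SharedStateStore:
    """Stand-in for a Redis-style cache shared by all service instances."""
    def __init__(self):
        self._data = {}
    def get(self, key):
        return self._data.get(key)
    def set(self, key, value):
        self._data[key] = value

def handle_step(store, txn_id, step_name):
    """Any instance can process any message: the transaction context
    lives in the store, so the service process itself stays stateless."""
    context = store.get(txn_id) or {"steps": []}
    context["steps"].append(step_name)
    store.set(txn_id, context)
    return context

store = SharedStateStore()
# Two load-balanced "instances" handle different steps of one transaction
handle_step(store, "txn-42", "validate-order")     # instance A
ctx = handle_step(store, "txn-42", "charge-card")  # instance B sees A's work
print(ctx["steps"])  # ['validate-order', 'charge-card']
```

Because no instance holds the context privately, the load balancer mentioned above is free to route each message to whichever replica is available.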


Human brain stimulated by artificial synaptic device

This electrical synaptic device simulates the function of synapses in the brain, as the resistance of the tantalum oxide gradually increases or decreases depending on the strength of the electric signals. It has succeeded in overcoming durability limitations of current devices by allowing current control only on one layer of Ta2O5-x. Synapses are areas where axons and dendrites meet. There are hundreds of trillions of synapses in one individual human brain, and these areas are where signals are sent and received. The chemical synapse information transfer system transfers data from the brain, and can handle high-level parallel arithmetic with very little energy. Worldwide research on artificial synaptic devices, which mimic the biological functioning of a synapse, is already underway. The research team also successfully implemented an experiment that identified synapse plasticity, the process of creating, storing and deleting memories, by adjusting the strength of the synapse connection between two neurons.


How to make an old Android phone feel new again

Old Android phones are everywhere. They're in closets, kitchens, desk drawers — and, yes, even in the pockets of productive business people who (gasp!) haven't bothered to upgrade in a while. Despite the constant marketing to the contrary, mobile devices can remain perfectly capable long after their launch dates. They can, however, start to seem a little slow or behind the curve after a few years of use — and the older and more resource-limited a device is, the more pronounced that effect is likely to be. But wait! Don't abandon hope just yet: A handful of simple steps can make your old Android phone feel new (or at least newer) again. And whether you're still carrying the device around as a daily driver or using it for more creative purposes, every little improvement counts. So summon your inner mechanic and get ready: It's time to give that old Android phone a much-needed tune-up — and a fresh lease on life.


Small business servers: Why and how you can say 'no' to the cloud

If your plan is to configure a server on-premises in your office or home, how you connect to the internet may be an issue. If you're just accessing some shared files on an internal network, you won't have any special complications. But if you want to use your on-premises server to serve web pages or email (or any other application) to users on the internet, you're going to have to consider the transition of data requests from the internet, through your firewall, and to your on-premises server. You should plan on having a discussion with your ISP. If you're using a consumer ISP, you may not be allowed to send data out via certain ports. At one time, when I was first setting up my web server in my apartment, I found that the local cable company wouldn't let me serve any traffic out of port 80 (the standard port for web pages). That effectively squelched my ability to run a web site, and I wound up having to buy a dedicated line. You'll also need to consider whether you can get a fixed IP address from your ISP, or whether you need to set up some sort of dynamic DNS routing. Additionally, you may have to set up port forwarding and routing on your router to send data to the right machine on your network, particularly if you're serving web pages or email.
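The dynamic DNS routing mentioned above boils down to a small reconciliation loop: poll the current public address and push an update to the DNS provider only when it changes. In this sketch, `fetch_public_ip` and `push_dns_update` are hypothetical stand-ins for whatever your DDNS provider's API actually offers:

```python
def sync_dns(fetch_public_ip, push_dns_update, last_known_ip):
    """One pass of a dynamic-DNS updater: push only on address change."""
    current = fetch_public_ip()            # e.g. ask a what-is-my-ip service
    if current != last_known_ip:
        push_dns_update(current)           # e.g. an HTTPS call to the provider
    return current

# Simulate two passes with injected callables instead of real network calls
updates = []
ip = sync_dns(lambda: "203.0.113.7", updates.append, last_known_ip=None)
assert updates == ["203.0.113.7"]          # first run pushes an update
ip = sync_dns(lambda: "203.0.113.7", updates.append, last_known_ip=ip)
assert updates == ["203.0.113.7"]          # unchanged IP: no extra update
```

Run on a schedule (say, every few minutes), this keeps a hostname pointed at a home server even when the consumer ISP reassigns the address.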


Nagios Core monitoring software: lots of plugins, steep learning curve

The Nagios Exchange website offers a large selection of plugins for various monitoring/management scenarios. In fact, one of the strengths of Nagios is the availability of an impressive number of plugins which are compiled executables or scripts that check the status of a host or service. Nagios uses the information from plugins to determine the current status of hosts and services on your network. The plugins act as an abstraction layer between the monitoring logic present in the Nagios daemon and the actual services and hosts that are being monitored. The upside of this architecture is that you can monitor just about anything you can think of. The downside is that Nagios has no idea what is being monitored. Its job is to track changes in the state of what is being monitored. Only the plugins themselves know exactly what they're monitoring and how to perform the actual checks. The Nagios Exchange currently has almost 6,000 projects in over 400 different categories.
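A plugin, in this architecture, is just an executable that prints a status line and exits with a conventional code: 0 for OK, 1 for WARNING, 2 for CRITICAL, 3 for UNKNOWN. The daemon reads only that code and line, which is why it "has no idea" what is actually being checked. A minimal disk-usage check might look like this (thresholds and wording are illustrative):

```python
import shutil

def check_disk(path="/", warn=80.0, crit=90.0):
    """Print a Nagios-style status line and return the conventional
    exit code: 0=OK, 1=WARNING, 2=CRITICAL."""
    usage = shutil.disk_usage(path)
    pct = 100.0 * usage.used / usage.total
    if pct >= crit:
        print(f"DISK CRITICAL - {pct:.1f}% used on {path}")
        return 2
    if pct >= warn:
        print(f"DISK WARNING - {pct:.1f}% used on {path}")
        return 1
    print(f"DISK OK - {pct:.1f}% used on {path}")
    return 0

status = check_disk("/")   # a real plugin would end with sys.exit(status)
```

The abstraction layer is exactly this contract: swap in any script that honours the exit codes, and Nagios can track its state changes without knowing what it measures.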


These 5 “clean code” tips will dramatically improve your productivity


Test, test, test. We know we should always do it, but sometimes we cut corners so we can push the project out faster. But without thorough testing, how will you know for certain that the code works? Yes, there are very simple pieces of code, but one is always surprised when that crazy edge case comes up that you thought you didn’t need to test for! Do yourself and everyone on your team a favour and regularly test the code you write. You’ll want to test in a coarse-to-fine style. Start small with unit tests to make sure every small part works on its own. Then slowly start testing the different subsystems together, working your way up towards testing the whole new system end to end. Testing in this way allows you to easily track where the system breaks, since you can easily verify each individual component or the small subsystems as the source of any issues. Choose meaningful names. This is what makes code self-documenting. When you read over your old code, you shouldn’t have to look over every little comment and run every small piece of code to figure out what it all does!
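The coarse-to-fine idea starts with unit tests that pin down one component in isolation, edge cases included, before subsystems are wired together. The function and tests below are an invented illustration:

```python
# The component under test: tiny, with one easy-to-forget edge case.
def normalise_username(raw):
    """Lowercase and strip whitespace; blank input is rejected explicitly."""
    cleaned = raw.strip().lower()
    if not cleaned:
        raise ValueError("username must not be blank")
    return cleaned

# Fine-grained unit tests come first, one behaviour per check.
assert normalise_username("  Alice ") == "alice"

# The "crazy edge case" you thought you didn't need to test for:
try:
    normalise_username("   ")
    assert False, "blank input should have been rejected"
except ValueError:
    pass
```

With each component verified like this on its own, an end-to-end failure later can be localised quickly, because the individual parts are already known to be sound.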


These are the warning signs of a fraudulent ICO

If a company's website, whitepaper, or project description is full of nothing more than fluff and buzzwords and portrays a lack of substance, this is a clear warning sign that all may not be as it seems. Another classic warning sign that an ICO may not be everything it promises is a promised return on investment (ROI) that seems too good to be true. In the case of Bitconnect, investors were offered a 120 percent return on their investment annually, but the company performed an exit scam and rendered its tokens useless, leaving investors heavily out of pocket. LoopX went dark and vanished with $4.5 million after pulling an exit scam in February, three ventures -- ACChain, Puyin, and BioLifeChain -- launched by Shenzhen Puyin Blockchain have stolen close to $60 million, and CryptoKami, run by anonymous parties, also ran a fraudulent ICO and closed after raising $12 million. According to Diar, from 2016 to August this year, fraudulent ICOs have been responsible for the theft of close to $100 million.



Quote for the day:


"Blessed are the people whose leaders can look destiny in the eye without flinching but also without attempting to play God" -- Henry Kissinger


Daily Tech Digest - September 09, 2018

Lenovo Partners With Blockchain Platform to Develop Its IoT and AR/VR Hybrid Software

“The Internet of Things and augmented reality are already changing the way we interact with the world. We are excited to partner with AR titan Lenovo New Vision. I see the combination of AI/AR and IoT revolutionizing the business environment," Credits CEO & founder Igor Chugunov said in the press release. Lenovo New Vision Technology intends to incorporate Credits’ blockchain solution to streamline internal operations and management procedures. “Credits [has] been chosen by Lenovo New Vision Technology thanks to its distinctive technical solutions, such as [a] unique consensus algorithm which consists of dPoS (delegated-proof-of-stake) and BFT (Byzantine Fault Tolerance) features,” says the press release. The blockchain platform that Credits has built is capable of performing up to one million transactions per second, it offers a processing speed of 0.01 seconds, combined with commission rates of as low as $0.001. The extended functionality of Credits’ smart contracts makes it possible to set cycles and create schedules.



The ultimate guide to finding and killing spyware and stalkerware on your smartphone

Our digital selves, more and more, are becoming part of our full identity. The emails we send, the conversations we have over social media -- both private and public -- as well as photos we share, the videos we watch, and the websites we visit all contribute to our digital forms. ... As mobile devices are now a common tool for social interactions, it is not just ad agencies, data miners, and surveillance-hungry powers that want to keep track of us. When a government agency, country, or cybercriminals decide to peek into our digital lives, there are generally ways to prevent them from doing so. Virtual private networks (VPNs), end-to-end encryption and using browsers that do not track user activity are all common methods. Sometimes, however, surveillance is more difficult to detect -- and closer to home. This guide will run through what spyware is, what the warning signs of infection are, and how to remove such pestilence from your mobile devices.


The How-To: Improving Customer Satisfaction With Digital Leadership

Hiring the right people and giving them a platform doesn’t automatically make them customer service champions. What does your business think about the digital revolution? What ideas do you have in your company on how to satisfy your customers with digital platforms? Do you have a guideline for every department on how they can use digital tools to improve their service to customers? Having a digital culture shows your employees that your business is serious about implementing digital innovations in serving customers. Zappos uses the 'WOW Philosophy' to go the extra mile in ensuring that their customers are happy. The company doesn’t want to be known as the “shoe company” but as “a great customer service company.” Zappos created a fun working environment, which leads to workers having fun at and with their tasks, and this rubs off on the customer. A solid digital culture is also a way of showing your customers that you’re trying your best as a business to improve customer service.


Industrial Networking Enabling IIoT Communication

Industrial networking is different from networking for the enterprise or networking for consumers. First there is the convergence of Information Technology (IT) and Operational Technology (OT). Important networking considerations include whether to use wired or wireless, how to support mobility (e.g. vehicles, equipment, robots and workers) and how to reconfigure components. Other factors include the lifecycle of the deployments, physical environmental conditions such as those found in mining and agriculture and electromagnetic conditions where interference from machines and equipment can be a problem. Then we have power. Will it be available? Or will devices need to run on local power such as batteries? Second there are the technical requirements. They include network latency and jitter, throughput needs, reliability and availability. The requirements can vary from being relaxed to highly demanding. The network must meet the end-to-end performance requirements for applications deployed both at the edge and in the cloud. 


Design Thinking: Future-proof Yourself from AI


Integrating design thinking and machine learning can give you “super powers” that future-proof whatever career you decide to pursue. To meld these two disciplines together, one must: Understand where and how machine learning can impact your business initiatives. While you won’t need to write machine learning algorithms (though I wouldn’t be surprised given the progress in “Tableau-izing” machine learning), business leaders do need to learn how to “Think like a data scientist” in order to understand how machine learning can optimize key operational processes, reduce security and regulatory risks, and uncover new monetization opportunities; Understand how design thinking techniques, concepts and tools can create a more compelling and empathetic user experience with a “delightful” user engagement through superior insights into your customers’ usage objectives, operating environment and impediments to success. Let’s jump into the specifics about what business leaders need to know about integrating design thinking and machine learning in order to provide lifetime job security (my career adviser bill will be in the mail)!


The Pentagon is investing $2 billion into artificial intelligence

Artificial intelligence, which lets machines perform tasks traditionally done by humans, is a trendy topic in technology and business circles. For example, Google recently delighted and alarmed observers when it showed how an AI system could call a restaurant and book a reservation while sounding entirely human. Breakthroughs in the last decade have inspired companies to recruit top AI talent away from academia. Machines are now much more accurate at recognizing speech, understanding images and processing words, leading to products such as Amazon's Alexa, Apple's Siri, and Waymo's self-driving vans. The country's biggest and most innovative companies rely on it to stay ahead of competitors. Waymo's autonomous vehicles have driven more than 9 million miles on US roads thanks to artificial intelligence. National governments, such as Canada, China, India and France, are prioritizing AI now too. They view artificial intelligence as essential to growing their economies in the 21st century. Most notably, China has said it wants to be the global leader by 2030.


How corporations and startups can more effectively work with one another

If the corporate posture of the past around innovation could be described as “not invented here” with a strong bias toward building internally, today’s corporate posture leans in a much different direction, with many thinking about how to disrupt themselves before an external party beats them to it. Not surprisingly, this has created more corporate and startup partnerships. While getting this type of collaboration right is beneficial for both parties, if you speak to most startups selling into large enterprises or corporate executives looking to partner with startups today, you will find many justifiable frustrations on both sides. As the vice president of Business Development at RRE Ventures, an early-stage venture capital firm based in New York, a major part of my role is leading our business development initiatives, where we enable collaboration between corporations and startups. Before this role, I spent time on the corporate side and on the startup side, so I’ve gotten to see this dynamic from both angles throughout my career.


Decentralisation: the next big step for the world wide web

If it is done right, say enthusiasts, either you won’t notice or it will be better. One thing that is likely to change is that you will pay for more stuff directly – think micropayments based on cryptocurrency – because the business model of advertising to us based on our data won’t work well in the DWeb. Want to listen to songs someone has recorded and put on a decentralised website? Drop a coin in the cryptocurrency box in exchange for a decryption key and you can listen. Another difference is that most passwords could disappear. One of the first things you will need to use the DWeb is your own unique, secure identity, says Blockstack’s Ali. You will have one really long and unrecoverable password known only to you but which works everywhere on the DWeb and with which you will be able to connect to any decentralised app. Lose your unique password, though, and you lose access to everything. ... The decentralised web isn’t quite here yet. But there are apps and programs built on the decentralised model.


In the IoT world, general-purpose databases can’t cut it

We live in an age of instrumentation, where everything that can be measured is being measured so that it can be analyzed and acted upon, preferably in real time or near real time. This instrumentation and measurement process is happening in both the physical world and the virtual world of IT. For example, in the physical world, a solar energy company has instrumented all its solar panels to provide remote monitoring and battery management. Usage information is collected from a customer's panels and sent via mobile networks to a database in the cloud. The data is analyzed, and the resulting information is used to configure and adapt each customer's system to extend the life of the battery and control the product. If an abnormality or problem is detected, an alert can be sent to a service agent to mitigate the problem before it worsens. Thus, proactive customer service is enabled based on real-time data coming from the solar energy system at a customer's installation.
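The alerting step described above reduces to a threshold check over incoming readings. A minimal sketch; the field names and voltage threshold are illustrative, not the solar company's actual schema:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    panel_id: str
    battery_voltage: float  # volts

# Hypothetical threshold for illustration only.
LOW_VOLTAGE = 11.5

def alerts_for(readings: list[Reading]) -> list[str]:
    # Flag panels whose battery voltage has dropped below the threshold
    # so a service agent can act before the problem worsens.
    return [r.panel_id for r in readings if r.battery_voltage < LOW_VOLTAGE]

batch = [Reading("p1", 12.6), Reading("p2", 11.1), Reading("p3", 12.8)]
assert alerts_for(batch) == ["p2"]
```

A time-series database would run this kind of check continuously over the ingest stream rather than per batch, which is the workload general-purpose databases struggle with.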


Don’t Rely on the Cloud Provider to Protect Your Data

Unfortunately, many organizations that rely on cloud services for mission-critical applications assume that there's no need to protect the data and apps that live there. The survey above sheds light on this issue, showing that most IT organizations are either using just scripts (25 percent) or nothing at all (23 percent) to back up their AWS data. And when it comes to disaster recovery for AWS, only 19 percent have plans in place. Thirty-six percent are working on a plan, but an alarming 45 percent have no DR plan at all. Those who are relying on their cloud provider to protect their data are making a huge mistake. Because the big cloud services, such as Amazon EC2 and S3, are exceptionally durable and redundant, these organizations assume that the cloud's architecture mitigates any risk of downtime from a failure, error, outage or security attack. This is wrong. The end-user license agreements of AWS and most large public clouds put final responsibility for data on the customer. While the "cloud" itself is secured by AWS, everything within that cloud is the customer's responsibility.
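A first step toward a real DR plan is simply knowing whether your most recent backup still satisfies your recovery point objective (RPO). A minimal sketch with illustrative timestamps and a hypothetical helper name:

```python
from datetime import datetime, timedelta

def dr_gap(last_backup: datetime, now: datetime, rpo: timedelta) -> bool:
    # True if the most recent backup is older than the recovery point
    # objective, i.e. potential data loss exceeds policy.
    return now - last_backup > rpo

now = datetime(2018, 9, 11, 12, 0)
assert dr_gap(datetime(2018, 9, 10, 0, 0), now, timedelta(hours=24)) is True
assert dr_gap(datetime(2018, 9, 11, 6, 0), now, timedelta(hours=24)) is False
```

In practice the `last_backup` timestamp would come from your backup tooling or cloud API; the point is that the customer, not the provider, has to run this check.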



Quote for the day:


"Leadership in the past was a model of direction and control. Now it should help people set directions for the future and facilitate their delivery." -- John Bailey


Daily Tech Digest - September 08, 2018

Why the cloud is the data, and the data is the cloud?


First, we have the use of easy-to-provision and auto-scaling virtual machines that provide a platform for widely distributed "share nothing" database operations. This provides a divide-and-conquer approach to gathering data from both structured and unstructured sources. It's the "secret sauce" behind the newfound "Hadoop-y" speed that was really not there in the world of traditional relational databases. Second, we have the ability to deliver data using data services that combine behavior and information. This places the database operations behind a well-defined API or service. For the most part, these are simple services that act very much like a traditional database query, or just produce data as requested from a single data source. However, these cloud-based data services, or cloud APIs, are becoming complex. They can mash up data from multiple sources and externalize that data using a single interface. Thus, you may be able to ask a single question about the existing state of the company and have a service that considers data in hundreds, perhaps thousands of databases, using up-to-date operational data, to come back with a single meaningful answer.
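A data service that mashes up several sources behind one interface can be sketched as follows. The in-memory dictionaries stand in for real operational databases, and the function and field names are invented for illustration:

```python
# Hypothetical in-memory "databases"; a real service would query live stores.
ORDERS_DB = {"acme": 120}
RETURNS_DB = {"acme": 7}
SUPPORT_DB = {"acme": 3}

def customer_summary(customer: str) -> dict:
    # A single API call that mashes up several sources and
    # externalizes them as one answer.
    return {
        "customer": customer,
        "orders": ORDERS_DB.get(customer, 0),
        "returns": RETURNS_DB.get(customer, 0),
        "open_tickets": SUPPORT_DB.get(customer, 0),
    }

assert customer_summary("acme") == {
    "customer": "acme", "orders": 120, "returns": 7, "open_tickets": 3,
}
```

The caller asks one question and never learns, or cares, how many underlying databases were consulted.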



Proactive approach to defending computer systems

"The concept of MTD has been introduced with the aim of increasing the adversary's confusion or uncertainty by dynamically changing the attack surface, which consists of the reachable and exploitable vulnerabilities," Cho said. "MTD can lead to making the adversary's intelligence gained from previous monitoring no longer useful and accordingly results in poor attack decisions." The basic idea as it applies to IP addresses on computer networks is this: Change the IP address of the computer frequently enough so the attacker loses sight of where his victim is; however, this can be expensive, so the approach taken by the researchers in the collaboration here uses something known as software-defined networking. This lets computers keep their real IP addresses fixed, but masks them from the rest of the internet with virtual IP addresses that are frequently changing. Moore added that as the adage suggests, it is harder to hit a moving target. "MTD increases uncertainty and confuses the adversary, as time is no longer an advantage," Moore said.


Key Algorithms and Statistical Models for Aspiring Data Scientists

As a data scientist who has been in the profession for several years now, I am often approached for career advice or guidance in course selection related to machine learning by students and career switchers on LinkedIn and Quora. Some questions revolve around educational paths and program selection, but many questions focus on what sort of algorithms or models are common in data science today. With a glut of algorithms from which to choose, it’s hard to know where to start. Courses may include algorithms that aren’t typically used in industry today, and courses may exclude very useful methods that aren’t trending at the moment. Software-based programs may exclude important statistical concepts, and mathematically-based programs may skip over some of the key topics in algorithm design. ... Because machine learning is a branch of statistics, machine learning algorithms technically fall under statistical knowledge, as well as data mining and more computer-science-based methods.


Windows 10 Enterprise customers will now get Linux-like support

Effective this month, for enterprise customers willing to pay the Enterprise edition premium, Microsoft is granting an extra year's support. The new changes are designed to encourage slow-moving enterprises to pick up the upgrade tempo for hundreds of millions of Windows 7 PCs, before that older OS reaches its retirement date in less than 500 days. Today's announcements are the latest twist in a series of changes and extensions in the three years since Windows 10's initial release in 2015. In November 2017, Microsoft extended support for version 1511 by six months, to April 2018. (The blog post announcing that change is no longer online.) Then, in February 2018, Microsoft announced similar six-month "servicing extensions for Windows 10," but this time with a noteworthy gotcha: The new, 24-month support lifecycle applied only to Enterprise and Education editions. If your organization has devices running Windows 10 Pro, they need to be updated every 18 months or sooner.


AWS vs. Azure: Users Share Their Experiences

What makes a user switch to AWS or Azure from their previous approach to handling a workload? The answers reveal what's working well with these two cloud solutions. For example, it_user396519, another Amazon Redshift user, had been using an on-premise MySQL data warehouse. He switched to AWS "to reduce the cost and improve scalability." Or, consider itmanage402807, a SQL Azure user who moved to the Microsoft cloud after evaluating databases like PostgreSQL. He explained, "We decided to switch because our .NET application works well with Microsoft solutions." Adi L., an AWS DynamoDB user at a healthcare company, chose AWS when he was also faced with the option of continuing with PostgreSQL. He noted, "We switched to DynamoDB for the scalability and ease of deployment and operation." Wagner S., an Amazon EC2 user, switched from the Google Cloud Platform "because Amazon AWS offers more services and a lot more settings."


Phishing alert: North Korea's hacking attacks show your email is still the weakest link

The North Korean group accused of some of the biggest cyber crimes ever conducted may have harnessed some highly sophisticated technologies, but their ability to break into computer networks worldwide often relied on nothing more than a bogus email. The US Department of Justice has formally charged a North Korean programmer for his part in some of the largest cyber-attacks in recent years, conducted by a group backed by the North Korean government. The 172-page criminal complaint published by the US Department of Justice provides an unprecedented insight into the workings of one of the most notorious hacking groups on the planet, but also shows how their most successful attacks were at least in part down to a blizzard of fake -- phishing -- emails. The group's activities allegedly include the devastating attack on Sony Pictures Entertainment in November 2014. The group launched their attack on the company in response to the movie The Interview, a comedy that depicted the assassination of North Korea's leader.


How the Equifax hack happened, and what still needs to be done


Equifax as a company hasn't faced many consequences. In January, Democratic senators proposed a law that would require credit-reporting agencies to protect the data they've amassed and pay a fine if they're hacked. The bill never went anywhere. "One year after they publicly revealed the massive 2017 breach, Equifax and other big credit reporting agencies keep profiting off a business model that rewards their failure to protect personal information -- and the Trump Administration and the Republican-controlled Congress have done nothing," Sen. Elizabeth Warren, a Democrat from Massachusetts, said in a statement. Warren isn't alone. At a House Energy and Commerce Committee hearing on Wednesday, where the focus was on Twitter and its CEO, Jack Dorsey, Rep. Ben Lujan pivoted his attention to Equifax. "We've not done anything as well for the 148 million people that were impacted by Equifax," said Lujan, a Democrat from New Mexico. "I think we should use this committee's time to make a difference in the lives of the American people and live up to the commitments that this committee has made: provide protections for our consumers."


Swaying the C-Suite: Proving the ROI of a sound security strategy

When you think about it, it does make sense that the C-suite and security or operations teams don’t speak the same language. Senior leaders are often tasked with cutting the fat. At the same time, organizations struggle to quantify the value of cybersecurity investments. It's important for IT and security leaders to note that true ROI comes from defending the organization against material impact, before it happens. Begin by proving your position with numbers. For example, cybercrime is estimated to cost approximately $6 trillion per year on average through 2021. As such, smart security spend pays for itself in cost savings, reputation protection and more, given the direct connection between loss prevention and a company's bottom line. We're facing a reality in which organizations understand they need to care about security, but to really get executive buy-in, the security team still needs to prove ROI — the right kind of ROI — and present a clear implementation plan. After providing facts and figures, a security roadmap can help highlight the tactical actions needed to sway the C-suite to commit and spend.
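One widely used way to "prove your position with numbers" is the classic return-on-security-investment (ROSI) calculation built on annualized loss expectancy. The dollar figures below are purely illustrative:

```python
def rosi(ale_before: float, ale_after: float, cost: float) -> float:
    # Return on security investment: the risk reduction a control
    # delivers, net of the control's cost, relative to that cost.
    return (ale_before - ale_after - cost) / cost

# Illustrative figures: a control cutting expected annual loss
# from $500k to $100k at an annual cost of $150k.
assert round(rosi(500_000, 100_000, 150_000), 2) == 1.67
```

A positive ROSI frames security spend as loss prevention tied to the bottom line, which is the language the C-suite responds to.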


The simple fix so your cloud costs don’t spin out of control

There is an easy fix, and it's called cloud cost management or cloud usage management. It comprises the processes, approaches, and tools that let you keep cost in check—and, most important, keep those costs predictable. These are cloud cost governance tools to monitor usage and the associated costs. They do so by workload, by user, by department, or by any other way you want to slice it. These tools not only let you see who's using what and when, and how much it costs, but do chargebacks and showbacks to make sure that the right budgets are funding the cloud usage. Perhaps the most important aspect of this technology is that you can set predetermined limits. This includes setting usage parameters such as not provisioning the most expensive instances of storage all the time and making sure that budget restrictions are adhered to. Ironically, even enterprises that are the most controlling when it comes to costs tend to treat cloud costs as unknowable and simply accept whatever bill rolls in. No one knows what the bill will be, and no one expects to know.
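Chargeback and predetermined limits reduce to straightforward bookkeeping. A minimal sketch with hypothetical usage records of the kind a cost governance tool collects:

```python
# Hypothetical tagged usage records, as a governance tool might gather them.
usage = [
    {"dept": "data-science", "cost": 4200.0},
    {"dept": "web", "cost": 900.0},
    {"dept": "data-science", "cost": 1100.0},
]

def chargeback(records: list[dict]) -> dict[str, float]:
    # Roll costs up by department so the right budget funds the usage.
    totals: dict[str, float] = {}
    for r in records:
        totals[r["dept"]] = totals.get(r["dept"], 0.0) + r["cost"]
    return totals

def over_budget(totals: dict[str, float], limits: dict[str, float]) -> list[str]:
    # Departments that exceeded their predetermined limit.
    return [d for d, c in totals.items() if c > limits.get(d, float("inf"))]

totals = chargeback(usage)
assert totals == {"data-science": 5300.0, "web": 900.0}
assert over_budget(totals, {"data-science": 5000.0, "web": 1000.0}) == ["data-science"]
```

Real tools add alerting and hard provisioning caps on top, but the core is exactly this kind of roll-up against agreed limits.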


Facebook, Twitter Defend Fight Against Influence Operations

Numerous technology firms have disputed the notion that there is any political bias in their algorithms. "These accusations are not borne out by data and facts, and they have been widely discredited by major news organizations and experts," a coalition of technology industry groups said in a letter to the committee. But some industry watchers have suggested that while they see no political bias, private social media firms should be more transparent about how their algorithms work as well as their content management policies. "Charges of left-leaning bias are not new, of course," says Tarleton Gillespie, a principal researcher at Microsoft Research and an affiliated associate professor at Cornell University, on TechDirt. "They come from a very old playbook conservatives have used against newspapers and broadcasters for decades. Unfortunately, Silicon Valley is partly to blame for why it is working so well today. Search engines and social media platforms have been too secretive about how their algorithms work and too secretive about how content moderation works."



Quote for the day:


"Leadership is particularly necessary to ensure ready acceptance of the unfamiliar and that which is contrary to tradition." -- Cyril Falls


Daily Tech Digest - September 06, 2018

IBM researchers build AI-powered prototype to help small farmers test soil

The AgroPad is a paper device about the size of a business card. It has a microfluidics chip inside that can perform a chemical analysis of a water or soil sample in less than 10 seconds. A farmer simply puts his sample on one side of the card, and on the other side, a set of circles provides colorimetric test results. Using a dedicated smartphone app, the farmer can receive immediate, precise results. The app uses machine vision to translate the color composition and intensity into chemical concentrations, with results more reliable than those that rely on human vision. The current prototype measures pH, nitrogen dioxide, aluminum, magnesium and chlorine, though the research team is working on extending the library of chemical indicators. AgroPads could be personalized based on the needs of the individual farmer. Once the test results are in, the data can be streamed to the cloud and labeled with a digital tag to mark the time and location of the analysis. Results for millions of individual tests could be stored.



Microchip 'god mode' flaw: Is it time to rethink security?


This particular vulnerability might be described as a type of hardware backdoor, in which undocumented CPU instructions can take a process from an operating system's Ring 3, the least privileged level of access to resources, directly to Ring 0, the most privileged level of access to resources. Ring 3 is where applications run, and keeping them there keeps them from tinkering with the data or code that other applications use. Ring 0 is reserved for the operating system itself, which manages the resources that all running processes can access. An application needs to be in Ring 0 to enable this backdoor, but Domas found that some systems seem to have been shipped with the backdoor already enabled. Software running in Ring 0 can potentially bypass any security mechanism of other processes. If a process uses a password or cryptographic key, another process running in Ring 0 may also be able to get that password or key, thus virtually eliminating the security it provides.
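The ring model can be illustrated with a toy access-control sketch. This models only the idea that ring 0 bypasses per-process isolation; it is not real CPU semantics, and all names are invented:

```python
# Toy model: each request runs at a privilege ring, and a process's
# secret is readable only by its owner or by code running at ring 0.
SECRETS = {"browser": "session-key", "vault": "master-key"}

def read_secret(requester: str, ring: int, owner: str) -> str:
    if ring == 0 or requester == owner:
        return SECRETS[owner]
    raise PermissionError("ring 3 process cannot read another process's secret")

assert read_secret("vault", 3, "vault") == "master-key"        # owner access
assert read_secret("debugger", 0, "browser") == "session-key"  # ring 0 bypasses
try:
    read_secret("malware", 3, "vault")
    raise AssertionError("should have been denied")
except PermissionError:
    pass
```

The backdoor Domas found is, in these terms, an undocumented path that lets a ring 3 caller make ring 0 requests.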


How blockchain technology could aid key data challenges


The current model for storing data keeps that data in one place. For example, a Microsoft Word document is saved to a desktop. While that document may be accessed through a server, even remotely, it is still saved in a single, centralized location. Blockchain is, arguably, the exact opposite. Data is stored in "blocks," and the blockchain, in its entirety, is an encrypted ledger that is replicated throughout the database. The data is decentralized through this replication: it is not saved to a single place, but instead exists across all blocks in the network. Even though the ledger exists in a public space, a private key is required to access a specific block, which enables data to be distributed but not copied. This manner of data storage protects information from ransomware and hacking attacks by requiring a hacker to simultaneously breach and affect every copy of the ledger in order to do any damage, as opposed to corrupting or stealing just one document in the current centralized model of data storage.
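The tamper-evidence property comes from each block committing to the hash of its predecessor, so altering any earlier block breaks every link after it. A minimal hash-chain sketch (replication across nodes, encryption, and consensus are omitted):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain: list[dict], data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "data": data})

def verify(chain: list[dict]) -> bool:
    # Each block commits to its predecessor's hash, so changing any
    # earlier block invalidates every link that follows it.
    return all(
        chain[i]["prev"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list[dict] = []
append_block(chain, "record A")
append_block(chain, "record B")
append_block(chain, "record C")
assert verify(chain)

chain[0]["data"] = "tampered"   # an attacker alters one block...
assert not verify(chain)        # ...and the whole chain fails verification
```

Combined with replication, an attacker would have to rewrite the chain on every node at once, which is the defensive property the article describes.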


Is a developer career right for you? 10 questions to ask yourself

Developers are among the most in-demand tech professionals in the workforce, with high salaries offered to those with the right skill sets. While learning to code and breaking into a new career may seem daunting, the high number of open jobs and training opportunities could make development a great option for many people. "A lot of developers suffer with imposter syndrome and feeling like they don't have enough knowledge or experience to apply for a developer position," said Cristina Blanchard, a front end web developer at Brew Agency. "The truth is, if you have a solid knowledge and understanding of the most basic, core concepts of development, you can learn just about anything with the right training and a little tenacity. Don't be afraid to apply for a position you feel you may be under-qualified for, because you never know who may be willing to train you or help you get the experience you need."


Government projects watchdog recommends terminating Gov.uk Verify identity project


Sources suggest that GDS hopes to make a case to continue with Verify. Just this week, it announced three further digital services using Verify had reached the “private beta” testing stage, although none of the services have a launch date. ... GDS is also understood to be making a case that Verify remains essential to the ongoing roll-out of Universal Credit, the government’s new benefits system. But even there, the Department for Work and Pensions has had to develop an additional identity system after finding that hundreds of thousands of benefits applicants could be unable to register successfully on Verify. ... There are also question marks over the commitment of the IDPs – also known as certified companies. A report by McKinsey for the Cabinet Office showed that more than 80% of users chose two of the seven IDPs – Experian and the Post Office – leaving Digidentity, Royal Mail, Barclays, Citizen Safe and Secure Identity to pick up the remainder between them.


Designing a Usable, Flexible, Long-Lasting API

Most APIs aren't truly REST APIs, so if you choose to build a RESTful API, do you understand the constraints of REST including hypermedia/HATEOAS? If you choose to build a partial REST or REST-like API, do you know why you are choosing to not follow certain constraints of REST? Depending on what your API needs to be able to do and where your API will be used, legacy formats such as SOAP may make sense. However, with each format comes a tradeoff in terms of usability, flexibility, and development costs. Finally, as we start to plan our API, it's important to understand how our users will interact with the API and how they'll use it in conjunction with other services. Be sure to use tools like RAML or Swagger/OAI during this process to involve your users, provide mock APIs for them to interact with, and to ensure your design is consistent and meets their needs. As you design your API, it's also important to remember that you're laying a foundation to build upon at a later time.
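The hypermedia/HATEOAS constraint mentioned above means each response carries links describing the valid next actions. A sketch with illustrative link relations and paths, not a specific framework's API:

```python
# A HATEOAS-style representation: the response tells the client what
# it can do next, so affordances change with resource state.
def order_representation(order_id: str, status: str) -> dict:
    links = [{"rel": "self", "href": f"/orders/{order_id}"}]
    if status == "pending":
        links.append({"rel": "cancel", "href": f"/orders/{order_id}/cancel"})
    return {"id": order_id, "status": status, "_links": links}

rep = order_representation("42", "pending")
assert {"rel": "cancel", "href": "/orders/42/cancel"} in rep["_links"]
# Once shipped, the cancel affordance disappears from the representation.
shipped = order_representation("42", "shipped")
assert all(link["rel"] != "cancel" for link in shipped["_links"])
```

An API that returns bare data without such links can still be useful, but it is "REST-like" rather than REST in the full sense, which is exactly the trade-off worth making deliberately.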


How to Cultivate Security Champions at the Workplace

Some things you consider simple are things that can make a big impact on people. Think even smaller, visiting with people one-on-one as time and events present themselves. One last note: there is no better time than an incident debrief to educate users one-on-one or in a group. The point is to get people's attention. Show them why security is important. Show how easy it really can be for malicious actors to wreak havoc in your environment. Show how they can have a direct impact in helping to prevent that. A few people will take it to heart and develop a security mindset. Many people in information security are problem solvers. Approach it that way. Demonstrate to them how a malicious actor could easily attack your AD / Kerberos infrastructure. See how many ask what can be done to mitigate it. Instead of answering, ask them what they would do, what they can think of. Make it a problem for them to solve. Just keep your audience in mind. What will entice one audience, say demonstrating the intricacies of Kerberoasting to your server administrators, will be lost on business partners.


Cloud computing: Three reasons why it could be time to go cloud-first

"There's data governance questions and there might be clients that don't allow us to pass their data to the cloud -- and that's why there might be a situation why we can't go on demand. But our default option will always be to take the cloud option first if we can. And that approach will be multi-cloud where we'll use a range of providers." Kay says he doesn't believe the cloud is a one-size-fits-all situation right now. He says there's still a bit of an arms race taking place and that different providers have different strengths. "Some are better at doing things better than others. And we, therefore, want to be able to take advantage of those capabilities," says Kay. "So, we will not be dogmatic and push everything to a single provider. We're trying all of them -- at the SaaS level, we're using Salesforce, Microsoft Dynamics 365 and Workday. When it comes to IaaS and PaaS, we're using AWS, Azure and Google. That's a deliberate strategy. We have a view on which provider is stronger for a particular set of characteristics."


Four Ways to Take Charge in Your First Agile Project


Creating an environment of psychological safety is imperative for a high-performing team. Google conducted a two-year-long study on team collaboration and found that when individuals felt that they could share their honest opinions without fear of backlash, they performed far better. When employees feel that their opinions matter, their engagement levels peak and, according to Gallup's research, their productivity increases by an average of 12%. But unfortunately, this doesn't always happen, especially when teams are new to Agile and Scrum. The Agile Report found that one of the top challenges reported while adopting an Agile approach was its alignment to cultural philosophy, along with lack of leadership support and troubles with collaboration. All of these issues are related to people's personalities, including their strong points and areas of weakness. If a strong leader is put in place and a strategy is designed to work to each member's strength, many of these problems can diminish.


Ransomware Recovery: Don't Make Matters Worse

"Trying to decide whether to pay ransom or not is never an easy decision - the best answer is 'no, never' but that's not always a decision you can make," says former healthcare CIO David Finn, executive vice president of the security consultancy CynergisTek. "You will rarely negotiate from a position of strength with a hacker - or any criminal, for that matter. Having a well thought out plan would have helped, and certainly being able to restore the data yourself, without 'buying' decryption might have avoided the entire nasty event." An organization that chooses to pay attackers to unlock data "should apply the decryption key itself with whatever instruction the criminals can provide - instead of sending a file to the ransomware perpetrators to decrypt as evidence the key works," suggests Keith Fricke, principal consultant at tw-Security. "If possible, it is a good practice to have a third-party vendor pay for the decryption key on behalf of the [organization]," he adds.



Quote for the day:


"Give whatever you are doing and whoever you are with the gift of your attention." -- Jim Rohn