Daily Tech Digest - July 16, 2018

Echo Dot and Home Mini
The most pronounced difference between the two speakers is in their respective digital assistants: Amazon Alexa and Google Assistant. Note that both are constantly evolving and adding new features and capabilities, so any comparison is based on a snapshot in time. That said, tests of both products by TechHive and other tech publications generally agree: Amazon Alexa excels as a tool for ordering stuff, while Google Assistant wins out when it comes to general search and information requests. Both platforms are pretty good when it comes to controlling other smart home devices and systems, although Amazon was more aggressive early on when it came to working with third-party developers. Google, however, has come a long way on that front. So if you envisage your primary use to be adding items to your Amazon shopping cart—“Alexa, reorder coffee”—then Alexa is the way to go. If you want to use it less for shopping and more for information—“Hey Google, how long will it take for me to get to Sacramento by train?”—then Google might have the edge (Alexa responded to that query with driving time). If you’re looking to control the other devices in your home, check to see which platform is most compatible with what you have. More on that in a bit.



Are security professionals moving fast enough?

Of course, it is difficult for security professionals to separate the wheat from the chaff when it comes to machine learning and AI. Unfortunately, many vendors simply slap “AI” onto their messaging, but if you scratch beneath the surface, it’s nothing more than words. This makes it harder for organisations to know whether what they are being promised is true, and it can lead to much cynicism, which is perhaps one reason why so few businesses are investing in these technologies. It’s more important now than ever before that enterprises shift from the manual into the automated world and harness technologies that can carry out some of this heavy lifting. Regulation such as the GDPR, with its stipulated reporting timelines, almost makes this an imperative. The job of a technology team has become much harder with the growing number of cyber threats and how rapidly they are evolving. So if there are ways to save time on other jobs, security teams should surely be grasping them with both hands. Now it’s time for security professionals to pick up the pace.


4 essential questions to audit your agile process

You check your blood pressure, tire pressure, and your stock prices. But when was the last time you audited your agile? Even experienced agilists can fall into bad habits, and it’s important to catch them early. That’s why I recommend auditing your agile process every six months. It might sound daunting, but everything you need boils down to four questions you can fit on a standard 3x5 notecard. During your next retrospective, ask your team to answer each of the following questions with a five-star rating scale. Five stars means you have a superawesome process, and one star essentially means you have no process or it’s really poor. Of course, most scores will be in between, but they will help your team focus on improving the weakest points. ... Each person on the team is responsible for calling out where stories lack clarity. Poorly constructed stories result in churn and wasted time. Developers, quality engineers, and product owners must agree on definition, business value, requirements, and internal dependencies. Otherwise, you’ll find bugs, pushbacks, and failure to sign off on a completed story.
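The tallying step can be sketched in a few lines. The question names and scores below are invented for illustration; the four real questions come from your own audit card:

```python
from statistics import mean

# Hypothetical ratings: each team member scores the four audit
# questions from 1 (no process) to 5 (superawesome process).
ratings = {
    "Story clarity":         [4, 3, 5, 4],
    "Sprint predictability": [2, 3, 2, 3],
    "Retro follow-through":  [5, 4, 4, 5],
    "Definition of done":    [3, 3, 4, 3],
}

averages = {question: mean(scores) for question, scores in ratings.items()}
weakest = min(averages, key=averages.get)

# Print from weakest to strongest so the team sees the focus area first.
for question, avg in sorted(averages.items(), key=lambda kv: kv[1]):
    print(f"{question}: {avg:.2f} stars")
print(f"Focus area until the next audit: {weakest}")
```

Most scores will land between one and five stars, and the lowest average is the point your team should work on first.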


What Is A Net Promoter Score (NPS)?

Once you gather the survey data, your company’s NPS is determined by subtracting the percentage of detractors from the percentage of promoters, while passives count towards the total number of respondents. ... It’s easy enough to calculate your organization’s NPS manually, but if you want to outsource the process, there are third-party services that will help you send out surveys and determine your score. ... A good net promoter score is technically anything above zero, which means you have more promoters than detractors. The worst score you can get is a -100, which means you do not have a single promoter and that all your customers are detractors – vice versa for a score of 100. A score of 50 or more is considered excellent. ... The result of NPS is a straightforward metric that companies can use to gauge customer loyalty and the health of the company’s brand. It’s just one question, but it’s an important metric for helping businesses understand where they stand in the market and determine whether their effort is better spent on maintaining customers' satisfaction or if it’s time to try winning back unhappy customers.
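A minimal sketch of the arithmetic described above, with invented survey responses:

```python
def nps(scores):
    """Net Promoter Score from 0-10 survey responses.

    Promoters score 9-10, detractors 0-6; passives (7-8) count
    only toward the total number of respondents.
    """
    total = len(scores)
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / total)

# 5 promoters, 3 passives, 2 detractors out of 10 responses:
# 50% promoters - 20% detractors = NPS of 30.
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 4, 6]))  # → 30
```

Note how the passives lower the score without counting against it: they shrink both percentages by inflating the denominator.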


Microsoft Teams free version takes on Slack, Cisco Webex

With the free Teams product, Microsoft is telling its largest rivals -- Cisco and Slack -- that the company is in the market "to win it -- or at least significantly disrupt it," Kurtzman said. However, the competitors have advantages. Slack has more than 1,500 third-party app integrations, and Cisco's Webex Teams is a video-centric collaboration platform that works well with Cisco's networking hardware and software. Microsoft is preparing for battle by simplifying its collaboration portfolio. The company has said it will replace Skype for Business Online with Teams, a move that raised concerns that Teams won't have the same telephony tools. Microsoft has tried to ease customer anxiety by rolling out Teams calling features, such as call delegation and direct routing. Call delegation lets a user receive someone else's call -- a necessary feature within enterprises. Direct routing enables companies to use their existing telephony infrastructure with Teams. However, accessing that function requires a company to have Teams and Phone System -- formerly called Cloud PBX -- as part of an Office 365 subscription.


All you need to know about the move from SHA-1 to SHA-2 encryption

SHA-2 is the cryptographic hashing standard that all software and hardware should be using now, at least for the next few years. SHA-2 is often called the SHA-2 family of hashes because it contains many different-size hashes, including 224-, 256-, 384-, and 512-bit digests. When someone says they are using the SHA-2 hash, you don’t know which bit length they are using, but the most popular one is 256 bits (by a large margin). Although SHA-2 shares some of the same math characteristics as SHA-1 and minor weaknesses have been discovered, in crypto-speak it's still considered “strong” for the foreseeable future. Without question, it's way better than SHA-1, and any critical certificates, applications, and hardware devices still using SHA-1 should be moved to SHA-2. All major web browser vendors (e.g. Microsoft, Google, Mozilla, Apple) and other relying parties have requested (and have been doing so for years) that all customers, services and products currently using SHA-1 move to SHA-2, although what has to be moved by when differs depending on the vendor.
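Python's standard `hashlib` illustrates the family: the digest size is part of the algorithm name, and SHA-1 still computes but should no longer be trusted for certificates or signatures.

```python
import hashlib

message = b"the quick brown fox"

# Each SHA-2 variant's name tells you its digest size in bits.
for algo in ("sha224", "sha256", "sha384", "sha512"):
    digest = hashlib.new(algo, message).hexdigest()
    # Each hex character encodes 4 bits.
    print(f"{algo}: {len(digest) * 4}-bit digest")

# SHA-1 remains available for legacy interop, but is deprecated
# for any security-critical use.
print("sha1:", hashlib.sha1(message).hexdigest())
```

Migrating a system to SHA-2 is typically just a matter of swapping the algorithm name, which is why the remaining obstacles are old certificates and hardware rather than the code itself.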


Quantum-secured network ‘virtually un-hackable’

Photons, as used in the quantum-key distribution work, will likely end up securing future networks and could turn out to be a crucial element of quantum computing overall. The particles of light are good for moving qubits (quantum information carriers) because they can travel distances and work with fabricated chips, explained the University of Maryland in a news article announcing what it said is a breakthrough in photon-carried quantum computing. The school said it has invented the first single-photon transistor from a semiconductor chip. Traditional transistors are the minuscule routing switches used in every form of computing. Producing a photon-based one, where the switches interact with each other, could “attain exponential speedup for certain computational problems.” Photons don’t natively interact — a prior downside. “Roughly 1 million of these new transistors could fit inside a single grain of salt. It is also fast and able to process 10 billion photonic qubits every second,” the school said. “Quantum communications technologies are starting to play a significant role in securing our data and communications," said Dr. Grégoire Ribordy.


What Is Geospatial Data and How Can It Save Your Life?

Using the in-memory database and application platform SAP HANA, SAP has developed a prototype that helps organizations analyze geospatial data and predict how storms can impact a given region. After years of collaboration with Esri, a leader in geographical information systems, the two companies announced tighter integration between SAP HANA and Esri’s “geodatabase” in January. This allows customers to analyze geographic information within their business processes and take action more easily. Previously, customers had to analyze location data separately from business applications, then combine them. As Hasso Plattner, co-founder of SAP and chairman of the Supervisory Board of SAP SE, pointed out at SAPPHIRE NOW, SAP just took spatial capabilities one step further and released them as services that can pull weather or satellite data directly from providers into the enterprise data layer. Customers can now create location-aware applications more quickly using this functionality, part of the recently-announced SAP HANA Data Management Suite.


EU Lawmakers Threaten Businesses Relying On Privacy Shield

The EU’s General Data Protection Regulation, like its predecessor the Data Protection Directive, authorizes the export of EU citizens’ personal information only to jurisdictions that provide an adequate level of privacy protection. Privacy Shield, an agreement signed by EU and U.S. officials in 2016, seeks to reconcile the different levels of legal protection afforded on each side of the Atlantic, allowing businesses to export EU citizens’ data to the U.S. for processing. The EU’s executive body, the European Commission, ruled in 2016 that the Privacy Shield deal provided adequate protection for personal information, but called for it to be reviewed annually. It’s with an eye on the next review of the agreement, in September, that Members of the European Parliament called for the deal to be suspended in a vote on July 5. The Parliament’s resolution on Privacy Shield identified several areas in which U.S. authorities had not yet met their commitments under the agreement, despite having been given a deadline of May 25, 2018. The U.S. Senate has still not ratified the appointment of three members of the Privacy and Civil Liberties Oversight Board (PCLOB), including its chairman.


5 Essentials to Achieve IT Resilience

Many cloud backup and storage solutions have appeal because they offer cloud storage and data access and restore from anywhere. However, such solutions don’t offer capabilities that allow users to totally recover applications, servers, and entire business operations in a tight timeframe. Because of this, companies require IT resilience that is affordable and effective. This means that solutions must offer automated, seamless access to your data and applications. But what do solutions need, specifically, to achieve resilience? A few key elements of technology must be present for IT Resilience and Assurance (ITRA) to be achieved. These components are anomaly detection, backup, deduplicated file system assisted replication, orchestration, and assurance. That’s a lot to pack into one sentence, so let’s break them down! ... Anomaly detection is a feature that enables users to predictively detect a risk to their systems. This capability allows users to receive an early warning if activity happening with their data could be related to ransomware or another kind of malware attack. Signs that a ransomware attack is occurring include affected files being renamed, causing them to appear to be new files when backed up.
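The mass-rename signal described above can be sketched as a toy check. The file names and threshold are invented for illustration; a real backup product would track this across snapshots:

```python
def rename_anomaly(previous_files, current_files, threshold=0.5):
    """Flag a backup run where an unusually large share of old file
    names vanished while new names appeared -- the mass-rename
    pattern typical of ransomware encrypting files in place.
    """
    previous, current = set(previous_files), set(current_files)
    if not previous:
        return False
    disappeared = previous - current
    appeared = current - previous
    churn = min(len(disappeared), len(appeared)) / len(previous)
    return churn >= threshold

before = ["report.docx", "budget.xlsx", "photo.jpg", "notes.txt"]
after_attack = ["report.docx.locked", "budget.xlsx.locked",
                "photo.jpg.locked", "notes.txt.locked"]

print(rename_anomaly(before, before))        # → False
print(rename_anomaly(before, after_attack))  # → True
```

The point of wiring this into backup is early warning: the anomaly fires on the very first snapshot after the attack starts, before the encrypted copies overwrite the good ones in the retention window.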



Quote for the day:


"No one really succeeds everyday but successful people do something everyday to help themselves succeed." -- @LeadToday


Daily Tech Digest - July 15, 2018

“Enterprise Architecture As A Service” – What?


Recent success results in organizations having to deal with big decisions on ways to invest and maintain their success. Perceived failure results in a need to make decisions to address the failures. Each of these scenarios gets attention during the strategic planning process and, as pointed out in “Enterprise Architecture as Strategy” by Jeanne W. Ross, Peter Weill, and David Robertson, Harvard Business School Press, 2006, EA is a useful tool. The bottom line is that big decisions are looming and there is a perception that EA can help by defining “the organizing logic for business processes and IT Infrastructure, reflecting the integration and standardization requirements of the company’s operating model” so that “individual projects can build capabilities – not just fulfill immediate needs”. But there is another, less positive, perception out there – EA can be a money sink! It could result in tons of paper, take years, and produce something outdated by the time it is finished, to name a few concerns. Also, the need for change often has a shorter timeline than the perceived timeline for generating an Enterprise Architecture.


HTC’s blockchain phone is real, and it’s arriving later this year

Prior to the launch, the company is partnering with the popular blockchain title, CryptoKitties. The game will be available on a small selection of the company’s handsets starting with the U12+. “This is a significant first step in creating a platform and distribution channel for creatives who make unique digital goods,” the company writes in a release tied to the news. “Mobile is the most prevalent device in the history of humankind and for digital assets and dapps to reach their potential, mobile will need to be the main point of distribution. The partnership with Cryptokitties is the beginning of a non fungible, collectible marketplace and crypto gaming app store.” The company says the partnership marks the beginning of a “platform and distribution channel for creatives who make unique digital goods.” In other words, it’s attempting to reintroduce the concept of scarcity through these decentralized apps. HTC will also be partnering with Bitmark to help accomplish this. If HTC is looking for the next mainstream play to right the ship, this is emphatically not it.


Interview: Bill Waid talks about AI/ML


What is interesting about this well-known and often referenced use of AI/ML is the potential opportunity cost. Despite the significant savings realized, the impact of declining a customer transaction that was not fraudulent leads to an even more costly unsatisfactory customer engagement and eventual attrition. To operationalize this AI/ML solution and fully realize the value, decisioning and a continuous improvement feedback loop were required. Capitalizing on the power of AI/ML, FICO has expanded both the algorithms and application of AI/ML to a broad set of solutions since 1992. Most notable is the use of ML to find predictive patterns in the ever-expanding data lakes our clients are collecting and using those ML findings to augment existing decisions and incrementally improve business outcomes. By deploying ML models in a way that the decision outcome could be managed and monitored to ensure accuracy, business owners could learn from the ML model and gain confidence that the model was indeed providing tangible improvement. This last innovation was a natural evolution toward what FICO refers to as explainable AI (xAI).


How AI will change your healthcare experience (but won’t replace your doctor)

Techniques such as machine learning enable healthcare providers to analyze large amounts of data, allowing them to do more in less time, and supporting them with diagnosis and treatment decisions. For example, suppose you feed a computer program a large number of medical images that either show or do not show symptoms of a disease. The program can then learn to recognize images that may point towards the disease. Researchers at Stanford, for instance, developed an algorithm that helps to evaluate chest X-rays to identify images with pneumonia. This doesn’t mean, however, that the radiologist will no longer be needed. Instead, think of AI as a smart assistant that will support doctors, alleviating their workload. This is also how we approach AI at Philips: we work together with clinicians to develop solutions that make their lives easier and improve the patient experience. That’s why we believe in the power of adaptive intelligence. It’s not really about AI per se – it’s about helping people with technology that adapts to their needs and extends their capabilities.
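As a heavily simplified illustration of the learn-from-labeled-images idea (not the Stanford algorithm), here is a toy nearest-centroid classifier on synthetic 16-"pixel" scans, where the labels and the disease pattern are entirely invented:

```python
import random

random.seed(0)

def make_image(diseased):
    # Hypothetical 16-pixel scans: diseased images are brighter in
    # the last four pixels, mimicking a visible opacity.
    pixels = [random.gauss(0.3, 0.1) for _ in range(16)]
    if diseased:
        for i in range(12, 16):
            pixels[i] += 0.4
    return pixels

def centroid(images):
    n = len(images)
    return [sum(img[i] for img in images) / n for i in range(16)]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

# "Training": average each labeled class into a prototype image.
healthy = centroid([make_image(False) for _ in range(200)])
sick = centroid([make_image(True) for _ in range(200)])

def predict(image):
    # Classify by whichever prototype the image is closer to.
    return "pneumonia" if distance(image, sick) < distance(image, healthy) else "clear"

print(predict(make_image(True)))   # likely "pneumonia"
print(predict(make_image(False)))  # likely "clear"
```

Real diagnostic models are deep neural networks trained on vastly larger datasets, but the shape of the workflow is the same: labeled examples in, a decision function out, with the clinician reviewing the result.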


Machine learning will redesign, not replace, work

"Any manager could take this rubric, and if they're thinking of applying machine learning this rubric should give them some guidance," he said. "There are many, many tasks that are suitable for machine learning, and most companies have really just scratched the surface." ... Since a job is just a bundle of various tasks, it's also possible to use the rubric to measure the suitability of entire occupations for machine learning. Using data from the federal Bureau of Labor Statistics, that's exactly what they did—for each of the more than 900 distinct occupations in the U.S. economy, from economists and CEOs to truck drivers and schoolteachers. "Automation technologies have historically been the key driver of increased industrial productivity. They have also disrupted employment and the wage structure systematically," the researchers write. "However, our analysis suggests that machine learning will affect very different parts of the workforce than earlier waves of automation … Machine learning technology can transform many jobs in the economy, but full automation will be less significant than the reengineering of processes and the reorganization of tasks."


Reinventing The Enterprise - Digitally

Through autonomization and emergence, self-tuning firms create significant advantages. They can better understand customers by leveraging data from their own ecosystems and platforms to develop granular insights and automatically customize their offerings. They can develop more new, marketable products by experimenting with offerings and leveraging proprietary data. And they can implement change more quickly and at lower cost by acting autonomously.  The benefits of autonomization and emergence well exceed those that can be realized from digitization programs aiming to increase efficiency or product innovation alone. They are compounded by self-reinforcing network and experience effects: better offerings attract more customers and more data; experimentation brings knowledge that increases the value of future experimentation. One example of a self-tuning organization is Alibaba. Not only does its e-commerce platform provide a sea of user data, but the company uses it to generate real-time insights in a granular manner.


Two studies show the data center is thriving instead of dying

The top reasons for such investment are security and application performance (75% of respondents) and scalability (71%). It also found that 53% of respondents intend to increase investment in software-defined storage, 52% in NAS and 42% in SSD ... IHS noted that while new technologies such as artificial intelligence and containers are gaining traction, traditional data center apps such as Microsoft Office (22%), collaboration tools such as email, SharePoint, and unified communications (18%), and general-purpose IT apps (30%) are still being used. The second survey comes from SNS Telecom & IT, a market research firm based in Dubai, UAE. It attributes the growth in big data and the subsequent massive inflow of all sorts of unstructured data as the reason for investment in IT equipment by the financial services industry. “As this Big Data construct expands to include streaming and archived data along with sensor information and transactions, the financial sector continues its steady embrace of big data analytics for high-frequency trading, fraud detection and a growing list of consumer-oriented applications,” said the authors.


Despite the security measures you've taken, hacking into your network is trivial

Closing security vulnerabilities and establishing effective cybersecurity policies and procedures is going to require more than just better technology. Effective security will demand a complete change of attitude by every employee, executive, and individual operating a computing device. Security must become the priority, even at the expense of convenience. Confirming results reported in other studies, the Positive Technologies research showed that more than a quarter of employees still inexplicably clicked a malicious link sent to them in an email. Despite extensive training and retraining, employees--regardless of industry or level of technical knowledge--continue to operate with an almost unconscious lack of security awareness. Until this cavalier attitude toward protecting company data changes, phishing attacks and authentication circumvention will continue to plague the modern enterprise.


The Economics Of AI - How Cheaper Predictions Will Change The World


Key to this, they argue, will be whether human AI “managers” can learn to differentiate between tasks involving prediction, and those where a more human touch is still essential. When I met with Joshua Gans – professor of strategic management and holder of the Jeffrey S Skoll Chair of Technical Innovation and Entrepreneurship at the University of Toronto – he gave me some insight into how economists are tackling the issues raised by AI. "As economists studying innovation and technological change, a conventional frame for trying to understand and forecast the impact of new technology would be to think about what the technology really reduces the cost of," he tells me. "And really it's an advance in statistical methods – a very big advance – and really not about intelligence at all, in a way a lot of people would understand the term ‘intelligence.' ... “When I look up at the sky and see there are grey clouds, I take that information and predict that it’s going to rain. When I’m going to catch a ball, I predict the physics of where it’s going to end up. I have to do a lot of other things to catch the ball, but one of the things I do is make that prediction.”


Creating a Defensible Security Architecture

Controls should not only face the Internet but should also be implemented to secure authorized access from internal assets to internal assets. Basic adjustments such as this allow for far superior prevention controls and, more importantly, detection controls. Think about this for a moment: If a computer in subnet or zone A attempts to talk to any system found in zone B and the connection from A is not allowed, then it will be denied, and you will be notified. Basic firewall rules aren't rocket science, but they are highly effective controls. Modern challenges also must be overcome. For instance, consider an intrusion detection/prevention device, web proxy, data loss prevention sensor, network antivirus, or any other Layer 7 network inspection solution. These are all crippled by network encryption. Your brand-new shiny NGFW may not be configured to handle 70%+ of the traffic going through it. Basically, without understanding technologies like Secure Sockets Layer (SSL) inspection, SSL decrypt mirroring, HTTP Strict Transport Security (HSTS), certificate transparency, and HTTP Public Key Pinning (HPKP), how can you handle modern encryption?
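The zone A to zone B rule described above amounts to a deny-by-default policy with alerting. A minimal sketch, with invented zone names and ports:

```python
# Deny-by-default zone policy: only explicitly allowed
# zone-to-zone flows pass; everything else is blocked and logged.
ALLOWED_FLOWS = {
    ("zone_A", "zone_B", 443),    # e.g. app servers -> API tier
    ("zone_B", "zone_DB", 5432),  # e.g. API tier -> database
}

alerts = []

def check_flow(src_zone, dst_zone, port):
    if (src_zone, dst_zone, port) in ALLOWED_FLOWS:
        return "allow"
    # Denial doubles as a detection event: someone (or something)
    # tried a path the architecture says should never be used.
    alerts.append(f"DENY {src_zone} -> {dst_zone}:{port}")
    return "deny"

print(check_flow("zone_A", "zone_B", 443))    # → allow
print(check_flow("zone_A", "zone_DB", 5432))  # → deny (and alerted)
print(alerts)
```

The value is as much in the alert list as in the block: an internal host probing a zone it has no business reaching is exactly the lateral-movement signal a defensible architecture is built to surface.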



Quote for the day:


"Technology makes it possible for people to gain control over everything, except over technology." -- John Tudor


Daily Tech Digest - July 14, 2018


To date, the tools which underpin workforces have been developed as a natural extension of traditional work flows. Email replaced the memo, and video chat made the conference call more collaborative. But emerging technologies like advanced analytics, artificial intelligence, and machine learning are primed to provide a comprehensive look into the patterns and intricacies that make up the individual workplace experience. For example, with the right platform, IT departments can better understand which channels employees prefer, what is drawing them to these channels, and how they can better optimize it for even further productivity. Alternatively, they can identify problem areas within work flows and proactively ease the strain on employees themselves. As technology becomes more advanced, the human element becomes increasingly vital. Digital transformation saw a seismic shift in the way IT leaders approach their infrastructure, but workplace transformation requires a deep understanding of the unique ways individuals approach productivity.


Entity Services Increase Complexity

Entity services are modelled after defined entities (or nouns) within a system. For example, an accounts service, order service and customer service. Typically they have CRUD-like interfaces which operate on top of these entities. By taking this CRUD-like approach, entity services tend not to contain any meaningful business functionality. Instead, they are shallow modules, not really offering any complex or useful abstractions. ... Ultimately, these shallow entity services can turn into a cluster of highly coupled components, writes Abedrabbo. This leads to an operational burden, where more components must be deployed, scaled and monitored. This high coupling can also lead to challenging release processes, where many microservices must be deployed in order to deliver a single piece of functionality. It can also produce single points of failure, where many services depend on each other, meaning that if one fails it can bring down the entire system. Abedrabbo also explains that entity services create conceptual complexity, as the knowledge of how to compose them is not immediately obvious.
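A small sketch of the contrast: a shallow CRUD entity service, versus a task-oriented service that owns a business rule. The class names, fields, and rule are invented for illustration:

```python
# A shallow "entity service": a thin CRUD wrapper with no business
# logic, so every caller must know how to compose it correctly.
class OrderEntityService:
    def __init__(self):
        self.orders = {}

    def create(self, order_id, data):
        self.orders[order_id] = data

    def read(self, order_id):
        return self.orders[order_id]

    def update(self, order_id, data):
        self.orders[order_id].update(data)

# A task-oriented alternative: the business rule lives in one place
# instead of being re-implemented by every consumer.
class OrderFulfillmentService:
    def __init__(self, entity_service):
        self.entities = entity_service

    def fulfill(self, order_id):
        order = self.entities.read(order_id)
        if order.get("paid") and order.get("in_stock"):
            self.entities.update(order_id, {"status": "shipped"})
        else:
            self.entities.update(order_id, {"status": "on_hold"})
        return self.entities.read(order_id)["status"]

entities = OrderEntityService()
entities.create("o-1", {"paid": True, "in_stock": True})
service = OrderFulfillmentService(entities)
print(service.fulfill("o-1"))  # → shipped
```

With only the CRUD service, the paid/in-stock rule would be duplicated across every caller; that duplication is the coupling and conceptual complexity the article warns about.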


An exciting time to be in cyber security innovation


There is a wide range of initiatives specifically around cyber security in the UK, says Chappell, including the Cyber Growth Partnership, which supports fast-growing security companies. “There are some great opportunities in this sector, which is partly due to our UK heritage going back to Bletchley Park,” he says. The UK also benefits from having top students from all over the world who come to further their education, a thriving financial sector and a strong defence sector. “We are lucky to have this heady mix of components that create an environment where it is great to be building a business,” says Chappell. Also, thanks to the likes of companies such as Message Labs and Sophos, the UK has useful templates or archetypes for fast-growing successful businesses that startups can draw upon, he adds. The growing number of incubators is also creating opportunities for cyber security innovators, with Lorca being the latest to join its sister centre in Cheltenham, the NCSC Cyber Accelerator, CyLon and its HutZero bootcamp for entrepreneurs.


Reddit Co-Founder Alexis Ohanian's Top Self-Care Strategies for Entrepreneurs

“Entrepreneurs have to have enough ego to think that our crazy idea, our vision for the future is going to work, before anyone else does. But [it’s important to] balance that with enough humility to know that you aren’t going to have all the answers,” Ohanian says. “You are going to need to rely on different points of view. Get the benefit of someone who is detached enough to give you honest feedback, but attached enough to know all the players and background information.” Ohanian’s feeling is, if you wouldn’t expect a talented athlete or sports team to play without their coach, why shouldn’t it be the same for a great entrepreneur? ... “One of the things founders and CEOs in particular should always be doing and keeping top of mind is celebrating those wins for their business,” Ohanian says. “It will never feel like a 100 percent win for the CEO or founder, because you’re always thinking about the 100 other things that need to get improved or fixed. But for all the people on your team, it is really vital to celebrate them and that success. Not in a way that gets people complacent, but rejuvenated and re-excited about the mission and vision.”


Why You Should Consider A Career In Cybersecurity


Cybersecurity professionals are generally among the most highly-compensated technology workers. According to the United States Department of Labor, the median annual wages for information security analysts is almost $100,000 nationally, with many jobs in various locations paying considerably higher. With the demand for cybersecurity professionals continuing to far outpace the supply, salaries are likely to continue rising. As such, investing in cybersecurity training now can pay off quite handsomely ... For multiple reasons, many companies are far less likely to let go of cybersecurity professionals than they would other employees. Shrinking the security team may increase the likelihood of a breach, and can dramatically increase the impact of a breach should one occur; think for a moment about customers’ and regulators’ reactions to news reports that “A large amount of personal data leaked after company X tried to save money by reducing its cybersecurity staff.” Of course, as alluded to before, another deterrent against letting information security professionals go is that employers know that it is often both difficult and expensive to find suitable replacements.


Let There Be Sight: How Deep Learning Is Helping the Blind ‘See’

Guide dogs are great for helping people who are blind or visually impaired navigate the world. But try getting a dog to read aloud a sign or tell you how much money is in your wallet. Seeing AI, an app developed by Microsoft AI & Research, has the answers. It essentially narrates the world for blind and low-vision users, allowing them to use their smartphones to identify everything from an object or a color to a dollar bill or a document. Since the app’s launch last year, it’s been downloaded 150,000 times and used in 5 million tasks, some of which were completed on behalf of one of the world’s most famous blind people. “Stevie Wonder uses it every day, which is pretty cool,” said Anirudh Koul, a senior data scientist with Microsoft, during a presentation at the GPU Technology Conference in San Jose last month. A live demo of the app showed just how powerful it can be. Koul had a colleague join him on stage, and when he launched the app on his smartphone and pointed it toward his co-worker, it declared that it was looking at “a 31-year-old man with black hair, wearing glasses, looking happy.”


Graphing the sensitive boundary between PII and publicly inferable insights

There is a fuzzy boundary between information that’s personally identifiable and insights about persons that are publicly inferable. GDPR and similar mandates only cover protection of discrete pieces of digital PII that are maintained in digital databases and other recordkeeping systems. But some observers seem to be arguing that it also encompasses insights that might be gained in the future about somebody through analytics on unprotected data. That’s how I’m construing David Loshin’s statement that “sexual orientation [is] covered under GDPR, too.” My pushback to Loshin’s position is to point out that it’s not terribly common for businesses or nonprofits to record people’s sexual orientation, unless an organization specifically serves one or more segments of the LGBTQ community — and even then, it’s pointless and perhaps gauche and intrusive to ask people to declare their orientation formally as a condition of membership. So it’s unlikely you’ll find businesses maintaining PII profile records stating that someone is gay, lesbian, bisexual or whatever.


Ultimate Guide To Blockchain In Insurance

Within insurance, the claims and finance functions are high-value areas where blockchain could be beneficial, especially when you look at processes that need ongoing reconciliation with external parties. Consider how often Company A has a claim against Company B resulting in the exchange of money, typically in the form of a paper check or an electronic transaction. That could be completely automated using blockchain. Presently, many insurers are applying a smart contract alongside the blockchain, which is triggered when well-defined terms and conditions are met. By setting up an insurance contract that pays out under these circumstances, an insurer can process transactions with no human intervention and greatly enhanced customer service. In other words, blockchain can help deliver on the digital opportunities that insurers must get right. These opportunities aren’t glamorous but they’re important: as I’ve said before, get them right and you won’t win—but get them wrong and you will lose. Blockchain can help insurers deliver on some brilliant basics.
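The trigger logic described above — a payout fired automatically when well-defined terms and conditions are met — can be sketched as a toy parametric rule. The field names, event types and amounts below are invented for illustration and are not any real platform's API:

```python
def settle_claim(policy: dict, event: dict) -> int:
    """Toy smart-contract rule: pay out automatically, with no human
    intervention, when the contract's conditions are met."""
    if event["type"] != policy["covered_event"]:
        return 0  # event not covered by this contract
    if event["severity"] < policy["trigger_severity"]:
        return 0  # conditions not met: below the payout trigger
    return policy["payout"]  # terms met: automatic settlement


# Hypothetical flight-delay policy: pays 200 once delay severity reaches 3.
policy = {"covered_event": "flight_delay", "trigger_severity": 3, "payout": 200}
print(settle_claim(policy, {"type": "flight_delay", "severity": 4}))  # 200
print(settle_claim(policy, {"type": "hail", "severity": 9}))          # 0
```

On a real blockchain the same rule would live in contract code and execute on-chain, which is what removes the reconciliation step between Company A and Company B.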


Preparing Your Business For The Artificial Intelligence Revolution


Artificial intelligence can be used to solve problems across the board. AI can help businesses increase sales, detect fraud, improve customer experience, automate work processes and provide predictive analysis. Industries like health care, automotive, financial services and logistics have a lot to gain from AI implementations. Artificial intelligence can help health care service providers with better tools for early diagnostics. Autonomous cars are a direct result of improvements in AI. Financial services can benefit from AI-based process automation and fraud detection. Logistics companies can use AI for better inventory and delivery management. The retail business can map consumer behavior using AI. Utilities can use smart meters and smart grids to decrease power consumption. The rise of chatbots and virtual assistants is also a result of artificial intelligence. Amazon's Alexa, Google's Home, Apple's Siri and Microsoft's Cortana are all using AI-based algorithms to make life better. These technologies will take more prominent roles in dictating future consumer behavior.


Prime Minister Of Luxembourg Xavier Bettel On Technology, Culture And People

“Current” is always a bit of a difficult word when it comes to technology, because innovative ideas or products often grow and mature in waves. Consequently, over time, new technologies experience highs during which they are heavily publicized and on everybody’s mind. They also go through lows, during which they appear to be completely forgotten. Yet, the research continues! Having said that, I am actually very fond of the world of virtual and augmented reality. Yes, the technology, or at the very least the idea and concepts of VR and AR, have been around for quite some time now. But it is truly exciting to discover all the new opportunities these technologies offer us thanks to the recent advances in computing power, be it in the medical domain, in education, in transport…they make our world better and safer! ... In order to reap the full potential of our digital economy, European rules must ultimately enable and encourage our businesses and citizens to buy and sell their services and products anywhere in the European Union.



Quote for the day:


"The problem isn't a shortage of opportunities; it's a lack of perspective." -- Tim Fargo


Daily Tech Digest - July 13, 2018

Bill Hoffman of the Industrial Internet Consortium talks AI, IoT and more
Price point, availability, wired or wireless, then the fact that you can hook them up to the Ethernet – IPv6 provides almost no degradation of performance – so you can put a lot of stuff on it, which we also couldn’t do 30 years ago. We were all running Novell local area nets at the time! Who remembers Novell? So I think the technology has become much more robust and available, such that we’re able to use the big data and apply predictive analytics, and use these things in industrial systems that we couldn’t have even dreamed of 20 years ago. And when we say “industrial” [Internet of Things], the “Industrial Internet” is really an “industry” Internet, not just manufacturing per se. “Industrial Internet” was actually a term of art GE had coined, back in, I believe, 2013, and they didn’t trademark it intentionally, because they wanted it to remain a term of art. So when Richard [Soley] and I sat around the table with the five founders [of the Industrial Internet Consortium], we had hours of discussion about what to call this new entity we were going to create.



Cryptocurrency Exchange Developer Bancor Loses $23.5 Million

Some of Bancor's losses, however, are recoverable. Bancor says it has recouped $10 million worth of BNT, a type of token that facilitates trades within its exchange. How Bancor executed that recovery, however, leads to a heated debate among cryptocurrency enthusiasts. BNT differs from bitcoin in that it is a centrally generated token. New bitcoins are created through a process called mining, in which computers that verify transactions on the network are rewarded with a slice of bitcoin. BNT, like many other cryptocurrencies such as Ripple's XRP, Cardano's Ada, Block.one's EOS and Stellar's Lumens, isn't mined. These types of coins have powered initial coin offerings (ICOs), in which an organization creates a centrally issued coin and sells it to raise funding. ICOs, which some contend could expose investors to fraud, are being closely analyzed by regulators around the world. The question is whether the coins or tokens that are issued are more like securities, akin to stocks, than assets. The sale of securities often entails a different, stricter set of trading rules.


Peer Reviews Either Sandbag or Propel Agile Development


First, peers likely provide valuable feedback and have fresh eyes to catch mistakes that you might miss after spending hours working. Second, working on a fast-moving Agile team, you need to continually build consensus so that there is not a communication backlog. Lastly, for teams working in highly-regulated industries, peer reviews may be a required piece of a larger software assurance program. As more software development teams trend toward an Agile approach, software releases are becoming more frequent. If you are not able to speed up your peer review cycles in tandem, you may start to sacrifice quality to hit deadlines. That then translates to a buildup of technical debt. How can you avoid this scenario? It takes structure, but flexible structure. ... Most teams don’t have an explicit plan around their internal communications. The tools that they employ typically dictate the communication norms. If your team adopts Slack or another messaging app, then it quickly becomes common for folks to have short, timely chats. The expectation is that the other person replies within a relatively short timeframe.


Doing Performance Testing Easily using JUnit and Maven

Sometimes, we tend to think that performance testing is not part of the development process. This is probably because no stories get created for it during the usual development sprints, which means this important aspect of a product's or service's APIs is not taken care of. But that's not the point. The point is: why do we think it should not be part of the usual development cycle? Why do we keep it towards the end of the project cycle? Adding more ground to this thinking, there are no straightforward approaches to doing performance testing the way we do unit testing, feature/component testing, e2e integration testing or consumer-contract testing. So the developers or the performance testers (sometimes a specialized team) are asked to choose a standalone tool from the marketplace, produce some fancy performance-testing reports and share them with the business or technology team. That means it is done in isolation, sometimes after or towards the end of the development sprints, approaching the production release date.
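The article's core idea — treating a latency budget as just another assertion that runs with the normal unit-test suite — can be illustrated with a minimal sketch. It is shown here in Python's unittest rather than JUnit, and the workload and budget are invented for illustration:

```python
import time
import unittest


def compute_report(n: int) -> int:
    # Hypothetical stand-in for the code under test.
    return sum(i * i for i in range(n))


class PerformanceSmokeTest(unittest.TestCase):
    BUDGET_SECONDS = 0.5  # latency budget agreed for this operation

    def test_meets_latency_budget(self):
        start = time.perf_counter()
        compute_report(100_000)
        elapsed = time.perf_counter() - start
        # Fails the build if the operation blows its budget.
        self.assertLess(elapsed, self.BUDGET_SECONDS)
```

Run with `python -m unittest`, so the performance check travels with the ordinary test suite in every sprint, the same way a JUnit test runs under `mvn test` rather than in a separate end-of-project exercise.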


Government Bodies Are At Risk Online

Commitment to Online Trust and Security
Busy government staff don’t always have the time to learn cybersecurity best practice. Government employees working in departments such as planning, finance and human resources, and the administration staff that support them, have intense workloads – so it’s important they can work quickly and efficiently, without compromising their safety online. It’s thought that as many as 95% of successful online hacks come down to human error. Mistakes are made by those who aren’t educated in online risks and can’t spot threats to their data. Sometimes it’s not a lack of knowledge, but a problem with relying solely on human performance. Even the most educated person can make mistakes that cause huge data breaches. Government organisations need to limit the risk of human error as much as possible. If it’s a case of staff reusing static or simple passwords that can be stolen using brute-force attacks, then two-factor authentication (2FA) with one-time codes can be a solution: once a code has been used, successfully or unsuccessfully, it becomes invalid.
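The one-time-code behaviour described above — each code valid at most once, whether the attempt succeeds or fails — can be sketched with the standard HOTP construction from RFC 4226. The replay tracking here is a simplified illustration, not a production design:

```python
import hashlib
import hmac
import struct


def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HMAC-based one-time password."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


class OneTimeCodeVerifier:
    """Each counter value is accepted at most once: a used code becomes
    invalid whether the attempt succeeded or failed."""

    def __init__(self, secret: bytes):
        self.secret = secret
        self.used_counters = set()

    def verify(self, counter: int, code: str) -> bool:
        if counter in self.used_counters:
            return False  # replay: this code has already been burned
        self.used_counters.add(counter)  # burn it even on a failed attempt
        return hmac.compare_digest(hotp(self.secret, counter), code)
```

Because a stolen or shoulder-surfed code is worthless after one use, brute-forcing or reusing it gains the attacker nothing, unlike a static password.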


Building the future of retail with the Internet of Transport

Wincanton wants to use sensors to automatically alert its employees to any potential deterioration in products during transportation. As part of this project, Gifford says the firm's technological efforts have produced developments in three key areas so far. He points first to Winsight, an app that enables a paperless cab, so all the paper lorry drivers normally carry, such as routes and proof of delivery, is wrapped up into a single piece of software on a smart device. The app is available to the firm's own drivers and sub-contractors. The second key element is telematics. "That's about us plugging into the vehicle's systems and sending information back to the business in a consistent way," says Gifford. Wincanton recently announced it will install MiX telematics in 1,800 of its vehicles as part of an ongoing safety programme, with information used to optimise driver performance. The final element is the implementation of a new, cloud-based transport management system (TMS). This TMS will form the basis for the firm's digital supply-chain strategy, with telematics helping to hone operational performance and Winsight helping to ensure business efficiency and effectiveness.


Here come the first blockchain smartphones: What you need to know

It appears the world's third-biggest handset maker may win a race to become the industry's first to offer a blockchain smartphone; Swiss-based Sirin Labs announced its own $1,000 smartphone and $800 all-in-one PC with native blockchain capabilities last October; it scheduled the release for this September, according to reports. HTC, however, plans to release its phone this quarter. HTC's blockchain phone has already received "tens of thousands" of reservations globally, Phil Chen, the chief crypto officer at HTC, said in an interview during the RISE conference in Hong Kong this week. Like HTC's upcoming $1,000 Exodus blockchain smartphone, Sirin's Finney smartphone will come with a built-in cold-storage crypto wallet for storing bitcoin, Ethereum and other digital tokens, and it will run on an open-source, feeless blockchain. Sirin was able to raise more than $100 million in an initial coin offering for the Android-based Finney smartphone and PC. Both will run Sirin's open-source operating system, SIRIN OS.


Apache Mesos and Kafka Streams for Highly Scalable Microservices

Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications or frameworks. It sits between the application layer and the operating system. This makes it easy and efficient to deploy and manage applications in large-scale clustered environments. Apache Mesos abstracts away data center resources to make it easy to deploy and manage distributed applications and systems. DC/OS is a Mesosphere-backed framework on top of Apache Mesos. As a datacenter operating system, DC/OS is itself a distributed system, a cluster manager, a container platform, and an operating system. DC/OS has evolved a lot in the past couple of years, and supports new technologies like Docker as its container runtime or Kubernetes as its orchestration framework. As you can imagine from this high-level description, DC/OS is a first-class infrastructure choice for realizing a scalable microservice architecture.


Your Roadmap to an Open Mobile Application Development Strategy


An MADP allows a business to rapidly build, test and deploy mobile apps for smartphones and tablets. It can minimize the need for coding, integrate building-block services, such as user management, data management and push notifications, and deliver apps across a broad array of mobile devices. The result is a common and consistent approach, so developers can customize their apps without worrying about back-end systems or implementation details. Michael Facemire, principal analyst at Forrester Research, observed in an analysis of Mobile Development Platforms that companies fall into two camps: “those that prefer an all-inclusive platform”, who represent the greatest, though waning part of platform spend today, and “those that prefer to manage a collection of services”. The first group of customers work with large infrastructure vendors, such as IBM, Oracle, and SAP, who offer complete environments for development, delivery, and management of mobile applications. They benefit from platform stability and custom support, but may struggle compared with other platforms when building mobile experiences outside of their proprietary ecosystems.


Hacker-powered security is reaching critical mass

“Crowdsourced security testing is rapidly approaching critical mass, and ongoing adoption and uptake by buyers is expected to be rapid,” Gartner reported. Governments are leading the way with adoption globally. In the government sector there was a 125 percent increase year over year with new program launches including the European Commission and the Ministry of Defense Singapore, joining the U.S. Department of Defense on HackerOne. Proposed legislation like the Hack the Department of Homeland Security Act, Hack Your State Department Act, Prevent Election Voting Act, and the Department of Justice Vulnerability Disclosure Framework further demonstrates public sector support for hacker-powered security. Industries beyond technology continued to increase their share of the overall hacker-powered security market. Consumer Goods, Financial Services & Insurance, Government, and Telecommunications account for 43 percent of today’s bug bounty programs. Automotive programs increased 50 percent in the past year and Telecommunications programs increased 71 percent.



Quote for the day:


"Experience without theory is blind, but theory without experience is mere intellectual play." -- Immanuel Kant


Daily Tech Digest - July 12, 2018

WAFs Should Do A Lot More Against Current Threats Than Covering OWASP Top 10
Organizations invest to increase network capacity, ultimately accommodating fictitious demand. Accurate distinction between human traffic and bot-based traffic, and between “good” bots (like search engines and price comparison services) and “bad” bots, can translate into substantial savings and an uptick in customer experience. The bots won’t make it easy on you, as they can now mimic human behavior and bypass CAPTCHAs and other challenges. Moreover, dynamic IP attacks render IP-based protection ineffective. Oftentimes, open source dev tools (PhantomJS, for instance) that can process client-side JavaScript are abused to launch brute-force, credential stuffing, DDoS and other automated bot attacks. To manage bot-generated traffic effectively, a unique identification (like a fingerprint) of the source is required. Since bot attacks use multiple transactions, the fingerprint allows organizations to track suspicious activity, attribute violation scores and make an educated block/allow decision with a minimal false-positive rate.
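The fingerprint-and-violation-score approach might look something like the following toy sketch. The violation names, scores and threshold are invented for illustration; real WAFs tune these per application:

```python
from collections import defaultdict

BLOCK_THRESHOLD = 10  # hypothetical cumulative score that triggers a block

VIOLATION_SCORES = {
    "failed_captcha": 4,
    "headless_browser_signature": 5,
    "request_rate_exceeded": 3,
}


class BotTracker:
    """Accumulates violation scores per client fingerprint rather than per
    IP address, so a dynamic-IP attack still maps to a single actor."""

    def __init__(self):
        self.scores = defaultdict(int)

    def record(self, fingerprint: str, violation: str) -> str:
        self.scores[fingerprint] += VIOLATION_SCORES.get(violation, 1)
        return "block" if self.scores[fingerprint] >= BLOCK_THRESHOLD else "allow"
```

Scoring across multiple transactions, instead of blocking on any single suspicious request, is what keeps the false-positive rate low: one failed CAPTCHA from a human stays well under the threshold, while a bot's repeated violations quickly cross it.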



Why your whole approach to security has to change

New technology philosophies like DevOps and Agile provide the opportunity to build security into the whole lifecycle that exists around IT use. By embedding proper security processes around cloud resources, companies can make their workflows deliver security into the fabric of this new architecture from the start. Getting this degree of oversight and security in place involves making security goals and objectives clear to everyone, while also enabling those processes to run smoothly and effectively. It involves making security management into more than just a blocker for poor software; instead, it is about making services available quickly within those workflows. This process is termed transparent orchestration. Transparent orchestration involves a re-wiring of security to match how this IT infrastructure has been rebuilt. As part of this, security must be automatically provisioned across a complete mix of internal and external networks, spanning everything from legacy data centre IT through to multi-cloud ecosystems and new container-based applications.



Top 3 practical considerations for AI adoption in the enterprise

Explainable AI centers on the ability to answer the question, “Why?” Why did the machine make a specific decision? The reality is many new versions of AI that have emerged have an inherent notion of a “black box.” There are many inputs going into the box, and then out of it comes the actual decision or recommendation. However, when people try to unpack the box and figure out its logic, it becomes a major challenge. This can be tough in regulated markets, which require companies to disclose and explain the reasoning behind specific decisions. Further, the lack of explainable AI can affect the change management needed throughout the company to make AI implementations succeed. If people cannot trace an answer to an originating dataset or document, it can become a hard value proposition to staff. Implementing AI with traceability is a way to address this challenge. For example, commercial banks manage risk in an online portfolio. A bank may lend money to 5,000 small- to medium-sized businesses. It will monitor their health in balance sheets within the portfolio of loans. These sheets may be in different languages or have different accounting standards.



Blockchain as a Re-invention of the Business
The current model is very much authority-centric, which leaves a narrow space to individuals and small legal entities who are mere spectators or limited contributors. If we take into account the democratization of choice that boosts up people’s motivation nowadays, we can come to the conclusion that the current approach stands against the right to self-empowerment. Blockchain, a wonderful combination of mathematics and technology, made it possible to distribute the power to the nations where it actually belongs. “With great power comes great responsibility," as the Marvel comics superheroes like to say. This could be a motto of DLT. Blockchain is a natural service area that hooks up to the Internet as the connectivity layer. Business globalization and economic freedom are two main forces of paramount significance underpinning the evolution of the distributed transactional platform. The central system played an absolutely vital role in times of corruption and global disorders or wars. In the current reality, people deserve to operate within a planetary technological network.


A Look at the Technology Behind Microsoft's AI Surge

Lambda architecture, while a general computing concept, is built into the design of Microsoft's IoT platform. The design pattern here focuses on managing large volumes of data by splitting it into two paths -- the speed path and the batch path. The speed path offers real-time querying and alerting, while the batch path is designed for larger data analysis. While not all AI scenarios use both of these paths, this is a very common edge computing pattern. At the speed layer, Azure offers two main options -- Microsoft's own Azure Stream Analytics offering and the open source Apache Kafka, which can be implemented using the HDInsight Hadoop as a Service (HDaaS) offering, or on customers' own virtual machines (VMs). Both Stream Analytics and Kafka offer their own streaming query engines (Steam Analytics' engine is based off of T-SQL). Additionally, Microsoft offers Azure IoT and Azure Event Hubs, which connect edge devices (such as sensors) to the rest of the architecture. IoT Hubs offer a more robust solution with better security; Event Hubs are specifically designed just for streaming Big Data from system to system.
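The speed-path/batch-path split can be sketched as a toy fan-out of one event stream. This stands in only conceptually for Stream Analytics or Kafka on the speed side and scheduled batch analysis on the other; the window size and alert threshold are invented:

```python
import statistics
from collections import deque


class LambdaFanOut:
    """Toy Lambda-architecture fan-out: every reading goes to both a speed
    path (rolling window for real-time alerting) and a batch path (full
    history for larger offline analysis)."""

    def __init__(self, window: int = 5, alert_threshold: float = 90.0):
        self.speed_window = deque(maxlen=window)  # speed path: recent state
        self.batch_log = []                       # batch path: durable copy
        self.alert_threshold = alert_threshold

    def ingest(self, reading: float) -> bool:
        self.batch_log.append(reading)
        self.speed_window.append(reading)
        # Speed-path decision: alert when the rolling mean crosses threshold.
        return statistics.mean(self.speed_window) > self.alert_threshold

    def batch_mean(self) -> float:
        # Stand-in for a scheduled batch job over the entire history.
        return statistics.mean(self.batch_log)
```

The point of the split is that the real-time alerting query only ever touches a small, bounded window, while heavyweight analysis runs separately over the full data set without slowing the speed path down.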




U.S. regulators grappling with self-driving vehicle security
U.S. Transportation Secretary Elaine Chao said in San Francisco on Tuesday that “one thing is certain — the autonomous revolution is coming. And as government regulators, it is our responsibility to understand it and help prepare for it.” She said “experts believe AVs can self-report crashes and provide data that could improve response to emergency situations.” One issue is whether self-driving vehicles should be required to be accessible to all disabled individuals, including the blind, the report noted. The Transportation Department is expected to release updated autonomous vehicle guidance later this summer that could address some of the issues raised during the meetings. Automakers, Waymo, a unit of Alphabet Inc, and other participants in the nascent autonomous vehicle industry have called for federal rules to avoid a patchwork of state regulation. However, the process of developing a federal legal framework for such vehicles is slow moving.


The rise of artificial intelligence DDoS attacks
The major turning point in the evolution of DDoS came with the automatic spreading of malware. Malware is a term you hear a lot; it describes malicious software. The automatic spreading of malware represented the major route for automation and marked the first phase of fully automated DDoS attacks. Now, we could increase the distribution and schedule attacks without human intervention. Malware could automatically infect thousands of hosts and apply lateral movement techniques, spreading from one network segment to another. Moving between network segments is known as beachheading, and malware could beachhead from one part of the world to another. There was still one drawback, and for the bad actor it was a major one. The environment was still static, never dynamically changing signatures based on responses from the defense side. The botnets were not variable by behavior; they were ordered by the C&C servers to sleep and wake up, with no mind of their own. As I said, there is only so much bandwidth out there. So these types of network attacks started to become less effective.



Automation could lift insurance revenue by $243 billion
First, explaining the vision clearly and securing leadership buy-in. “By establishing a clear and compelling vision, organizations demonstrate that intelligent automation is a strategic imperative and are able to answer critical questions,” the report says. Second, developing a clear pilot process. “The automation business case will need to assess the impact on transaction processing time and employee time saved and consider variables such as the volume of transactions or the number of exceptions in a specific process,” according to Capgemini. Firms should also consider starting with “low-hanging fruit” and engaging talent through hackathons and accelerators, the report says. Third, scaling up with an automation center of excellence. “To promote effective collaboration with the CoE, organizations should consider incentivizing functions based on business benefits derived from implementation of intelligent automation,” Capgemini suggests. Fourth, industrializing automation.


In-memory computing: enabling continuous learning for the digital enterprise
Today’s in-memory computing platforms are deployed on a cluster of servers that can be on-premises, in the cloud, or in a hybrid environment. The platforms leverage the cluster’s total available memory and CPU power to accelerate data processing while providing horizontal scalability, high availability, and ACID transactions with distributed SQL. When implemented as an in-memory data grid, the platform can be easily inserted between the application and data layers of existing applications. In-memory databases are also available for new applications or when initiating a complete rearchitecting of an existing application. The in-memory computing platform also includes streaming analytics to manage the complexity around dataflow and event processing. This allows users to query active data without impacting transactional performance. This design also reduces infrastructure costs by eliminating the need to maintain separate OLTP and OLAP systems.
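As a loose, single-node analogy for querying active data in memory with SQL, Python's built-in sqlite3 can hold a database entirely in RAM. Real in-memory computing platforms distribute this across a server cluster with high availability and ACID transactions; this is only a conceptual illustration with made-up data:

```python
import sqlite3

# A SQL store living entirely in RAM: no disk I/O on the query path.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?)",
                 [("a", 1.0), ("a", 3.0), ("b", 10.0)])

# Analytical query over the same store that serves transactions, rather
# than a separate OLAP system fed by ETL.
rows = conn.execute(
    "SELECT sensor, AVG(value) FROM readings GROUP BY sensor ORDER BY sensor"
).fetchall()
print(rows)  # [('a', 2.0), ('b', 10.0)]
```

The cost argument in the paragraph follows from this shape: if analytical queries can run against the in-memory transactional store without degrading it, the separate OLAP infrastructure and the pipeline feeding it can be retired.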



Hospital Diverts Ambulances Due to Ransomware Attack
The ransomware attack Monday impacted the enterprise IT infrastructure, including the electronic health records system, at Harrisonville, Mo.-based Cass Regional Medical Center, which includes 35 inpatient beds and several outpatient clinics, a spokeswoman tells Information Security Media Group. As of Wednesday morning, about 70 percent of Cass' affected systems were restored, she says. Except for diverting urgent stroke and trauma patients to other hospitals "out of precaution," Cass Regional has continued to provide inpatient and outpatient services for less urgent situations as it recovers from the attack, she says. "We've gone to our downtime processes," she says, which include resorting to the use of paper records while the hospital's Meditech EHR system is offline during the restoration and forensics investigation, she says. The hospital is working with an unnamed international computer forensics firm to decrypt data in its systems, she adds, declining to disclose the type of ransomware involved in the attack or whether the hospital paid a ransom to obtain a decryption key from extortionists.


Quote for the day:

"The mediocre leader tells. The good leader explains. The superior leader demonstrates. The great leader inspires." -- Gary Patton

Daily Tech Digest - July 11, 2018

Georgia Tech report outlines the future of smart cities

One key point researchers made is that IoT deployed in public spaces – in collaboration between city governments, private enterprise and citizens themselves – has a diverse group of stakeholders to answer to. Citizens require transparency and rigorous security and privacy protections, in order to be assured that they can use the technology safely and have a clear understanding of the way their information can be used by the system. The research also drilled down into several specific use cases for smart city IoT, most of which revolve around engaging more directly with citizens. Municipal services management offerings, which allow residents to communicate directly with the city about their waste management or utility needs, were high on the list of potential use cases, along with management technology for the utilities themselves, letting cities manage the electrical grid and water system in a more centralized way. Public safety was another key use case – for example, the idea of using IoT sensors to provide more accurate information to first responders in case of emergency.



10 Tips for Managing Cloud Costs

Part of the reason why cost management is so challenging is because organizations are spending a lot of money on public cloud services. More than half of enterprises (52%) told RightScale that they spend more than $1.2 million per year on cloud services, and more than a quarter (26%) spend over $6 million. That spending will likely be much higher next year, as 71% of enterprises plan to increase cloud spending by at least 20%, while 20% expect to double their current cloud expenditures. Given those numbers, it's unsurprising that Gartner is forecasting that worldwide public cloud spending will "grow 21.4% in 2018 to total $186.4 billion, up from $153.5 billion in 2017." Another problem that contributes to cloud cost management challenges is the difficulty organizations have tracking and forecasting usage. The survey conducted for the SoftwareONE Managing and Understanding On-Premises and Cloud Spend report found that unpredictable budget costs was one of the biggest cloud management pain points for 37% of respondents, while 30% had difficulty with lack of transparency and visibility.


How to Receive a Clean SOC 2 Report

Having a documented control matrix will be beneficial for more than just compliance initiatives; it becomes your source for how risk controls are developed and implemented and can be useful for augmenting corporate information security policies. For SOC 2, the control matrix becomes an important reference document for auditors. For instance, Trust Services Criteria 4 relate to monitoring of controls, so creating a list of how your organization is confirming controls are well designed and operating effectively makes it easy for auditors to validate that your stated controls are in place, designed to meet your security and confidentiality commitments, and are effective in doing so. Here is a concrete example: A control in your environment says servers need to be hardened to CIS benchmarks. How are you evaluating the effectiveness of this control? Are the servers hardened to your specification before going into production? Are they meeting benchmarks on an ongoing basis? An easy way to meet the monitoring requirement is to use a tool like Tripwire Enterprise.
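The "hardened to CIS benchmarks" monitoring step above can be sketched as a simple baseline comparison. The setting names and values below are invented for illustration and are not actual CIS benchmark items:

```python
# Hypothetical subset of hardening settings; real CIS benchmarks contain
# hundreds of checks per platform.
HARDENING_BASELINE = {
    "ssh_root_login": "disabled",
    "password_min_length": 14,
    "audit_logging": "enabled",
}


def evaluate_server(actual_config: dict) -> list:
    """Returns the baseline controls a server fails, producing the kind of
    ongoing-monitoring evidence an auditor can validate against the
    control matrix."""
    return [key for key, expected in HARDENING_BASELINE.items()
            if actual_config.get(key) != expected]
```

Running a check like this before a server enters production, and on a schedule afterwards, answers both of the auditor's questions in the paragraph: hardened before go-live, and still meeting the benchmark on an ongoing basis.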


Most Enterprise of Things initiatives are a waste of money

What’s truly needed is a consolidated ability to capture and process all of the data and convert it into meaningful insights. Many companies provide analytics engines to do this (e.g., SAP, Google, Oracle, Microsoft, IBM, etc.). But to have truly meaningful company-wide analysis, a significantly more robust solution is needed than stand-alone, singular instances of business intelligence/analytics. How should companies enable the full benefits of EoT? They need a strategy that provides truly meaningful “actionable intelligence” from all of the various data sources, not just the 15 to 25 percent that is currently analyzed. That data must be integrated into a consolidated (although it may be distributed) data analysis engine that ties closely into corporate backend systems, such as ERP, sales and order processing, service management, etc. It’s only through a tightly integrated approach that the maximum benefits of EoT can be accomplished. Many current back-office vendors are attempting to make it easier for companies to accomplish this. Indeed, SAP is building a platform to integrate EoT data into its core ERP offerings with its Leonardo initiative.


Randy Shoup Discusses High Performing Teams

It is estimated that the intelligence produced by Bletchley Park, code-named "Ultra", ended the war two years early, and saved 14 million lives. ... Although the Bletchley Park work fell under the domain of the military, there was very little hierarchy, and the organisational style was open. The decryption was conducted using a pipeline approach, with separate "huts" (physical buildings on the campus) performing each stage of intercept, decryption, cataloguing and analysis, and dissemination. There was deep cross-functional collaboration within a hut, but extreme secrecy between each of them. There was a constant need for iteration and refinement of techniques to respond to newer Enigma machines and procedures, and even though the work was conducted under an environment of constant pressure the code-breakers were encouraged to take two-week research sabbaticals to improve methods and procedures. There was also a log book for anyone to propose improvements, and potential improvements were discussed every two weeks.


Intuit's CDO talks complex AI project to improve finances


The most obvious is a chat bot, but it could also provide augmented intelligence for our customer care representatives; it could provide augmented intelligence for accountants who are working on Intuit's behalf or private accountants who are using Intuit software. It could be deployed in internal processes where product teams learn how people interact with our software through a set of focus groups. So, it's one technology that can be instantiated across many different platforms and touchpoints. That's one of the exciting aspects from a technology perspective. If you think about how a human works, there are so many things that are amazing about humans, but one is the ability to rapidly change contexts and rapidly deal with a changing environment. The touchpoint doesn't matter. It doesn't matter if you're talking on video, on the phone or in person. Generally speaking, people can deal with these channels of communication very easily. But it's hard for technology to do that. Technology tends to be built for a specific channel and optimized for that channel.


Ethereum is Built for Software Developers

In part, this is all thanks to what Ethereum has accomplished in a very short period. We give too much credit to Bitcoin’s price, which skyrocketed to nearly $20,000 in December 2017, but the reality is in the code, and Ethereum is now what all dApp platforms compare themselves with, not the decade-old Bitcoin model. As Ethereum solves the scalability problem, it will effectively untether itself from Bitcoin’s speculative price volatility. If Bitcoin is a bet, Ethereum is a sure thing. The main reason is the developer community it has attracted and the wide range of startups that use it, especially in the early phases of their development. As TRON might find out, once a project goes independent it may have a more difficult time attracting software developers. Ethereum must build on the ICO wave of 2017 and become the open-source, public, distributed world operating system it was designed to be. It has massive potential, and in a crypto vacuum of hype and declining prices, Ethereum is perhaps the last chance before 2020, as the Chinese blockchains take over. The window is closing.


Software Flaws: Why Is Patching So Hard?

"As OCR states, identifying all vulnerabilities in software is not an easy process, particularly for the end user or consumer," says Mac McMillan, CEO of security consultancy CynergisTek. Among the most difficult vulnerabilities to identify and patch in healthcare environments "are those associated with software or devices of a clinical nature being used directly with patients," he says. "There are many issues that make this a challenge, including operational factors like having to take the system offline or out of production long enough to address security. Hospitals don't typically stop operations because a patch comes out. The more difficult problems are ones associated with a vulnerability in the software code itself, where a patch will not work and a rewrite is necessary. When that occurs, the consumer is usually at a disadvantage." Fricke says applications that a vendor has not kept current are the trickiest to patch. "Some vendors may require the use of outdated operating systems or web browsers because their software has not been updated to be compatible with newer versions," he says.


5 security strategies that can cripple an organization

Security teams today have a two-faceted information problem: siloed data and a lack of knowledge. The first issue stems from the fact that many companies are only protecting a small percentage of their applications and, therefore, have a siloed view of the attacks coming their way. Most organizations prioritize sensitive, highly critical applications at the cost of lower-tier apps, but hackers are increasingly targeting the latter and exploiting them for reconnaissance and often much more. It’s striking how exposed many companies are via relatively innocuous tier-2 and legacy applications. The second, and more significant, issue can be summarized simply as “you don’t know what you don’t know.” IT has visibility into straightforward metrics, but it often lacks insight into the sophistication of attempted breaches, how its risk compares to peers and the broader marketplace, and other trends and key details about incoming attack traffic. With visibility into only a small percentage of the attack surface, it’s very difficult to know whether the company is being targeted and exploited. Given the resource challenges noted above, it’s unrealistic to attempt to solve this problem with manpower alone.


What's the future of server virtualization?

Prior to server virtualization, enterprises dealt with server sprawl, with underutilized compute power, with soaring energy bills, with manual processes and with general inefficiency and inflexibility in their data-center environments. Server virtualization changed all that and has been widely adopted. In fact, it’s hard to find an enterprise today that isn’t already running most of its workloads in a VM environment. But, as we know, no technology is immune to being knocked off its perch by the next big thing. In the case of server virtualization, the next big thing is going small. Server virtualization took a physical device and sliced it up, allowing multiple operating systems and multiple full-blown applications to draw on the underlying compute power. In the next wave of computing, developers are slicing applications into smaller microservices which run in lightweight containers, and also experimenting with serverless computing (also known as function-as-a-service, or FaaS). In both of these scenarios, the VM is bypassed altogether and code runs on bare metal.
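The "going small" shift above can be illustrated with a minimal sketch. This assumes no particular FaaS provider or container runtime: it is a hypothetical single-purpose function, written with only the Python standard library, wrapped in a tiny HTTP handler standing in for the platform that would normally invoke it. The contrast with a full-blown VM-hosted application is the point: one narrow capability, no state, deployable on its own.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def convert_celsius(payload):
    """The 'function' in function-as-a-service: one narrow, stateless task."""
    c = float(payload["celsius"])
    return {"fahrenheit": c * 9 / 5 + 32}

class Handler(BaseHTTPRequestHandler):
    """Stand-in for the FaaS platform or container entry point."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps(convert_celsius(payload)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To run as a standalone microservice (blocks until interrupted):
# HTTPServer(("127.0.0.1", 8080), Handler).serve_forever()
```

In a real FaaS deployment the platform would call `convert_celsius` directly per request and handle all the HTTP plumbing itself; in a container-based microservice, this whole file could be the entire deployable unit.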



Quote for the day:


"The simple things are also the most extraordinary things, and only the wise can see them." -- Paulo Coelho