Daily Tech Digest - December 11, 2018

Using a password manager: 7 pros and cons

NIST SP 800-63 recommends using non-password methods where possible, and although the recommendations are firmly against forcing users to create very long and complex passwords, they don’t limit password length or complexity. When people are forced to create and use long, complex, and frequently changing passwords, they do a poor job of it. They reuse the same passwords across different websites or use only slightly different passwords, creating an easy-to-decipher pattern. If those same people use MFA or other non-memorization authentication methods, the cycle of repeated passwords and predictable patterns can be broken. If a person can use a password manager, which creates and uses long, complex passwords that the person doesn’t have to remember, then perhaps you can get the best of both worlds. Until recently, I had never completely depended on one, throwing all my memorized passwords away. I felt bad about recommending password managers without “living” with them.
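As a rough illustration (not drawn from the article or from NIST), the core of what a password manager's generator does can be sketched in a few lines of Python:

```python
import secrets
import string

def generate_password(length=24):
    """Generate a long, random password of the kind a manager stores for you."""
    # secrets (not random) draws from a cryptographically strong source.
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

pw = generate_password()
print(pw)
```

Because the manager remembers the result, the password never has to be memorable, which is exactly what breaks the reuse-and-pattern habit described above.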



Facebook Filed A Patent To Calculate Your Future Location

Another Facebook patent application titled “Location Prediction Using Wireless Signals on Online Social Networks” describes how tracking the strength of Wi-Fi, Bluetooth, cellular, and near-field communication (NFC) signals could be used to estimate your current location, in order to anticipate where you will go next. This “background signal” information is used as an alternative to GPS because, as the patent describes, it may provide “the advantage of more accurately or precisely determining a geographic location of a user.” The technology could learn the category of your current location (e.g., bar or gym), the time of your visit to the location, the hours that entity is open, and the popular hours of the entity. For example, in the map below that demonstrates how the tech would work, Facebook would see that you are in geographic location 302 — and it could predict you’d be likely to go to locations 304, 306, and 308 next, based on places you’ve visited before (maybe you’ve gone to Starbucks after visiting Walgreens) or on the travel behavior of other users the same age as you.
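As a loose sketch of the idea (the locations and visit history here are invented, not taken from the patent), predicting a likely next location from past behavior can be as simple as counting previous transitions:

```python
from collections import Counter, defaultdict

# Toy visit history: each entry is a (from_location, to_location) transition.
history = [
    ("walgreens", "starbucks"),
    ("walgreens", "starbucks"),
    ("walgreens", "gym"),
    ("gym", "home"),
]

# Count how often each destination followed each origin.
transitions = defaultdict(Counter)
for src, dst in history:
    transitions[src][dst] += 1

def predict_next(current, k=3):
    """Rank likely next locations by how often they followed `current` before."""
    return [loc for loc, _ in transitions[current].most_common(k)]

print(predict_next("walgreens"))  # "starbucks" ranks first
```

The patented system would presumably fold in far more signal (signal strength, opening hours, peer behavior), but the underlying prediction is this kind of frequency model.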


Be Prepared for Disruption: Thinking the New Unthinkables


The main conclusion is that the conformity — defined as adhering to conventional wisdom — that gets leaders to the top too often disqualifies them from grasping the scale and nature of disruption. Leaders are saddled with what Geoff Mulgan, chief executive of Nesta, a global innovation foundation in the United Kingdom, labels “zombie orthodoxies.” These leaders rise through the ranks listening and conforming to those like them. But disruption requires precisely the opposite: It needs leaders to think, and plan for, unthinkables. In order to do this, it is imperative to have a clear purpose and to embrace diversity, inclusivity, and new behaviors, which will help leaders understand and even anticipate the impact of disruption. It is an enormous Rubik’s Cube. As one top professional told us: Leaders today confront having to “eat an elephant in one mouthful.” This is not a case of trying to break down today's challenges into neat solutions.


IT strategy: How to be an influential digital leader

Like von Schirmeister, Gideon Kay -- who is European CIO at Dentsu Aegis Network -- says IT leaders must be alert to the fact that people on the board increasingly have a take on technology, just like they would on sales, marketing and operations. Kay says CIOs must see this new interest in digital transformation as an opportunity to influence. "You don't have to bite your lip," he says. "Once you've built your credibility, which you need to do pretty quickly, and providing you've built a reputation for explaining technology in the right way -- which is about talking in terms of the business and commercial impact -- then you can give the business the definitive line on technology." Kay says CIOs can use their experience to say which services the business should be worried about, and which are the ones that don't matter: "These are the things that are hot, and these are the things that are not," he explains.


How to tame enterprise communications services

Having an organization-wide communications policy in concert with both organizational objectives and IT capabilities is a first step, just as is the case with BYOD and security. Solutions must similarly be in concert with this policy, and with no exceptions. Once the communications policy is in place, a solution set can be assembled and aligned with the general framework we introduced above. In general, the process here will follow that which is typically applied to all IT services, including a requirements analysis, service set definition, long and short lists of candidate products and services (and, increasingly rarely, new internal development), and experiential analysis and evaluation via alpha and beta tests. The rollout of the solution must be accompanied by consciousness-raising, education, support, and monitoring for management visibility with respect to both the policy and the solution. Once again, IT must reinforce the importance of using only approved channels and facilities and avoiding difficult-to-impossible-to-monitor out-of-band solutions, including social media.



Is Blockchain A Solution For Securing Centralized ID Databases?

Clearly, the way that some centralized identity databases are currently secured doesn't work. I believe that technology industry professionals should think outside the box to create a security solution for centralized databases. Some think blockchain is the answer. They believe that a distributed ledger could be used to decentralize identity information. Using the blockchain, identity information could be stored securely using cryptography. This is similar to how cryptocurrencies are cryptographically stored in wallets on the blockchain. A wide variety of identity documents could be stored on the blockchain in a single place — an identity wallet of sorts — and each wallet could have its own form of encryption. The main advantage of doing this is that the identity information would become decentralized on a distributed ledger. This would make it a lot harder for cybercriminals to perform large-scale identity data breaches because they would have to hack into each wallet individually.
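As an illustrative sketch (a toy, not a real identity-wallet implementation), giving each wallet its own salt and derived key shows why each wallet having "its own form of encryption" matters: breaching one wallet reveals nothing about the others.

```python
import hashlib
import secrets

def derive_wallet_key(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2 derives a per-wallet key; a unique salt means even identical
    # passphrases yield different keys for different wallets.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

salt_a, salt_b = secrets.token_bytes(16), secrets.token_bytes(16)
key_a = derive_wallet_key("correct horse battery staple", salt_a)
key_b = derive_wallet_key("correct horse battery staple", salt_b)
print(key_a != key_b)  # same passphrase, different wallets, different keys
```

An attacker who cracks `key_a` has done nothing toward cracking `key_b`, which is the per-wallet property the article argues would frustrate large-scale identity breaches.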


IT pros look to iPaaS tools for LOB integration demands


Application automation and integration are central to nearly every project these days at Wilbur-Ellis, a $3 billion holding company, with divisions in agribusiness, chemicals and feed. "If I look back on the last three major projects, they all involve a separate system that has to integrate," said Dan Willey, CIO at the San Francisco-based company. Many of these iPaaS tools are conceptually good for modern, cloud-based companies, but sometimes you are saddled with an application that doesn't play well. In the case of Wilbur-Ellis, an ERP system by Oracle's JD Edwards is a stumbling block, Willey said. Wilbur-Ellis uses Dell Boomi's connectors to connect customer and order data. The company will also use the tool in a broader sense as an API management platform. "It's a hard problem to solve," Willey said. "It's interchanging between your tool sets, data in your back-end systems, front-end systems, IoT data and other things that need to be lined up to make it happen."


CrowdStrike: More Organizations Now Self-Detect Their Own Cyberattacks

Three-quarters of enterprises this year discovered on their own that they had been hacked, rather than learning it from a third party. The bad news: it took them an average of 85 days to spot an attack. That means hackers still have the upper hand. What's more, attackers need less than two hours, on average, to move from the initially compromised machine further into a target's network, according to CrowdStrike, which today published its "Cyber Intrusion Services Casebook, 2018," a report on a sampling of its real-world incident response (IR) investigations for clients. "We noticed attackers this year were pretty brazen and stealthy: Eighty-six days [before getting discovered] is still a problem," even when victim organizations are getting better at self-detection, says Tom Etheridge, vice president of services for CrowdStrike. The number of hacked organizations that spotted their own attacks rose 7% this year over those from CrowdStrike Services' IR engagements in 2017.


The top skills needed by data scientists in 2019

The data analyst role is suited to most businesses. Able to convert business challenges into opportunities for data analysis, the analyst often bridges the gap between technical and practical. A machine learning engineer is looking to make an algorithm run quickly and in a distributed environment. Asking them to analyze data and find nuggets of relevant business insights isn’t their forte, but an ML engineer can select the appropriate algorithm and implement it within the company’s production system without introducing a bottleneck. A research data scientist is interested in investigating cutting-edge techniques or inventing new techniques. This role usually requires a Ph.D. Extreme familiarity with the underlying mathematics is a must. It’s important to note this type of individual contributor would be bored out of their mind working on everyday-business problems. The manager is the ultimate bridge between various technical roles, business stakeholders, and other leadership. Managers are frequently facilitating their teams’ best work while ensuring outcomes are mapped to business goals and prove ROI.


Satan Ransomware Variant Exploits 10 Server-Side Flaws

"There is a risk of extensive infections because [of the] big arsenal of vulnerabilities that [the malware] attempts to exploit," says Apostolos Giannakidis, security architect at Waratek, which also posted a blog on the threat. All of the vulnerabilities are easy to exploit, and actual exploits are publicly available for many of them that allow attackers to compromise vulnerable systems with little to no customization required, he says. Several of the vulnerabilities used by Lucky were disclosed just a few months ago, which means that the risk of infection is high for organizations that have not yet patched their systems, Giannakidis says. All but one of the server-side vulnerabilities that Lucky uses affect Java server apps. "The vulnerabilities that affect JBoss, Tomcat, WebLogic, Apache Struts 2, and Spring Data Commons are all remote code execution vulnerabilities that allow attackers to easily execute OS commands on any platform," he notes.



Quote for the day:



"Colors fade, temples crumble, empires fall, but wise words endure." -- Edward Thorndike


Daily Tech Digest - December 10, 2018

What is an SSD? How solid state drives work

A simple USB flash drive (or thumb drive) is an example of solid-state drive technology. An SSD is a larger, more complex device that aggregates pools of NAND flash storage, the type of storage also found in MP3 players and digital cameras. Unlike RAM, which doesn’t retain data when the machine shuts off, SSD flash memory is non-volatile, which means data is retained whether the device is powered on or not. With SSDs, every block of data is accessible at the same speed as every other block, no matter the location. This makes SSDs inherently faster than hard drives, where platters are spinning and drive heads are moving to the right location. With HDDs, large files can be broken up and tucked into unused nooks and crannies of the drive, and data can be easily updated in place. This allows for efficient use of the total drive capacity. On the other hand, scattered data obviously takes longer to locate, which is why defragmenting a hard drive has become a standard part of device maintenance.


Online Shopping – Not so old but worth much more!

People in the Researcher category have crossed the awareness stage and are now in the consideration stage. They are weighing their potential options. If people are researching something, it means they are looking for details: details that can settle a confused mind and answer their underlying questions. Researchers are a notch above browsers in terms of purchase potential. If a website provides detailed product explanations, trustworthy support, easy navigation, and crisp product images, researchers are more inclined to buy from it. Trust is therefore the most important factor for researchers, and simply providing consistent information and clear, to-the-point product descriptions works here. Because trust matters so much, product and user reviews become the most influential factor: researchers tend to trust previous customers who have bought similar items, and comprehensive user reviews and ratings act as a catalyst.


Will the imminent death of Microsoft Edge lead to an insecure browser monoculture?

As Callan says, Microsoft certainly has the ability to adopt Google’s HTML rendering engine without abdicating other browser interface and technology decisions, but the possibility definitely looms of the new Microsoft browser becoming little more than a re-skinned version of Chrome, whatever it is called. "In the certificate space, this monoculture could lessen the influence of many important voices," Callan warns. "Interoperability requirements have caused public PKI mechanisms such as TLS certificates to be governed by standards bodies such as the IETF and the CA/Browser Forum." And for good reason: these bodies incorporate the unique expertise of not only browser manufacturers but also CAs, information providers, auditing firms and others to create a robust ecosystem that defends against myriad attack vectors. "As a single browser manufacturer gains the ability to flex its muscle and make decisions unilaterally that all others must comply with," Callan said, "these other viewpoints and their valuable knowledge threaten to be lost..."


New Microsoft Teams calling features narrow gap with Skype


Three of the advanced features -- group call pickup, call park and shared line appearance -- should become available within the next several weeks. A fourth feature, location-based routing, is slated for release in the first quarter of 2019. Group call pickup improves an existing feature that lets users automatically forward incoming calls to groups of colleagues. The system can ring each member of the group simultaneously or one at a time in a predetermined order. The update lets users customize the appearance and type of notifications that members of the group receive with incoming calls. Call park is a sophisticated way to put callers on hold. Parking a call generates a code, which gets sent -- in a text message, for example -- to the employee the caller is attempting to reach. That employee can then answer the call in the Teams app. Shared line appearance lets businesses create user accounts with multiple phone lines. The incoming calls to those lines are all automatically forwarded to other users. 


HONOR: The Force of Innovation

In pursuit of innovation, HONOR has often played the long game, ignoring the latest hot trends if it believed certain technologies were not ready to be commercialized. “Virtual reality and augmented reality, for instance, still have unsolved concerns and matters that need to be addressed,” Zhao said. “Blockchain application on mobile phones is nothing more than a gimmick. At HONOR, we have our own mechanism for deciding what to invest in and innovate. Once we have set the direction, we will go all-in.” That determination has come to fruition in HONOR’s AI breakthroughs, which have been at the center of HONOR’s development strategy from the beginning. Long before the industry came to realize the importance of AI, HONOR had already invested heavily in R&D and building the AI ecosystem. “We have spent six years developing the chipset, system and application,” Zhao said. “We believe AI assists human brains in making decisions. If mobile phones have broadened our minds and experiences as users, then AI will narrow the gap between us and the experts.”


What is digital trust? How CSOs can help drive business

Even if companies understand the value of trust, many simply overestimate their own standing in their customers’ eyes and how they compare to the competition. The report outlined an average 14-point gap between the level of trust customers have in whether organizations handle personal data appropriately and how much organizations think they are trusted. The report claims this illustrates how “dangerously out of touch” organizations are with their customers. Just a third of customers said their trust in organizations had increased over the last two years, compared to the 84 percent of business leaders who believe that trust has increased. Ninety percent of those business leaders claim they are very good or excellent at protecting customer data, and 93 percent say that it is a differentiator over the competition. Considering the number of organizations that admitted a data breach in the study, this clearly does not add up.


Quantum computers pose a security threat that we’re still totally unprepared for


The report cites an example of encryption that protects the process of swapping identical digital keys between two parties, who use them to decrypt secure messages sent to one another. A powerful quantum computer could crack RSA-1024, a popular algorithmic defense for this process, in less than a day. Such machines, which would require a couple of thousand “logical” qubits, are probably at least a decade away, say the US experts. Qubits’ delicate quantum state can be disrupted by things like tiny changes in temperature or very slight vibrations, so it can require thousands of linked qubits to produce a single logical one that can be reliably used for computation. Still, complacency would be a mistake. William Oliver, an MIT physics professor and a member of the group that produced the academies’ report, notes that governments and businesses like banks often need to keep data secure for decades. They therefore need to be thinking now about potential future threats to the encryption they’re using.


Brexit implications for data protection


GDPR brought massive changes to data protection legislation and expanded what was expected of data controllers and data processors. One of the biggest changes brought about by GDPR is that organisations must now have technical measures that enforce their data sharing policy. “You used to have a contract that said you would not misuse data, but GDPR says you now must have technology in place that prevents the misuse of data,” says Gary Lefever, CEO of Anonos. As the internet has become an integral part of our society, data protection has become a legislative necessity to ensure the sharing of personal information is conducted in a fair, secure and responsible manner. The requirements for data storage, sharing and processing have been articulated in the GDPR, which is necessary reading for any company with any form of online presence. According to both the GDPR and the UK Data Protection Act 2018, when a country leaves the EU, it will cease to be covered by the GDPR, and as such will be considered as a third country, which is any country or territory other than an EU member state.


Innovative anti-phishing app comes to iPhones

Traditional security protection systems such as virus checkers and firewalls are still mandatory, but they are far less effective against the complex attack scenarios prevalent in today's digital economy. When it comes to enterprise security, network monitoring, location-based protection and cooperative sharing of security-related datasets are becoming key components of switched-on, 24/7, situation-awareness security protection systems. Within this landscape, MetaCert’s system seems a useful adjunct to existing systems. I imagine we’ll see this kind of alert-based security system become a component of future operating systems, certainly within those from vendors that actually care about customer security, and privacy, come to that. On iOS, this new solution works with most email services, including Thunderbird and Apple Mail, with Outlook and Gmail support in development. The company is running a public beta test, so you can test this system for yourself.


Juniper ATP appliance gets broader device support

In the latest announcement, Juniper has made it possible for a Juniper ATP appliance to collect logs from security devices through their system logging facility, or syslog. To separate security-related log events, an administrator would go to the JATP user interface and create a log filter using one of its supported formats, such as XML, JSON and CSV. The admin can then create a parser that maps the log field from a firewall, for example, into the JATP's event fields to look for possible security threats, a Juniper spokesperson said in an email. Also, through the JATP UI, admins can see statistics on incoming logs and delete unneeded logs. Before the syslog support, easily collecting security data was limited to Juniper's SRX firewalls and devices made by the company's partners, such as Fortinet and Palo Alto Networks. "With the custom data collector capability, the integrations can be created from within the product by security personnel via an easy-to-use UI," the spokesperson said. Juniper plans to release the new capabilities in a software upgrade scheduled for release this month.
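As a hedged sketch of the general idea (the column names and event fields below are hypothetical, not Juniper's actual schema), mapping a firewall's CSV-formatted log fields onto a target event schema looks roughly like this:

```python
import csv
import io

# Hypothetical mapping from a firewall's CSV columns to JATP-style event fields.
FIELD_MAP = {
    "src_ip": "source_address",
    "dst_ip": "destination_address",
    "action": "event_action",
}

def parse_syslog_csv(raw: str):
    """Turn raw CSV log text into event dicts using the field mapping."""
    reader = csv.DictReader(io.StringIO(raw))
    return [
        {FIELD_MAP.get(col, col): val for col, val in row.items()}
        for row in reader
    ]

sample = "src_ip,dst_ip,action\n10.0.0.5,203.0.113.9,deny\n"
events = parse_syslog_csv(sample)
print(events[0]["event_action"])  # "deny"
```

The point of such a parser is the same as the one described above: once disparate log formats are normalized into common event fields, a single threat-detection pipeline can inspect them all.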



Quote for the day:


"Setting an example is not the main means of influencing others, it is the only means." -- Albert Einstein


Daily Tech Digest - December 09, 2018

The India Enterprise Architecture Framework, aptly called IndEA, was formally notified by the Government of India as a national standard. Needless to mention, for me the journey from development to notification has been challenging and extremely fulfilling at the same time, both at a professional and personal level. Ever since the success of ePragati in Andhra Pradesh, many Ministries, States, Departments, and other public sector entities have shown growing interest in Enterprise Architecture. The elevation and notification of IndEA as a national standard is absolutely a move in the right direction to accentuate and accelerate such activities. The rise of IndEA as a framework will trigger a flurry of activities across the entire ecosystem from consulting firms, system integrators, accredited training providers, to the professional architect community-at-large.


“Ultimately, the goal of the FRP [Facial Recognition Pilot] is to identify if facial recognition technologies can be of assistance to the USSS in identifying known subjects of interest prior to initial contact with law enforcement at the White House Complex," it says. At the moment, the FRP is restricted to only identifying staff that have volunteered to test the technology. Cameras have been set up in two locations that will capture images of people walking along the public streets and parks next to the White House. The images of the faces captured in the video feed will be matched against a database containing the faces of the volunteers. Only pictures that correspond to a positive match will be kept, negative matches will be deleted automatically. A match will send an alert to the USSS, where it will be confirmed by staff. DHS said it would be transparent about its process and has provided a public notice and written notice to volunteers. After the tests have been carried out, the faces kept on the database will be deleted.


Blockchain – Not Just for Bitcoin
Blockchain – Not Just for Bitcoin
Transactions made via blockchain technology cannot be manipulated, hidden or hacked, as the technology is not controlled by a centralized financial institution or government. For its authenticity to be compromised, almost the entire network of hundreds of thousands of individually operated nodes, distributed all across the world, would need to be coordinated and altered at the same time. Additionally, the encryption technology built into blockchain ensures both privacy and verifiability simultaneously on the public audit trail. The February 2018 McAfee report estimates that cybercrime is on course to cost the world $600 billion in 2018. ... While existing financial services providers have enjoyed an oligopoly over the secure payments industry for over a decade, the technological mechanisms they employ have remained essentially the same during that period. This lack of development has made them vulnerable to cybercrime, which drives up regulatory costs and is reflected in consumers' fees.
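A minimal Python sketch (a toy, not a real blockchain) shows why tampering is evident: each block embeds the hash of its predecessor, so altering an earlier block breaks every later link in the chain.

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministic serialization so the same block always hashes the same.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Each block records transactions plus the hash of the block before it.
genesis = {"prev": "0" * 64, "tx": ["alice->bob:5"]}
block2 = {"prev": block_hash(genesis), "tx": ["bob->carol:2"]}

# Tampering with the first block changes its hash, so block2's link no longer matches.
tampered = dict(genesis, tx=["alice->bob:500"])
print(block2["prev"] == block_hash(genesis))   # True
print(block2["prev"] == block_hash(tampered))  # False
```

In a real network, every node holds this chain, so an attacker would have to rewrite the links on most of the nodes simultaneously, which is the coordination problem the excerpt describes.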


Code Churn — A Magical Metric for Software Quality

While designing test plans, the basic question to ask is what code changes have been made in the software. There are multiple contexts to this: one is the functional perspective and another the code perspective. The functional perspective helps with black box testing. If you understand the functionality well and know what has changed in the current feature, it is easy to come up with an effective test plan for the modified functionality. However, in legacy or enterprise-level software, that is not sufficient. Code written to change or add functionality can have many side effects, such as directly dependent code, reused code, and changing business rules, especially across various code paths. In such cases, black box testing is not enough. It is important to know the code churn.
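As a rough sketch of how churn can be measured in practice (assuming a Git repository; `git log --numstat --format=` emits tab-separated added/deleted line counts per file), summing additions and deletions per file highlights the hot spots worth extra testing:

```python
from collections import Counter

def parse_numstat(out: str) -> Counter:
    """Sum lines added + deleted per file from `git log --numstat` output."""
    churn = Counter()
    for line in out.splitlines():
        parts = line.split("\t")
        # Binary files report "-" instead of counts; skip those.
        if len(parts) == 3 and parts[0].isdigit() and parts[1].isdigit():
            added, deleted, path = parts
            churn[path] += int(added) + int(deleted)
    return churn

# Sample output as produced by: git log --numstat --format= HEAD~50..HEAD
sample = "10\t2\tsrc/app.py\n0\t5\tsrc/app.py\n3\t3\tREADME.md\n"
for path, total in parse_numstat(sample).most_common():
    print(f"{total:6d}  {path}")
```

Files at the top of this list have changed the most and are therefore the prime candidates for focused (white box) test effort.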


Why the fast pace of IT is outpacing project management


The success of agile methods taught us that software development processes are better when they are adaptable to variables such as the ever-changing technology landscape or the need to pivot around product-market fit. The fact that project management is suited to solving well-defined problems causes some major disconnects. For example, the uncertain nature of a new software product could mean that long projects are padded with such large time and cost buffers that the market and technology landscape will change significantly during the project’s delivery. In that scenario, it renders the assumptions made at the start of the project useless. Because it’s closely tied into business inputs, budgeting is one of the first places that you see the breakdown when trying to manage software delivery with a project-oriented approach. By design, project budgets need to bake in all of the uncertainty and risk of a software project.


The 6 reasons why Huawei gives the US and its allies security nightmares


Huawei claims its equipment connects over a third of the world’s population. It’s also handling vast amounts of data for businesses. That’s why there’s fear in Western intelligence circles that back doors could be used to tap into sensitive information using the firm’s equipment. This would be tricky to do undetected, but not impossible. Huawei doesn’t just build equipment; it can also connect to it wirelessly to issue upgrades and patches to fix bugs. There’s concern that this remote connectivity could be exploited by Chinese cyber spies. ... Huawei has repeatedly stressed it’s a private company that’s owned by its employees. The implication is that it has no incentive to cause customers to lose confidence in the integrity of its products. On the other hand, its governance structures are still something of a mystery, and its founder, Ren Zhengfei, who was once an officer in the Chinese People’s Liberation Army, keeps a low profile. Such things “make you question just how much independence it really has,” says Adam Segal, a cybersecurity expert at the Council on Foreign Relations in New York.


Understanding the Potential of Artificial Intelligence

If your computer system is not making a decision and then learning whether that decision was good or bad and adapting its own internal model of the world, I would argue that it’s not true AI. And it’s OK for companies at the moment to be calling machine learning AI. So for me, the true [definition] of AI [involves] systems that can learn and adapt themselves without the aid of a human. Adaptability is synonymous with intelligence. In fact, most companies don’t have machine learning problems — they have optimization problems. Optimization is the process of allocating resources to achieve an objective, subject to some constraints. Optimization problems are exceptionally hard to solve. For example, how should I route my vehicles to minimize travel time, or how do I allocate staff to maximize utilization, or how do I spend marketing money to maximize impact, or how do I allocate sales staff to opportunities to maximize yield?
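To make the optimization framing concrete, here is a toy sketch (with invented numbers) of one of the examples above, allocating sales staff to opportunities to maximize yield, done by brute-forcing every one-to-one assignment:

```python
from itertools import permutations

# yield_matrix[i][j]: expected yield if salesperson i works opportunity j.
yield_matrix = [
    [7, 5, 3],
    [6, 8, 4],
    [5, 4, 9],
]

def best_assignment(matrix):
    """Try every one-to-one assignment and keep the highest-yield one."""
    n = len(matrix)
    best, best_total = None, float("-inf")
    for perm in permutations(range(n)):
        total = sum(matrix[i][perm[i]] for i in range(n))
        if total > best_total:
            best, best_total = perm, total
    return best, best_total

assignment, total = best_assignment(yield_matrix)
print(assignment, total)  # (0, 1, 2) 24: each rep on their best-fit opportunity
```

Brute force is only feasible for tiny problems (n! assignments); the point of the excerpt is precisely that realistic versions of these allocation problems are exceptionally hard and need dedicated optimization techniques.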


How blockchain is bringing staffing innovation

Blockchain is already poised to seriously disrupt the staffing industry, for instance, because it’s helping companies expedite resume validation. Similarly, it’s easier now than ever before for many businesses to churn out short-term contracts, which has enabled them to prosper from the rise of freelancers in the international economy. Where businesses were once relatively constrained in their staffing practices, blockchain technology and the development of the gig economy have made it easy for HR departments to remain flexible with short-term hires. The decentralized nature of blockchain ledgers makes them ideal for the contemporary economy because they can’t be monopolized by anyone; blockchain by its very nature is democratic, meaning it can be of use to anyone. Massive corporations and small-town entrepreneurs alike are already beginning to realize that they can both use blockchain applications at the same time precisely because its costs are so affordable and its advantages so universal.


The Analytics Platform seen through Enterprise Architecture


Data analytics and AI is now on the agenda of every organisation. After years of being the back-room preserve of analysts, it is now out in the open, in the boardroom and being proclaimed as central to business strategy and transformation. If this were not enough, the data analytics processes actually running in the organisation are no longer just reports or ad hoc queries by individual users but are now integrated with on-line transactional systems and enable business-critical activities. Analytics has made the transition from end-user computing to an enterprise capability requiring support and governance by IT. The journey has not always been easy with analysts and IT often experiencing a culture clash. All this has come to the attention of the people responsible for planning IT capabilities: the Enterprise Architects. Analytical systems now need a defined place in the Enterprise Architecture alongside transactional systems, CRM systems, data warehouses, communication systems and other core systems.


Reference Architecture & Frameworks: A Consolidation

The use of a reference architecture in an organization also requires governance to enforce that organizations commit to its use. Reference architectures are only valuable if people use them as intended and follow their guidance—otherwise the whole idea of reusing industry best practices breaks down. For example, the business functions of a typical insurance company are largely similar to those of its competitors, as are many of its business processes. Competitive differences will most likely be in its products, pricing, customer segments, and customer relationships. Reusing industry best practices provided by reference architectures ensures that you are not behind the curve on these non-competitive aspects. We also see this in the implementation of many IT systems, where vendors such as SAP provide reference processes for large parts of an organization. Reference architecture should be community-based.



Quote for the day:


"Your first and foremost job as a leader is to take charge of your own energy and then help to orchestrate the energy of those around you." -- Peter F. Drucker


Daily Tech Digest - December 08, 2018

HIPAA Compliance in a Containerized Environment
The HIPAA compliance framework is somewhat infamous for setting high-level requirements regarding healthcare data and privacy without recommending specific ways of meeting those requirements. That tendency leaves many developers, admins and DevOps engineers wondering how, exactly, to comply with HIPAA. If you find yourself in that situation, keep reading. This blog post identifies best practices for HIPAA compliance on one of today's most common types of environments—those built with containers. It draws on NIST's Cybersecurity Framework recommendations for addressing the high-level compliance requirements that HIPAA includes. The tips below are drawn from Twistlock's Guide to HIPAA Compliance for Containers, a 38-page resource that walks through best practices for HIPAA compliance in a containerized environment with the help of the Twistlock platform.


Facial recognition has to be regulated to protect the public, says AI report


The report calls for the US government to take general steps to improve the regulation of this rapidly moving technology amid much debate over the privacy implications. “The implementation of AI systems is expanding rapidly, without adequate governance, oversight, or accountability regimes,” it says. The report suggests, for instance, extending the power of existing government bodies in order to regulate AI issues, including use of facial recognition: “Domains like health, education, criminal justice, and welfare all have their own histories, regulatory frameworks, and hazards.” It also calls for stronger consumer protections against misleading claims regarding AI; urges companies to waive trade-secret claims when the accountability of AI systems is at stake (when algorithms are being used to make critical decisions, for example); and asks that they govern themselves more responsibly when it comes to the use of AI.


Cyber risk management continues to grow more difficult

Cyber risk management is significantly more difficult today than it was two years ago. That's according to new ESG research involving 340 enterprise cybersecurity, GRC, and IT professionals who were asked to compare cyber risk management today to two years ago. (Note: I am an employee of ESG.) The data indicates that 39 percent of survey respondents believe that cyber risk management is significantly more difficult today than it was two years ago, while another 34 percent say it is somewhat more difficult today than it was two years ago ... Think about this data from a CISO perspective. Your bosses are pushing you for more frequent updates on cyber risk management, and they want it presented in a business context. Meanwhile, your staff — which is likely only incrementally bigger than it was two years ago, if bigger at all — must collect, process, analyze, and report on risk management across an expanding and increasingly vulnerable attack surface that is being targeted by more sophisticated cyber-adversaries.


Best practices for Event Sourcing

In his talk, David gives an example of a service which performs 100 actions in 66 ms, an average latency of 0.66 ms. CMF is designed for consistently low latency, where the focus is on the worst latencies the system sees. A key measure is often the 99.9th percentile latency (worst 1 in 1,000) rather than the average or typical latencies. We recently helped a Tier 1 banking client build an Order Management System with 3 microservices where the wire-to-wire latency was under 20 microseconds 99.9% of the time at a throughput of 20,000 messages per second. Chronicle Decentred is designed for high throughput. Each chain can process a large number of messages across a cluster of servers, e.g. 50K/s to 400K/s depending on the hardware. However, the latency is the time to achieve a consensus, which might be 5 ms to 500 ms depending on the network between them.
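The gap between average and tail latency can be made concrete with a small, hypothetical sketch (the class, method and numbers below are illustrative, not from the talk): a handful of slow outliers barely move the mean but completely dominate the 99.9th percentile.

```java
import java.util.Arrays;

public class TailLatency {
    // Nearest-rank percentile: the smallest value such that p% of samples are <= it.
    static double percentile(double[] latencies, double p) {
        double[] sorted = latencies.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(p / 100.0 * sorted.length);
        return sorted[Math.max(rank, 1) - 1];
    }

    public static void main(String[] args) {
        double[] latencies = new double[10_000];
        Arrays.fill(latencies, 0.5);                       // fast responses: 0.5 ms
        for (int i = 0; i < 20; i++) latencies[i] = 50.0;  // 20 slow outliers: 50 ms
        double mean = Arrays.stream(latencies).average().orElse(0);
        // The mean stays near 0.6 ms, but the 99.9th percentile is 50 ms.
        System.out.printf("mean = %.3f ms, 99.9%%ile = %.1f ms%n",
                mean, percentile(latencies, 99.9));
    }
}
```

This is why a "worst 1 in 1,000" measure tells you far more about user-visible behaviour than an average does.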


UK government commits to opening up data in bid to push adoption of AI


“The full benefits to society and the economy that can come from AI can only be realised if it is widely used,” said James. “That means government working together with industry to seize the prize of a reported additional £232bn to GDP by 2030.” However, despite today’s abundance of data, its potential is largely untapped due to much of it being locked in silos. Because AI is a data-driven technology, this is a huge barrier to its further development. “Data is a critical part of our national digital infrastructure and fundamental, of course, to AI,” said James. “Without access to good quality data from a range of sources, AI technologies cannot deliver on that promise of better, more efficient and seamless services. Government is really committed to opening up more data in a way that makes it reusable and easily accessible.” James said the government had already opened up more than 44,000 datasets, citing this as a major reason for the UK topping Oxford Insight’s Government AI Readiness Index in 2017.


Industrial espionage fears arise over Chrome extension caught stealing browsing history

An extension collecting browsing history might sound benign, but in a phone call today, the ExtraHop team told ZDNet that this behavior is extremely worrisome when observed in this particular case. The ExtraHop team raised concerns that developers usually access URLs of internal networks, APIs, and applications, and whoever is collecting this browsing history will gain access to URLs that may reveal details about unreleased products, hidden features, or a company's intranet or internal network structure. For example, a developer making API calls to something like "/product/beta/car_dashboard/automatic_breaks/engage/pedestrian_detection/" may reveal quite a lot. In the hands of a determined attacker, such information is valuable: it could be sold to unethical competitors, or it could be used to plan future attacks. The discovery of this extension comes on the heels of Netscout revealing that North Korean nation-state hackers have used a Chrome extension for the first time in a government-orchestrated cyber-espionage campaign.


The path to cloud security goes through integration

First, establish a plan for how the security systems are going to talk. For the most part, this is a secure directory system, but there are common databases you can also use. Note that you will have to plan and coordinate across organizational silos. Second, find a security management and monitoring product that provides a “single pane of glass” between you and the security systems, both on-premises and in the cloud. This should be the single source of truth when it comes to who, what, when, how, and why. It’s kind of a mastermind for all enterprise security. Third, cross-system security testing should be a common occurrence. Often overlooked by IT, such testing will provide tuning for your security ecosystem and spot issues before the hackers do. While all this seems simple in concept, it’s actually a pain in the butt to deploy. If you’re dealing with all systems in an enterprise, organizational politics often pops up. Also, many enterprises lack the talent needed to get security going at all points. But you still need to do it, because the alternative is very unpleasant.


Remember: It’s not all about the 1s and 0s

It’s critical that you control who has access to your physical plants – offices, warehouses, distribution centers, etc. I’ve seen people talk their way past guards and gates way too easily. Many offices I visit no longer have receptionists, which was traditionally the first line of defense, having been replaced by locked doors and badge readers. But if your employees don’t practice good access control, it’s all for naught. The biggest culprit here is tailgating – one employee badges-in to open a door and multiple people follow her into the office. I met someone who allowed a person to tailgate into her office building, and that person turned out to be an attacker who shot his ex-girlfriend once he was inside. But those same tailgaters might be there to steal your digital data as well. Last year, a medical devices firm in Massachusetts found a foreign national in their offices after hours trying to hack into their network. He had tailgated in at closing that day as employees rushed out of the office for Labor Day weekend.


CA Technologies' Agile Transformation: A Firsthand Perspective


Agile isn’t just a new way of developing software, it’s a new mindset, a new culture and a new way of running a business. That may sound like a lot to take on, but the results speak for themselves. Since we began practicing agile at scale and business agility, we’ve reduced time to market, improved customer satisfaction, boosted innovation and increased employee engagement. We’ve made it easier for leaders, lines of business and teams to work together -- and with our customers -- on a global scale.  But reinventing a business isn’t easy. It’s a transformation that must be meticulously designed as a series of experiments on and in a complex system, and these experiments need to be executed with discipline. Here’s a firsthand look at how CA Technologies successfully engineered our agile transformation, starting with our people. ... Investing in people is probably the most important part of any business transformation, so establishing a safe environment that nurtures employee confidence is essential. 


Lack of Business Associate Agreement Triggers HIPAA Fine
OCR's investigation revealed that ACH never entered into a business associate agreement with the individual providing medical billing services to ACH, as required by HIPAA, and failed to adopt any policy requiring business associate agreements until April 2014. "Although ACH had been in operation since 2005, it had not conducted a risk analysis or implemented security measures or any other written HIPAA policies or procedures before 2014," OCR says. "This case is especially troubling because the practice allowed the names and Social Security numbers of thousands of its patients to be exposed on the internet after it failed to follow basic security requirements under HIPAA," says OCR Director Roger Severino. In addition to the monetary settlement, ACH has agreed to implement a corrective action plan that includes the adoption of business associate agreements, completion of an enterprisewide risk analysis and the creation of comprehensive policies and procedures to comply with the HIPAA rules, OCR says.



Quote for the day:


"Success is most often achieved by those who don't know that failure is inevitable." -- Coco Chanel



Daily Tech Digest - December 05, 2018

The importance of today’s digital practitioner

The shift we are undergoing - from the physical to the digital - has led to companies embedding technology into the products they sell. Therefore, the importance of digital is moving from an IT functional world to one where digital practitioners are embedded into not only every part of the business, but also into every part of the products that the vast majority of companies take to market. This includes companies that historically have been very physical, such as makers of aircraft engines or operators of oil refineries, and any number of other sectors where physical products are becoming digital. These products now provide much more information to consume and, therefore, more technology is integrated into the products that companies sell. A changing world requires the creation of new job roles to closely monitor the digital evolution, and this explains the emergence and importance of digital practitioners. Speed and agility: nowadays, technology adoption occurs at a much faster rate, since most organisations now operate digitally.


The Top 7 Technology Trends for 2019

Blockchain to Improve Industry Collaboration with Production-Ready Solutions
Naturally, blockchain is part of my list of technologies for 2019. It has been on my list in one form or another since 2015. The difference with previous years is that in 2019 we will see the first real enterprise applications in use. I am not talking about the various blockchain startups developing decentralised applications (or dApps), nor am I talking about Proofs of Concept. In 2019, I think that we will see large corporations using blockchain to improve industry collaboration. These enterprise applications will predominantly use private blockchains. Within these networks, new actors have to be approved by existing participants, enabling more flexibility and efficiency when validating transactions. Organisations that will prefer to keep a shared ledger for settlement of transactions will be in the financial services industry or within supply chains. Especially for the latter, I see tremendous opportunities for 2019. In 2019, blockchain will become the gold standard for the supply chain, simply because it offers clear benefits for participants in a network who have to trust each other to make it work.


Phishing at centre of cyber attack on Ukraine infrastructure


According to Carullo, phishing is one of the major attack vectors cyber criminals and other attackers use to target critical infrastructure. “This was demonstrated in our recent study around GreyEnergy, another piece of malware which was targeting critical infrastructure in Ukraine via phishing,” he said. “Today’s determined attackers are showing no signs of slowing down, so teaching staff to ‘think before they click’ is key to defending against these types of attacks.” Defending CNI from cyber attacks is not only about resisting attacks, but also about being resilient to ensure a quick recovery, according to Mike Gillespie, managing director and co-founder of security consultancy Advent IM. An unwillingness to accept that cyber attacks are a real threat to critical national infrastructure by UK political and CNI business leaders has resulted in a lack of resilience.



Powerful, extensible code with Tagless Final in … Java!


Tagless final is all the rage in the Scala community at the moment. It is a technique that allows you to define the general structure of the code to be executed, such that we can configure different implementations (or interpreters) to inject effects such as optionality, asynchronicity, parallelism, non-determinism, or error handling, depending on our needs. Need to simplify a highly concurrent piece of code for a test case so that you can debug a nasty production issue? With tagless final — no problem! ... The nice thing about Monads is they all implement the same general suite of methods (of / map / flatMap and so on) that all behave in predictable ways. This means that if we define our programs to work with Monads, and define our Algebra in a generic manner so that the concrete Monadic type is pluggable — we can reuse the same Algebra and ‘program’ definition with many different Monadic data structures. In fact, we aren’t even constrained to using Monads.
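The core idea — write the program once against an abstract algebra, then plug in interpreters — can be sketched in plain Java. Since Java lacks the higher-kinded types used in the Scala encoding, this minimal sketch parameterizes the algebra by result type rather than by a Monad; all class and method names below are illustrative, not from the article:

```java
// The algebra: an abstract description of the operations a program may use.
interface ExprAlg<T> {
    T lit(int n);
    T add(T a, T b);
}

// Interpreter 1: actually evaluate the expression.
class Eval implements ExprAlg<Integer> {
    public Integer lit(int n) { return n; }
    public Integer add(Integer a, Integer b) { return a + b; }
}

// Interpreter 2: pretty-print the expression instead of running it.
class Show implements ExprAlg<String> {
    public String lit(int n) { return Integer.toString(n); }
    public String add(String a, String b) { return "(" + a + " + " + b + ")"; }
}

public class TaglessDemo {
    // The "program" is written once, against the algebra only.
    static <T> T program(ExprAlg<T> alg) {
        return alg.add(alg.lit(1), alg.add(alg.lit(2), alg.lit(3)));
    }

    public static void main(String[] args) {
        System.out.println(program(new Eval())); // 6
        System.out.println(program(new Show())); // (1 + (2 + 3))
    }
}
```

The same `program` definition is reused unchanged with both interpreters; swapping in a test-friendly interpreter (say, a synchronous one in place of an asynchronous one) is just another implementation of the algebra.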


IoT roundup: Retrofitting vehicle tracking, plus a new IoT standard

With millions of devices communicating different kinds of information via different kinds of networks, the IoT is crying out for some standardization, to help unlock its true potential as a transformative technology. Unfortunately, there is also a profusion of different standards, whether they’re from industry umbrella groups, technical committees or vendors pronouncing their connectivity framework as “standards.” Nevertheless, the International Organization for Standardization – a pan-industrial regulatory standards-setting body that’s been around in one form or another since the 1920s – has made its considerable presence felt in the world of IoT by ratifying the Open Connectivity Foundation’s OCF 1.0 specification as an international standard. The standard mandates public-key-based security, cloud management and interoperability for IoT systems in an attempt to create a useful, open framework for IoT.


Data protection, backup and replication in the age of the cloud


For data protection we have to consider on-premises applications as well as those running on cloud software platforms such as Office 365 or Salesforce.com. Public cloud services like these do not back up data by default other than to recover from system failure, so getting emails back after deletion is the data owner’s responsibility and must be included in a data protection plan. With so much infrastructure accessible over the public internet, IT organisations also need to think about DLP – Data Loss Prevention – or, probably more accurately, data leakage protection. We’ll discuss this later when talking about security. Data protection looks to meet the needs of the business by placing service-level objectives on data protection and restore. In other words, recovery objectives drive protection goals. The two main measures are RTO (Recovery Time Objective) and RPO (Recovery Point Objective). RTO determines how quickly data and applications can be restored to operation, while RPO defines the amount of data loss that is tolerable.
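These two objectives translate into concrete constraints: worst-case data loss equals the interval between backups (RPO), and worst-case downtime is detection time plus restore time (RTO). A minimal, hypothetical sketch — the class name, methods and figures are illustrative, not from the article:

```java
public class RecoveryObjectives {
    // RPO is met only if backups run at least as often as the tolerable data loss.
    static boolean meetsRpo(double backupIntervalHours, double rpoHours) {
        return backupIntervalHours <= rpoHours;
    }

    // RTO is met only if detection plus restore fits inside the tolerable downtime.
    static boolean meetsRto(double detectHours, double restoreHours, double rtoHours) {
        return detectHours + restoreHours <= rtoHours;
    }

    public static void main(String[] args) {
        System.out.println(meetsRpo(24, 4)); // nightly backups cannot satisfy a 4-hour RPO
        System.out.println(meetsRpo(1, 4));  // hourly replication can
        System.out.println(meetsRto(3, 6, 8)); // a 6-hour restore blows an 8-hour RTO once 3 hours of detection are counted
    }
}
```

The point of the sketch is that recovery objectives are checked against the whole process — how often you protect and how long it takes to notice and restore — not against the backup job alone.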


How should CIOs manage data at the edge?


Because of this, it is imperative that CIOs build comprehensive security into any edge implementation proposal from the start. If security is bolted on after the business goals and ambitions of edge have been set, there will undoubtedly be trouble ahead. The need for processing at the edge comes from the sheer amount of data generated as our connected world expands over the coming years – according to DataAge 2025, a report sponsored by Seagate and conducted by IDC, 90% of the data created in 2025 will require security protection. More data, of course, means more vulnerability – which is why security, with intelligent data storage and data-at-rest encryption at its foundation, has to be at the heart of any business’s edge computing plans. Couple this with the increased physical concerns – more locations means that there are more sites to keep secure – and it’s clear that this is a complex challenge that must be managed methodically. Implementing edge is all about driving business growth – the new customer experiences and revenue streams that come with it will mean that your business expands and becomes more complex.


Defining RegTech: Why it Matters


As with many cases of disruption in tech, RegTech takes advantage of newer technologies. In order to provide scalability and convenient access, most RegTech offerings are based on the cloud, which helps connect new customers and provides an easy-to-access portal. Furthermore, RegTech relies on artificial intelligence for a number of tasks, including fraud detection and other actions. Machine learning, in particular, offers far better results than previous technologies, and RegTech companies can use data from their clients to improve their algorithms. RegTech is also emerging in the era of Big Data. Thanks to the modern tools and techniques used to analyze Big Data, RegTech companies can tap into large data sets and make sense of them, and they can pass this value on to their clients. Companies have long used software to aid in compliance. Furthermore, RegTech startups weren’t the first to incorporate artificial intelligence into the field. The emergence of the term RegTech, however, coincides with a time when companies are more inclined than ever to rely on cloud-based offerings.


10 predictions for the data center and the cloud in 2019

Virtualization is nice, but it’s resource-heavy. It requires a full instance of the operating system, and that can limit the number of VMs on a server, even with a lot of memory. ... A container is as small as 10MB in size vs. a few GB for a full virtual machine, and serverless, where you run a single function app, is even smaller. As apps go from monolithic to smaller, modular pieces, containers and serverless will become more appealing, both in the cloud and on premises. Key to the success of containers and serverless is that the technologies were created with both the cloud and on-premises systems in mind, with easy migration between the two, which will help their appeal. ... Bare metal means no software. You rent CPUs, memory capacity, and storage. After that, you provide your own software stack — all of it. So far, IBM has been the biggest proponent of bare-metal hosting, followed by Oracle, and with good reason. Bare metal is ideal for what’s called “lift and shift,” where you take your compute environment from the data center to a cloud provider unchanged. Just put the OS, apps, and data in someone else’s data center.



Why interest in IT4IT is on the rise


Micro Focus distinguished technologist Lars Rossen, who wrote the original IT4IT specification, recognized that to manage the delivery of services in a digital enterprise, you need to have an operating model and a clear, end-to-end understanding of all the capabilities and systems surrounding it. That includes understanding everything about the key information artifacts you manage, such as service models, incidents, subscriptions, and plans. Prior to IT4IT's introduction (and particularly the release of its second version in 2015), there was no prescription for how to do this. "There were various frameworks like ITIL that described some of the processes you need to put in place, but none of them were comprehensive, prescriptive, and vendor-agnostic," Rossen said. "We created IT4IT to address this." The first version of IT4IT was constructed in collaboration with some large customers and consultancies, including Shell, PwC, HPE, AT&T, and Accenture, Rossen explained.



Quote for the day:


"Leadership is a process of mutual stimulation which by the interplay of individual differences controls human energy in the pursuit of a common goal." -- P. Pigors