Daily Tech Digest - July 03, 2019

How serverless computing makes development easier and operations cheaper

Two of the biggest benefits of serverless computing should be clear: developers can focus on the business goals of the code they write, rather than on infrastructural questions; and organizations only pay for the compute resources they actually use in a very granular fashion, rather than buying physical hardware or renting cloud instances that mostly sit idle. As Bernard Golden points out, that latter point is of particular benefit to event-driven applications. For instance, you might have an application that is idle much of the time but under certain conditions must handle many event requests at once. Or you might have an application that processes data sent from IoT devices with limited or intermittent Internet connectivity. In both cases, the traditional approach would require provisioning a beefy server that could handle peak work capacities—but that server would be underused most of the time. With a serverless architecture, you’d only pay for the server resources you actually use. Serverless computing would also be good for specific kinds of batch processing.
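The pay-per-invocation, event-driven model described above can be sketched as a minimal function-as-a-service handler. The signature follows the AWS Lambda convention, but the event shape and field names below are illustrative assumptions, not taken from the article.

```python
# Minimal sketch of an event-driven serverless handler (AWS Lambda-style
# signature). The IoT "readings" payload is an invented example shape.
import json

def handler(event, context=None):
    """Process one burst of IoT readings; compute is billed only while running."""
    readings = event.get("readings", [])
    # Aggregate in one short-lived invocation instead of keeping a
    # mostly-idle server provisioned for peak load.
    count = len(readings)
    total = sum(r["value"] for r in readings)
    avg = total / count if count else 0.0
    return {
        "statusCode": 200,
        "body": json.dumps({"count": count, "average": avg}),
    }
```

The platform would invoke this once per event burst, so cost tracks actual usage rather than provisioned capacity.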



Tempered Networks simplifies secure network connectivity and microsegmentation

The TCP/IP protocol is the foundation of the internet and pretty much every single network out there. The protocol was designed 45 years ago and was originally only created for connectivity. There’s nothing in the protocol for security, mobility, or trusted authentication. The fundamental problem with TCP/IP is that the IP address within the protocol represents both the device location and the device identity on a network. This dual functionality of the address lacks the basic mechanisms for security and mobility of devices on a network. This is one of the reasons networks are so complicated today. To connect to things on a network or over the internet, you need VPNs, firewalls, routers, cell modems, etc. and you have all the configurations that come with ACLs, VLANs, certificates, and so on. The nightmare grows exponentially when you factor in internet of things (IoT) device connectivity and security. It’s all unsustainable at scale. Clearly, we need a more efficient and effective way to take on network connectivity, mobility, and security.



SA and NSA Network LTE
It’s reasonable to expect the nature of 5G services to change rapidly. A software-defined network infrastructure is flexible enough for CommSPs to speed the rollout of customizable applications and service models, ease customer provisioning and improve network operation and management efficiency. Staying in lock-step with evolving 3GPP specifications and 5G services implementation will require flexibility that only software-based infrastructure can provide. For example, one major US CommSP shared early plans for tiered 5G pricing based on data speeds, similar to broadband Internet pricing plans. The ability to support custom charging and new mobility service scenarios will be key to establishing, testing and evolving pricing structures and business models. ... Network engineers and architects can deploy these servers in the core or at the network edge, which makes it possible to scale to multi-terabit configurations in the core network and share consistent infrastructure and software with distributed locations. CommSPs can use this network infrastructure to apply a cloud-native architecture that drives efficiencies, speeds deployments and meets SLA requirements that are very different from those of the cloud computing industry.



TA505 Group Launches New Targeted Attacks

The evasion and anti-analysis capabilities built into modern malware tools like AndroMut highlight the need for multilayered protections. In addition to securing emails and endpoint devices, organizations need to monitor for malware communication with command-and-control systems, Dawson notes. For enterprises, the threat posed by TA505 appears to be growing, according to Proofpoint. The group is behind some of the largest email campaigns ever, including one to distribute the Locky ransomware. Through 2017 and the first half of 2018, TA505 launched such massive campaigns that they dramatically affected global malicious email volumes, Dawson says. "The group saturated organizations with Locky ransomware and the Dridex banking Trojan," he notes. When TA505 shifted to smaller — though still relatively large — campaigns distributing RATs and other malware, it triggered a similar shift in this direction among other attackers that continues today, Dawson says.


The 'Going Dark' Debate: It's Back

The impetus, as usual, is law enforcement and intelligence agencies' concern over "going dark." In other words, suspects in an investigation - centering on child abuse, terrorism, drug trafficking or any other type of criminality - might be using communications techniques on which investigators cannot easily eavesdrop. The NSC advises the president on national security matters and coordinates policies across government departments. Last week's gathering of the NSC's Deputies Committee, three unnamed people with knowledge of the meeting told Politico, does not appear to have resulted in any decision to change current policies. "The two paths were to either put out a statement or a general position on encryption and [say] that they would continue to work on a solution, or to ask Congress for legislation," one of the people told Politico. One of the chief proponents of anti-crypto legislation was Deputy Attorney General Rod Rosenstein, but with his departure, the appetite for legislation meant to tackle the "going dark" problem appears to have waned, Politico reports.


Disaster recovery readiness is essential for hybrid and multi-cloud strategies

Use of the cloud within a disaster recovery plan offers many benefits, including reliability and cost efficiency, as there is no need to invest in infrastructure that may never be used. Cloud resources can be offsite, mitigating the risk of a disaster affecting the main office location, and can be accessed (and paid for) only as needed. A multi-cloud disaster recovery strategy offers additional peace of mind that critical systems and data will remain easily accessible when needed. Although hybrid and multi-cloud deployments are widely acknowledged as good practice, IT professionals cite complexity, training gaps and a lack of internal resources to explain their hesitancy to deploy across multiple clouds. Nevertheless, more than half the respondents were operating in a multi-cloud environment, with nearly one in ten using five or more clouds within their organizations. “What we’re hearing from customers, and is consistent with our survey findings, is that they’re looking for ways to simplify and streamline their cloud deployment and management,” said Ziad Lammam, vice president of product management for Teradici.


Why AI Will Replace Rocket Scientists Before It Ever Replaces Marketers


According to world-renowned inventor and futurist Ray Kurzweil, looking at AI as a threat is unnecessary. Instead, humans should embrace technological advancements and allow them to, in turn, make us smarter. Machine learning has come a long way in recent years. AI algorithms have been honed and perfected, enabling machines to learn and update on their own. While this has affected all walks of life, when it comes to marketing, AI has helped improve the customer experience exponentially. In this day and age, consumers expect companies to always be on, and they expect messages to be personalized. AI helps marketers achieve this level of personalization without having to work 24/7. It’s ironic that this automated, mechanical tool is making marketing more personalized and human. ... This is where a marketer's touch and human intelligence come into play. At this point, and in the near future, there will still be a need for a human marketer behind AI tech to help steer the campaign in the right direction. If you're looking for evidence of this, consider the many issues that came to light as programmatic advertising gained traction.


Robot maps a room using just sound and AI

The researchers note that the shape of a room can be acoustically determined from corresponding room impulse responses (RIR), which can be extracted from recorded sound signals. Exploiting this fact, they considered time of arrivals (TOAs), or the time it takes for sound to travel from a source to a microphone. If the TOAs are known, they posited, the distance from the microphone to a target location can be inversely computed. But knowing TOAs isn’t enough, because the distances are unlabelled and acoustic sensors record reflections and echoes in an arbitrary order. To solve for this, the team tapped a four-microphone array (the fourth microphone was used to verify the distances) and used a reflective point, which in this case refers to the intersection between the line from a target spot in the room to a microphone and a potential wall line. If the reflective point and the real sound source were on different sides of a reconstructed or potential wall line, the proposed system treated the target spot as noise and discarded the data.
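The TOA-to-distance step described above reduces to multiplying the time of arrival by the speed of sound, and the reflective-point filter is essentially a test of which side of a candidate wall line two points lie on. The sketch below illustrates both ideas generically; it is not the authors' code, and the speed-of-sound constant and half-plane test are assumptions.

```python
# Sketch of the time-of-arrival (TOA) geometry described in the article.
# Speed of sound is assumed at ~343 m/s (air at roughly 20 degrees C).
SPEED_OF_SOUND = 343.0  # metres per second

def toa_to_distance(toa_seconds):
    """Distance sound travels in the measured time of arrival."""
    return SPEED_OF_SOUND * toa_seconds

def same_side(p, q, line):
    """True if 2-D points p and q lie on the same side of the line
    a*x + b*y + c = 0 (given as the tuple (a, b, c)).

    In the paper's terms: if the reflective point and the real source
    are NOT on the same side of a candidate wall line, the target spot
    is treated as noise and discarded.
    """
    a, b, c = line
    side_p = a * p[0] + b * p[1] + c
    side_q = a * q[0] + b * q[1] + c
    return side_p * side_q > 0
```

For example, a 10 ms TOA corresponds to a source-to-microphone path of about 3.43 m.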


Origami Inspired Robot Can Pack Your Groceries

The latest soft robot from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), however, breaks the mould. Appearing more like a flower than a piece of machinery, CSAIL’s Origami Robot Gripper is a collapsible skeleton that can suck up objects using a vacuum. Its rubber skin aids with grip, allowing the robot to pick up items from any angle regardless of their shape. Currently, ‘hard’ robots struggle with non-standard shapes and, unlike the origami-inspired bot, can easily apply too much or too little force. Another benefit of the soft robot is that it is lightweight and made with relatively inexpensive materials. This means it is cheaper and, thanks to a simple design, less complex to make. And, instead of requiring extensive programming to handle different shapes and sizes, its vacuum can pick up a variety of products, from mushrooms to bottles of wine. It’s also capable of lifting 100 times its own weight. Scale up the design, and the Origami gripper could reliably retrieve a whole range of items. The most obvious application for MIT’s model is in groceries, either at physical checkouts or in warehouses.


More US Cities Battered by Ransomware

Despite advice from the FBI that organizations should not pay ransoms, the decision is increasingly being looked at from a cost/benefit perspective. Insurance policies may cover ransoms, and the option may look appealing if the cost of recovery is more than the ransom. And as ProPublica reported last month, some forensics firms that claim to be able to resolve a ransomware infection are actually paying the ransom while passing the cost on to their customers. Plus, there's the vexing question over who is profiting from the ransom. ProPublica traced four ransom payments made by Proven Data Recovery, a firm based in New York. The payments - made to get the decryption key for a SamSam infection - ended up in bitcoin wallets linked to Iran. The city of Baltimore, however, refused to pay a ransom after a recent attack and endured an estimated $18 million in recovery costs. The city was affected by the RobbinHood ransomware, which forced it to revert to manual processes.



Quote for the day:


"Great achievers are driven, not so much by the pursuit of success, but by the fear of failure." -- Larry Ellison


Daily Tech Digest - July 02, 2019

TIN coalition calls for industry action against cyber fraud


The vision for overcoming social engineering challenges is to reduce the opportunities to establish false trust and to ensure that all remaining threats are well publicised and understood. The vision also requires organisations to interact with customers and staff in a way that reinforces security and to ensure that the security of interactions with individuals becomes less dependent on public information. To address operating in silos, the vision is to ensure that cyber fraud is understood across functions within and between organisations, that organisations are recognised for sharing useful information rather than punished for suffering an attack, and that business and law enforcement collaborate effectively to tackle cyber fraud. And to reduce the gap between cyber security and anti-fraud operations, the vision is to ensure that the response to cyber attacks minimises the broader impact of data loss on society, that fraud teams in business and law enforcement are fully engaged in tackling cyber attacks as a precursor to fraud, and that enforcement is globalised to tackle all forms of cyber fraud.


Big Data Is Dead. Long Live Big Data AI.

“The value of the data analytics market can’t be ignored. The Looker and Tableau acquisitions demonstrate that even the biggest tech players are snapping up data analytics companies with big price tags, clearly demonstrating the value these companies have in the larger cloud ecosystem. And in terms of what this means for the evolution of AI, we’ve reached a point where we have more than enough anonymized data to train the system, and now it’s a matter of honing how we use the AI to extract the maximum value from data”—Amir Orad, CEO, Sisense. “The Google Cloud/Looker and Salesforce/Tableau acquisitions are a direct reaction to the rate at which analytics workloads have been shifting to the cloud over the past few years. The state of AI is a reflection of this shift, as machine learning, AI and analytics have become the primary growth opportunities for the cloud today. Yet it's this same growth that is creating barriers to success, as AI projects overwhelmingly face the same problem -- data quality”—Adam Wilson, CEO, Trifacta.


What can you do with the Microsoft Graph?


Working with the APIs can be tricky; it can be hard to construct the right query, especially if you're looking for more complex graph queries. Microsoft offers tools to help build and test queries, as well as SDKs that can simplify adding Graph support to your apps. One, the web-based Graph Explorer, allows you to try out queries without logging in to an Office 365 account. It provides sample queries that show how to extract specific information from the service, with a library of different queries to get started. You can only use GET queries against sample data; POST requires your account details and your data. Once you're ready to start working with live data, you can log in with a Microsoft account and start using your Microsoft 365 tenant. The list of query categories is long, covering working with users, with mail and calendar, as well as files and apps. The Graph Explorer doesn't only show production queries; it also supports beta APIs, so you can experiment before adding them to your code. Queries can be cut and pasted from the Explorer, and you can see any request headers or bodies that need to be constructed and delivered with the REST HTTP query.
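As a sketch of what Graph Explorer generates under the hood, the snippet below composes a Microsoft Graph GET request (URL plus Authorization header) without sending it. The token string is a placeholder; in practice it would come from an OAuth flow, and the composed request would be sent with any HTTP client.

```python
# Sketch of composing a Microsoft Graph GET request outside Graph Explorer.
# The access token is a dummy placeholder, not a real credential.
def build_graph_request(resource, token, version="v1.0"):
    """Return the URL and headers for a Microsoft Graph GET query.

    version is "v1.0" for production APIs or "beta" for the preview APIs
    that Graph Explorer also exposes.
    """
    url = f"https://graph.microsoft.com/{version}/{resource.lstrip('/')}"
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/json",
    }
    return url, headers

# Example: the query behind "get my mail" for the signed-in user.
url, headers = build_graph_request("me/messages", "EXAMPLE_TOKEN")
```

With a real token, `requests.get(url, headers=headers)` (or any HTTP client) would execute the query.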



Offensive Security launches OffSec Flex, a new cybersecurity training program

Organizations can now use OffSec Flex to purchase blocks of Offensive Security’s industry-leading practical, hands-on training, certification and virtual lab offerings, allowing them to proactively increase and enhance the level of cybersecurity talent available within their organizations. With Offensive Security’s hands-on courses, labs and exams readily available, organizations are able to offer educational opportunities to new hires and non-security team members alike, improving their security posture and equipping their employees with the adversarial mindset necessary to protect modern enterprises from today’s threats. “Cybersecurity training is not just for security professionals anymore,” said Kerry Ancheta, VP of Worldwide Sales, Offensive Security. “Increasingly we see organizations recommend pentest training courses for their software development or application security teams in order to improve their understanding of how their systems and applications are attacked.”


Calculating The Cost of Software Quality in Your Organization


Basically, the costs of software quality (COSQ) are those costs incurred through both meeting and not meeting the customer’s quality expectations. In other words, there are costs associated with defects, but producing a defect-free product or service has a cost as well. Calculating these costs serves the purpose of identifying just how much the organization spends to meet the customer’s expectations, and how much it spends (or loses) when it does not.  Knowing these values allows management and team members across the company to take action in ensuring high quality at a lower cost. While analyzing the COSQ at an organization may lead to the revelation of uncomfortable truths about the state of quality management at the company, the process is important for eliminating waste associated with poor quality. This often requires a mindset and culture shift from viewing software quality defects as individual failures to seeing them as opportunities to improve as a collective team.
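One common way to make COSQ concrete is the classic split between the cost of conformance (prevention and appraisal activities) and the cost of non-conformance (internal and external failures). The sketch below applies that split; the dollar figures are invented purely for illustration.

```python
# Illustrative cost-of-software-quality (COSQ) breakdown using the classic
# conformance / non-conformance split. All figures are made-up examples.
def cosq(prevention, appraisal, internal_failure, external_failure):
    """Total COSQ = cost of conformance + cost of non-conformance."""
    conformance = prevention + appraisal                   # e.g. training, testing
    nonconformance = internal_failure + external_failure   # e.g. rework, support escalations
    return {
        "conformance": conformance,
        "nonconformance": nonconformance,
        "total": conformance + nonconformance,
    }

# Hypothetical annual figures for one team:
breakdown = cosq(prevention=40_000, appraisal=60_000,
                 internal_failure=75_000, external_failure=25_000)
```

Tracking the two halves separately is what lets a team see whether spending more on prevention actually shrinks the (usually larger) failure costs.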


Machine learning has been used to automatically translate long-lost languages


It’s not hard to imagine that recent advances in machine translation might help. In just a few years, the study of linguistics has been revolutionized by the availability of huge annotated databases, and techniques for getting machines to learn from them. Consequently, machine translation from one language to another has become routine. And although it isn’t perfect, these methods have provided an entirely new way to think about language. Enter Jiaming Luo and Regina Barzilay from MIT and Yuan Cao from Google’s AI lab in Mountain View, California. This team has developed a machine-learning system capable of deciphering lost languages, and they’ve demonstrated it by having it decipher Linear B—the first time this has been done automatically. The approach they used was very different from the standard machine translation techniques. First some background. The big idea behind machine translation is the understanding that words are related to each other in similar ways, regardless of the language involved.


The Agile Manifesto: A Software Architect's Perspective

Specifications with an architectural impact (in the form of new user stories) should be tracked by the architect and assessed pragmatically by the whole development team, including experienced developers, test engineers, and DevOps. Bad habits from the past, when the architect created the full-blown technical design for the team on paper, do not fit within modern agile environments. There are multiple flaws with this model, which I have also faced in my day-to-day work. First and most important, the architect might be wrong. This happened to me after I created a detailed upfront technical design and presented it to the development team during Sprint refinements. I got questions about cases I had not thought of or had failed to take into account. In most cases, it turned out the initial design was either incomplete or impractical, and required extra work. Big upfront design limits the creativity and autonomy of the team members, since they must follow a recipe that has already been handed down. From a psychological standpoint, even the author might become biased and more reluctant to change it afterwards, trying to prove it is correct rather than admitting its flaws.


Essential tips for scaling quality AI data labeling


Data scientists are using labeled data and natural language processing (NLP) to automate legal contract review and predict patients who are at higher risk of chronic illness. The success of these systems depends on skilled humans in the loop, who label and structure the data for machine learning (ML). High-quality data yields better model performance. When data labeling is low quality, an ML model will struggle to learn. According to a report by analyst firm Cognilytica, about 80 percent of AI project time is spent on aggregating, cleaning, labeling, and augmenting data to be used in ML models. Just 20 percent of AI project time is spent on algorithm development, model training and tuning, and ML operationalization. These tasks are at the heart of AI development and require strategic thinking, along with a more advanced set of engineering or computer science skills. It’s best to deploy more expensive human resources — such as data scientists and ML engineers — on tasks that require expertise, collaboration, and analytical skills.


Effective or Not? The Real Impact of GDPR


The General Data Protection Regulation wasn’t just meant to give governments the means to enforce data security rules. Another key objective was to change how both companies and users behave when it comes to ensuring personal data remains private and protected. In this sense, GDPR seems to have had the desired impact. ... Another interesting fact the data shows is that users may have shifted some of their own responsibility onto GDPR enforcers. Two indicators led to this observation: respondents are less likely to read privacy statements than they were in 2015 (-7 percentage points), and 17% say it is enough for them to see that the website has a privacy policy, so they choose not to read the document at all. A similar behavior pattern emerges with social media usage. Fewer users – 56% in 2019 vs 60% in 2015 – actually change the privacy settings for their personal profile. The three most common reasons social network users give for not trying to change their personal profile’s default settings are that they trust the sites to set appropriate privacy settings (29%), that they do not know how to (27%), or that they are not worried about sharing their personal data (20%).


5 steps for digital workplace transformation


Start by recognizing actionable opportunities within your business operations. Approach the prospects for digital transformation from a business instead of technology perspective. Line-of-business (LOB) teams should lead this effort, coordinating closely with senior IT staffers to identify critical barriers to success. Of course, each organization faces its own set of challenges. But, at the outset, step back and identify key themes -- accelerating innovation, enhancing productivity, improving governance or reshaping the steps in the customer journey -- that make good business sense. Consider operations as a whole, while focusing on people and processes, and determine your target audiences: employees, partners and/or customers. Then, engage a cross section of these audiences in conversations about what they are doing and how they understand the underlying business purposes. Develop both the technology and the business insights about what is happening from the participants' perspectives. Listen carefully as they describe their tasks, and be sure to observe how they do their work to determine where bottlenecks occur.



Quote for the day:


“The real voyage of discovery consists not in seeking new landscapes but in having new eyes.” -- Marcel Proust


Daily Tech Digest - July 01, 2019

Automation Is Becoming A C-Suite Priority

While automation maturity is at its highest in the US, with over 60% of organizations making extensive use of automation, there are some interesting findings from India. The country shows the maximum level of enthusiasm about automation among CIOs and other senior executives. 84% believe RPA is a high or essential priority to meet strategic business objectives for Indian businesses, as against the global average of 76%. Also, 90% of C-level executives expect their company’s financial results – namely profitability, operating costs and revenue growth – to improve as a result of automation. Sector-wise, IT and manufacturing have outpaced other industries in automating business processes. By contrast, government and public sector institutions have made the least headway among surveyed sectors. Of CIOs who have implemented automation, most have automated highly repetitive back-office functions. “Automation of functions is most extensive in IT, operations and production, customer service and finance.”



FTC data privacy enforcement will threaten corporate bottom lines


Despite the mounting concerns over data security and privacy practices that put consumers’ data at risk, the U.S. Congress still has yet to adopt national legislation to address cybersecurity, and security spending will see a nominal increase given the current administration’s recent budget proposal. Consequently, organizations are subject to a patchwork of laws and regulations relevant to cybersecurity and privacy practices, including differing laws and regulations in each state and the District of Columbia, as well as from multiple federal administrative agencies. Therefore, the FTC has taken a comprehensive directive to extend its supervision over all companies operating in the United States. In fact, the FTC has assumed a leading role in policing corporate cybersecurity practices since 2002. Since that time, it brought more than 200 cases against companies for unfair or deceptive practices that endangered the personal data of consumers.



Huge jump in cyber incidents reported by finance sector


Overall, Snaith said there remain serious vulnerabilities across some financial services businesses when it comes to the effectiveness of their cyber controls. “More needs to be done to embed a cyber resilient culture and ensure effective incident reporting processes are in place,” he said. UK law enforcement is also calling for improvements in cyber crime reporting. “It is crucial that businesses report cyber crime to us because every incident is an investigative opportunity,” Rob Jones, director of threat leadership at the UK National Crime Agency (NCA), told Computer Weekly. “Failure to report creates an unpoliced space and a situation where incident response companies just sweep up the glass, but don’t deal with the underlying issue, which emboldens criminals,” he said. “As a result, the problem will continue and the prevalence, severity and sophistication of attacks will increase.” Nigel Hawthorn, data privacy expert at security firm McAfee, said it is widely recognised that cyber incidents were previously under-reported.


How does the CVE scoring system work?

The first thing to understand is that there are three types of metrics used in this system: Base Score Metrics, which depend on sub-formulas for the Impact Sub-Score (ISS), Impact, and Exploitability; Temporal Score Metrics, equal to a round-up of BaseScore * ExploitCodeMaturity * RemediationLevel * ReportConfidence; and Environmental Score Metrics, which depend on sub-formulas for the Modified Impact Sub-Score (MISS), ModifiedImpact, and ModifiedExploitability. The MISS formula is Minimum(1 - [(1 - ConfidentialityRequirement * ModifiedConfidentiality) * (1 - IntegrityRequirement * ModifiedIntegrity) * (1 - AvailabilityRequirement * ModifiedAvailability)], 0.915). Within each set of metrics are the following sub-categories: Base Score Metrics: Attack Vector, Attack Complexity, Privileges Required, User Interaction, Scope, Confidentiality Impact, Integrity Impact, Availability Impact; Temporal Score Metrics: Exploitability, Remediation Level, Report Confidence; and Environmental Score Metrics: Attack Vector, Attack Complexity, Privileges Required, User Interaction, Scope, Confidentiality Impact, Integrity Impact, Availability Impact, ...
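The environmental sub-formula quoted above can be transcribed directly into code. The sketch below implements just the Modified Impact Sub-Score (MISS), taking the requirement and modified-impact metrics as already-looked-up numeric weights (in CVSS the letter values map to numbers, e.g. High impact is 0.56).

```python
# Direct transcription of the Modified Impact Sub-Score (MISS) formula
# from the CVSS v3.1 specification, as quoted in the article:
#   MISS = Minimum(1 - [(1 - CR*MC) * (1 - IR*MI) * (1 - AR*MA)], 0.915)
def modified_impact_subscore(cr, mc, ir, mi, ar, ma):
    """cr/ir/ar: Confidentiality/Integrity/Availability Requirement weights.
    mc/mi/ma: Modified Confidentiality/Integrity/Availability impact weights.
    """
    inner = (1 - cr * mc) * (1 - ir * mi) * (1 - ar * ma)
    return min(1 - inner, 0.915)
```

The 0.915 cap means the sub-score saturates even when every component impact is at its maximum.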


AI is changing the entire nature of compute


"Hardware capabilities and software tools both motivate and limit the type of ideas that AI researchers will imagine and will allow themselves to pursue," said LeCun. "The tools at our disposal fashion our thoughts more than we care to admit." It's not hard to see how that's already been the case. The rise of deep learning, starting in 2006, came about not only because of tons of data and new techniques in machine learning, such as "dropout," but also because of greater and greater compute power. In particular, the increasing use of graphics processing units, or "GPUs," from Nvidia led to greater parallelization of compute. That made possible the training of vastly larger networks than in the past. The premise offered in the 1980s of "parallel distributed processing," where nodes of an artificial network are trained simultaneously, finally became a reality. Machine learning is now poised to take over the majority of the world's computing activity, some believe. During that ISSCC in February, LeCun spoke to ZDNet about the shifting landscape of computing.


SOLID Principles: Interface Segregation Principle (ISP)

A great, simple definition of the Interface Segregation Principle was given in the book you have already heard of, “Agile Principles, Patterns, and Practices in C#”. So, the definition is: “The Interface Segregation Principle states that Clients should not be forced to depend on methods they do not use.” ... Here is an interesting historical note about the ISP. I’m pretty sure the idea was in use long before Robert Martin, but the first public formulation belongs to Robert C. Martin. He applied the ISP for the first time while consulting for Xerox. Xerox had created a new printer system that could perform a variety of tasks such as stapling and faxing. The software for this system was created from the ground up. As the software grew, making modifications became more and more difficult, so that even the smallest change would take a redeployment cycle of an hour, which made development nearly impossible. The redeployment cycle took so much time because there was no C# or Java at that time; those languages compile very fast, which can’t be said of C++, for example. Bad design of a C++ program can lead to significant compilation time.
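A minimal sketch of the principle, echoing the Xerox story: rather than one fat interface that forces every client to depend on stapling and faxing it never uses, each capability gets its own small interface that clients opt into. The class names here are illustrative, not from the book.

```python
# Interface Segregation Principle sketch: small, capability-specific
# interfaces instead of one fat multi-function-device interface.
from abc import ABC, abstractmethod

class Printer(ABC):
    @abstractmethod
    def print_doc(self, doc): ...

class Stapler(ABC):
    @abstractmethod
    def staple(self, doc): ...

class SimplePrinter(Printer):
    """Depends only on the printing interface it actually uses."""
    def print_doc(self, doc):
        return f"printed: {doc}"

class CopyRoomMachine(Printer, Stapler):
    """Opts in to both capabilities by implementing both small interfaces."""
    def print_doc(self, doc):
        return f"printed: {doc}"
    def staple(self, doc):
        return f"stapled: {doc}"
```

A change to the stapling interface now never forces `SimplePrinter` (or its clients) to recompile or redeploy, which is exactly the coupling the Xerox team needed to break.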


Mist Wi-Fi no longer just cloud


The Mist Edge hardware appliance avoids having access points (APs) in each office on a campus communicate directly with Mist's WxLAN technology in the cloud. Instead, WxLAN policies created through Mist's cloud-based dashboard are stored in the on-premises appliance. WxLAN policies assign resources, such as servers and printers, to groups of users. Network managers can also create a service set identifier for a select group of users and assign services or devices that only they can access. Mist Edge is available only as a stand-alone appliance. Mist plans to ship the appliance's software on a virtual machine this year. Mist Edge reflects the preference of some enterprises to split management technology between the cloud and on premises. Companies more comfortable with an on-site WLAN controller, for example, could switch to Mist Edge, said Brandon Butler, an analyst at IDC. "Overall, we see more and more enterprises gaining comfort with managing their WLAN environments from the cloud but giving customers a choice in how to manage their environments is always good," he said.


Beyond Limits: Rethinking the next generation of AI

Beyond Limits evolved out of work with NASA's Jet Propulsion Laboratory (JPL) for remote rovers used to explore places like the moon and Mars. Due to the communications lag in space, real-time control is virtually impossible. Any AI solution must not only be fully autonomous, it must be able to train and, ideally, correct itself. When there is a problem it can’t correct, the bandwidth limitations for communication make full reprogramming problematic, but point patches are certainly possible. This resulted in an AI platform uniquely able to be updated, modified and, to a certain and initially limited extent, able to both teach itself and make corrections while disconnected. This unusual requirement has likely made the resulting AI nearly ideal for areas where the AI must often act independently of oversight – and/or in areas where problems can escalate very rapidly – and where the AI must be able to deal with a diversity of known and unknown issues. ... Although still in its infancy, Beyond Limits represents a new class of AI. It’s better enabled to operate fully autonomously, and it can both learn on the fly and increasingly make corrections to its own programming.


HPE promises 100% reliability with its new storage system

Primera was announced last week at HPE’s Discover event in Las Vegas. Phil Davis, chief sales officer for HPE, said in the announcement keynote, “If you think about traditional storage, it’s full of compromises and complexity. Do I want fast or reliable? Do I want agility or simplicity? But not any more. We’re going to combine the simplicity of Nimble with the intelligence of Infosight and mission-critical heritage of 3Par and we’ve created a new class of storage that eliminates the traditional compromises and truly redefines what is possible with storage.” Davis said Primera will run out of the box with just a few cable connections and can be auto-provisioning storage within 20 minutes. That means no need for IT consultants to install and configure the hardware. The more workloads you add to a storage system, the more unpredictable latency becomes. Using InfoSight’s parallelism, Primera improved throughput and latency of an Oracle database by 122% over the prior storage system, which HPE did not identify.


Using AI-powered intelligent automation for digital transformation success

A maturity model assessment begins with evaluating automation readiness from a technology and process perspective. IT should be involved in the discussion early on because they understand how automation technologies will fit within the larger IT framework. They’re also responsible for managing the environments that these technologies operate in and for ensuring proper security protocols are followed throughout the deployment process. From a business process and operations standpoint, organizations should assess how well-documented current processes are during this stage. If there’s room to improve prior to automation, this presents an opportunity to make upfront investments in this respect. Automation is most powerful when deployed against processes that are already running properly; it isn’t intended to ‘fix’ or alleviate the pain points around broken processes. In other words, optimize first and then automate for the best results.



Quote for the day:


"One of the sad truths about leadership is that, the higher up the ladder you travel, the less you know." -- Margaret Heffernan


Daily Tech Digest - June 30, 2019

How a quantum computer could break 2048-bit RSA encryption in 8 hours


Shor showed that a sufficiently powerful quantum computer could do this with ease, a result that sent shock waves through the security industry. And since then, quantum computers have been increasing in power. In 2012, physicists used a four-qubit quantum computer to factor 143. Then in 2014 they used a similar device to factor 56,153. It’s easy to imagine that at this rate of progress, quantum computers should soon be able to outperform the best classical ones. Not so. It turns out that quantum factoring is much harder in practice than might otherwise be expected. The reason is that noise becomes a significant problem for large quantum computers. And the best way currently to tackle noise is to use error-correcting codes that require significant extra qubits themselves. Taking this into account dramatically increases the resources required to factor 2048-bit numbers. In 2015, researchers estimated that a quantum computer would need a billion qubits to do the job reliably. That’s significantly more than the 70 qubits in today’s state-of-the-art quantum computers.
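The reduction Shor exploits can be sketched classically: factoring N reduces to finding the multiplicative order r of a random a modulo N, and it is only that order-finding step the quantum computer accelerates. A minimal pure-Python illustration (hopelessly slow for 2048-bit numbers, which is exactly the point):

```python
from math import gcd
from random import randrange

def order(a, n):
    # Classically find the multiplicative order r of a mod n --
    # the step a quantum computer speeds up via period finding.
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n):
    # Classical walk-through of Shor's reduction from factoring
    # to order finding; n must be composite (e.g. 15).
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d           # lucky draw: a already shares a factor
        r = order(a, n)
        if r % 2:
            continue           # need an even order
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue           # trivial square root, try another a
        return gcd(y - 1, n)   # nontrivial factor of n

print(shor_classical(15))  # a nontrivial factor: 3 or 5
```

The `order` loop is exponential in the bit-length of n, which is why the quantum period-finding subroutine (and the error-corrected qubits it demands) matters at all.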



How Urbanhire is disrupting HR in Indonesia

Specifically, the hiring platform allows companies to post jobs across more than 50 portals, including Google, LinkedIn and Line - a freeware app which became Japan’s largest social network in 2013. Tapping into a pool of more than one million active jobseekers, the software-as-a-service (SaaS) platform follows a “data-driven hiring strategy”, aligning businesses to a four-step digital strategy of “source, assess, recruit and on-board”. Three years since launching, key customers include global brands such as AIA, Zurich and The Body Shop, in addition to Indonesian organisations like Danamon, Pertamina and Djarum. “Indonesia is a fantastic opportunity given where it is at from a growth perspective,” Kamstra added. “As a tech entrepreneur, I love the fact that we can use business models that have been successful in more developed countries without a lot of the baggage that comes with historical tech implementations that are no longer sufficient. “I love to use the telecom industry as an example. Indonesia was able to go from little infrastructure to a very modern one by not having gone through all the investment steps that countries like the US were forced to do as pioneers.”


Don't Miss These 10 Cybersecurity Blind Spots

When an employee is terminated, it’s important to shut down their access to all work-related accounts — immediately. Ideally, you might want to try to automate as much of the account-termination process as possible and ensure that the process covers all accounts for all employees. This can be easier said than done, but it's important to get a process or automated solution nailed down before that employee's access causes an unwanted breach. ... Any application that uses third-party software components, including open-source components, takes on the risk of potential vulnerabilities in those dependencies. These vulnerabilities should be identified, tracked and accounted for in the same way as every other software component. ... Service accounts are used by machines, and user accounts are used by humans. The trouble with service accounts is that sometimes they have access to a lot of different systems, and their passwords aren’t always managed well. Poorly managed passwords make for easy compromise by attackers.
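A sketch of what an automated offboarding pass might look like. The system registry and disable calls below are hypothetical stand-ins for whatever APIs your directory service, VPN, and SaaS admin tools actually expose:

```python
# Sketch of an automated account-termination pass. The registry below is
# hypothetical -- in practice each entry would wrap a real API client.

def make_disabler(store):
    # Build a stand-in "disable this user" call backed by a dict.
    def disable(username):
        store[username] = "disabled"
        return True
    return disable

def offboard(username, systems):
    """Disable a departing employee's access across every registered system.

    Returns the list of systems that failed, so the run can be retried or
    escalated instead of silently leaving access behind.
    """
    failures = []
    for name, disable in systems.items():
        try:
            if not disable(username):
                failures.append(name)
        except Exception:
            failures.append(name)
    return failures

# Demo with in-memory stand-ins for real systems.
directory, vpn, crm = {}, {}, {}
systems = {
    "directory": make_disabler(directory),
    "vpn": make_disabler(vpn),
    "crm": make_disabler(crm),
}
print(offboard("jdoe", systems))  # [] -> every system disabled the account
print(directory)                  # {'jdoe': 'disabled'}
```

The important design point is the returned failure list: an offboarding run that partially succeeds and says nothing is exactly the blind spot the article warns about.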


Business needs to see infosec pros as trusted advisers

The first issue clouding communication between security professionals and the board or senior business leaders is the misunderstanding that IT risk is separate from business risk. Nothing could be further from the truth, especially considering that in most organisations today, the separation between what is IT and what is business is hard to identify because technology is the backbone of everything the business does. The second issue relates to how the message is packaged. Is the language full of technical jargon, or is it simple to understand and gets the message across in business terms? Does it highlight the loss to the business in terms understood by the board and senior business leaders? Take the example of when business downtime is required when a patch needs to be applied. Instead of talking in terms of the technical threats and the outcomes of poor patching, security professionals would be more effective explaining it in terms of loss to the business, such as lost opportunities or losses from an attack that may occur because of the unpatched status.


MongoDB CEO explains where the company has an edge over database giant Oracle


Cramer noted that Oracle, which has a nearly $195 billion market cap, has recently bought back billions in stock and has a big war chest. Despite that, MongoDB's architecture sets the younger company apart, Ittycheria said. The firm's database is built for the modern world, he added. "[Oracle] built an architecture designed in the late '70s for the world then, and they just tried to make it better over time," he said. "We built an architecture designed for today's high-performance mobile cloud computing world." Ittycheria explained how MongoDB helped Cisco address an order management application issue in which it receives tens of billions of orders from different sales channels a year. The platform serves more than 14,000 customers, including some of the most "sophisticated, demanding customers in the world." The list ranges from big media to telecom to gaming media to financial services, he said. Start-ups are also developing their business on MongoDB, Ittycheria said.


Fix your cloud security

Enterprises are either not willing to use the right technology, or they don’t understand that the technology exists. It’s not that the database is unencrypted, it’s that nobody has any idea how to turn on encryption in flight or at rest. Also at fault are the “it was not on-premises” folks out there. They cling to the fact that since some security feature was not a part of the original on-premises system, it shouldn’t be needed in the cloud. The time to deal with security issues is when you move from on-premises to the public cloud. You need to spend at least a couple of weeks looking at identity access management, encryption, auditing, proactive security, and more, and then evaluating their viability for your enterprise. Otherwise, you could miss the cloud security boat as you make the migration. In my opinion, this is the single most important step in migration. It allows you to reflect on what your security needs really are and how to solve them using cloud computing technology, which, these days, is better than anything you can find on-premises.
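To make "turn on encryption at rest" concrete: on AWS, for example, a default server-side encryption rule can be attached to an S3 bucket with a single CLI call. This is a configuration sketch, and the bucket name is a placeholder:

```shell
# Require server-side (KMS) encryption by default on a bucket.
aws s3api put-bucket-encryption \
  --bucket my-bucket \
  --server-side-encryption-configuration '{
    "Rules": [{
      "ApplyServerSideEncryptionByDefault": { "SSEAlgorithm": "aws:kms" }
    }]
  }'

# Verify the rule took effect.
aws s3api get-bucket-encryption --bucket my-bucket
```

The point is less the specific command than the habit: each item on the migration checklist (IAM, encryption, auditing) usually maps to a small, discoverable configuration like this one.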


Can Apple compete on privacy?


Apple's privacy campaign has already had an impact in terms of forcing the competition to pay closer attention to their disclosure and controls. It is unlikely to move the needle in terms of market share, but Apple can only gain as awareness of the great data tradeoff of targeted advertising grows and missteps in executing it continue. It should also be more effective as a retention tool for anyone who has not already been locked into Apple's milieu through its self-reinforcing portfolio of devices and growing family of services. Furthermore, while the smartphone market is mature, whatever challenges it as an emerging platform will likely raise even more profound privacy concerns. Already, wearables measure our pulse and assess whether we've fallen, and the kind of personal data that could be generated by measuring exactly what you're looking at via augmented reality gear could make smartphone-generated data seem crude by comparison. And there's another potential benefit to Apple's privacy campaign, one that the company has developed since it first stepped up its advocacy.


Serverless: applications only when you need them - no more, no less

Traditional IT architectures use a server infrastructure, whether on-premises or cloud-based, that requires managing the systems and services needed for an application to function. The application must always be running, and the organization must spin up other instances of the application to handle more load, which tends to be resource-intensive. Serverless architecture focuses instead on having the infrastructure provided by a third party, with the organization only providing the code for the applications, broken down into functions that are hosted by the third party. This allows the application to scale based on function usage and is more cost-effective, since the third party charges for how often the application uses the function instead of having the application running all the time. ... Serverless computing is constrained by performance requirements, resource limits, and security concerns, but excels at reducing compute costs. That being said, where feasible, one should gradually migrate to serverless infrastructure to make sure it can handle the application requirements before phasing out the legacy infrastructure.
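The "code broken down into functions hosted by a third party" model can be illustrated with a handler written in the style AWS Lambda uses for Python. The event shape below mimics an API Gateway proxy request and is an assumption for illustration:

```python
# Minimal sketch of a serverless function in the AWS Lambda handler
# style: the platform invokes handler(event, context) per request and
# bills per invocation, so nothing runs (or costs) between events.

import json

def handler(event, context):
    # A single-purpose function: echo a greeting for the caller.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Locally we can exercise the handler directly; in production the
# provider supplies the event and context objects.
resp = handler({"queryStringParameters": {"name": "dev"}}, None)
print(resp["statusCode"], resp["body"])
```

Because the unit of deployment is one function rather than a running server, scaling and billing both follow invocation counts, which is the cost argument made above.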


Four Myths of Digital Transformation: What Only 8% of Companies Know


New research by Bain & Company finds that only 8% of global companies have been able to achieve their targeted business outcomes from their investments in digital technology. Said another way, more than 90% of companies are still struggling to deliver on the promise of a technology-enabled business model. What secret formula do the 8% deploy? Unsurprisingly, there are no shortcuts or silver bullets. But successful transformations do share some common themes. One of the most important is understanding that this is really a business transformation, supported by investments in new technology—not new technology in search of opportunities. Many executives pay lip service to this idea, but in practice, they delegate too much responsibility to the tech team, hoping the business can watch from the sidelines. At the 8%, executive teams understand that the core of a digital transformation is a business transformation, changing the way of engaging customers across channels, simplifying business processes, and redesigning products or services.


“We need to up our game”—DHS cybersecurity director on Iran and ransomware

Both the Iranian malicious activities and the ransomware attacks depend on exploiting the same sorts of security issues, and both rely largely on the same tactics: malicious attachments, stolen credentials, or brute-force credential attacks to gain a foothold on targeted networks, usually using readily available malware to leverage those credentials and move across a network. When asked whether the recent ransomware attacks on cities across the US (including three recent attacks in Florida with dramatically larger ransom demands) were indicative of a new, more targeted set of campaigns against US local governments, Krebs said that the attacks were likely not targeted—at least not initially. "I still think these [ransomware campaigns] are fairly expansive efforts, where [the attackers] are initially scanning, looking for certain vulnerabilities, and when they find one, that's when they start to target," he said. "Again, I'm not sure we have the information right now saying they were specifically targeted."



Quote for the day:


"Leaders stuck in old cow paths are destined to repeat the same mistakes. Change leaders recognize the need to avoid old paths, old ideas and old plans." -- Reed Markham


Daily Tech Digest - June 29, 2019

India gears up for historic data protection law

India is getting ready for the law after the Narendra Modi government listed it as one of the bills in Parliament last week, in the first session after the general elections. The election in April gave Modi a second term with a massive majority. The bill will create a legal regime for how data can be shared, stored and used in India. The proposed law, once passed by Parliament, will have major consequences for technology companies hoping to build businesses that access user data. Most technology companies, like Google, Facebook and others, now thrive on data generated by their users to earn billions of dollars worldwide. Mukesh Ambani, who heads Reliance Industries, India’s biggest firm working in the energy, telecommunications and retail space, pointed out in January this year that “data is the new oil”. However, while the cabinet is yet to clear the final draft of the data protection bill, technology and privacy experts are concerned about the implications of the proposed new law. Many companies, including the technology giants hoping to tap into India’s massive markets, are apprehensive about what this entails for their core business models.



“You have to be careful when there is sensitivity around personal data,” Kampman said. Whether it’s AI or any identity-related effort, “you need governance over this to be clear about what can be used and what can’t be used for a given purpose. You are a custodian of data and when you aggregate that data your responsibilities increase exponentially.” Broadly, the looking-before-leaping paradigm is in full force here. As government IT leaders and their business-line peers seek to better manage access and identity in an emerging cloud-driven enterprise, they’ll need to be thoughtful not just about the how, but about the why behind their efforts. “There needs to be a strategy,” Kampman said. “What is the outcome going to be? The technology world can solve these problems but it needs to be done with a viewpoint toward how it will appear to the end user. You want to have control over the technologies but you also want all the stakeholders to have an opportunity to contribute toward governance.”



Google has more deep data knowledge than any company in the world, and it is no slouch in the discipline of design. It’s only natural that the company would combine this expertise. Initially, the audience for the new data design guidelines was Google itself, but much as it did for Material Design, the company decided to publicize these best practices and encourage others to adopt them—anyone from app developers to everyday people who are left wondering why their PowerPoint chart sucks. “We started doing this internally as a way to guide [employees] through the do’s and don’ts of chart creation,” Lima tells Fast Company. “After conducting various research studies and partnering with teams across the company, the do’s and don’ts evolved into a set of high-level principles that were strongly rooted in Google-wide tenets crucial to the company’s growth, brand, and culture. These principles are meant to be generative and not prescriptive. We hope they can help any chart creator during ideation and evaluation.” The six principles read something like an introductory data design course.


Blockchain Technology: Enabling Enterprise Innovation

The most satisfying finding from Deloitte is that business leaders are taking blockchain as seriously as we’d hoped. Deloitte found that “53 percent of respondents say that blockchain technology has become a critical priority for their organizations in 2019—a 10-point increase over last year.” That more than half of the respondents name blockchain technology as a critical priority is, in my eyes, the first tremor in what promises to be a substantial shake up of the business technology landscape. Accordingly, when the authors report that many leaders are focusing less on whether blockchain works (spoiler: it does) and more on what business models it might disrupt, they quote Deloitte Consulting LLP Principal Linda Pawczuk, Deloitte consulting leader for blockchain and cryptocurrency. She says, “We believe executives should no longer ask a single question about blockchain but, rather, a broad set of questions reflecting the role blockchain can play within their organization.”


Here Is A Look At Where Fintech Is Leading Us And Why


The business consultancy powerhouse reports that there is an abundance of fintech enterprises entering the market using novel business models and delivering fresh consumer offerings. Furthermore, says E&Y, the emerging fintech revolution is driving information sharing and the development of open-source Application Programming Interfaces (APIs), as well as recent technological breakthroughs in artificial intelligence (AI) and biometrics. Around the world, lawmakers are following the example of Europe by promoting open-access APIs. By doing so, legislators hope to enhance consumer choice by increasing competition between banks and fintech enterprises. For new fintech firms, open-source APIs streamline the launch of new products and services and decrease costs customarily used for research and development. New fintech banks that build their organization around a digital business model represent the fastest growing segment of startups nurtured by this movement.


Image Classification Using Neural Networks in .NET

Image classification is one of the most common use cases for non-recurrent neural networks. The basic concept is that a neural network is given an input image, and its input layer has the same number of neurons as the image has pixels (assuming the image is grayscale). Likewise, the network should have as many output neurons as there are classes to predict. The neural network could use convolutional layers, fully connected layers, or a combination of both. Convolutional networks are faster because they squish the input image, convolving it with multiple kernels to extract important features. More details on convolution can be found here. Convolution greatly reduces the size of the fully connected layers used to classify the image after a series of convolutions and pooling operations. Because a neural network with the usual activation functions can only accept inputs and produce outputs as values ranging from 0 to 1, feeding an image to the network requires some pre-processing on the input end to normalize the pixels into this range.
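The normalization step described above, plus a single fully connected layer, can be sketched in plain Python. The weights here are arbitrary placeholders chosen to show the data flow, not a trained model:

```python
import math

def normalize(pixels):
    # Map 8-bit grayscale values (0..255) into the 0..1 range the
    # network's input neurons expect.
    return [p / 255.0 for p in pixels]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dense(inputs, weights, biases):
    # One fully connected layer: each output neuron is a weighted sum
    # of every input, squashed into 0..1 by the activation function.
    return [
        sigmoid(sum(w * i for w, i in zip(row, inputs)) + b)
        for row, b in zip(weights, biases)
    ]

# A toy 4-pixel "image" scored against 2 classes by a single dense
# layer. A real classifier would learn the weights (and a CNN would
# convolve and pool before this fully connected stage).
image = [0, 64, 128, 255]
x = normalize(image)
weights = [[0.5, -0.2, 0.1, 0.8], [-0.3, 0.4, 0.7, -0.1]]
biases = [0.0, 0.1]
scores = dense(x, weights, biases)
print([round(s, 3) for s in scores])
```

The two outputs each land between 0 and 1, which is why the inputs must be normalized into the same range first: the layer's arithmetic assumes it.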


Fortune 100 passwords corporate secrets left exposed on unsecured Amazon S3 server


Some of the world’s biggest companies have had 750GB worth of their innermost secrets revealed on unsecured Amazon S3 buckets, available for anybody to download – no password required. The startling revelation came from researchers at UpGuard, who discovered three publicly accessible Amazon S3 buckets related to Attunity, a leading provider of data integration and big data management software solutions, on May 13th 2019. The fact that Attunity is at the centre of the security breach is a concern, simply because of its impressive list of customers. On its website, the company boasts that it counts more than 2,000 enterprises and half the Fortune 100 in its customer base. According to screenshots published on UpGuard’s blog, Fortune 100 companies such as Netflix, Ford, and TD Bank were amongst those who had their data recklessly exposed. For instance, the researchers discovered files containing the usernames and passwords of Netflix database systems, and internal Ford presentations.



NotPetya Retrospective

Each of the companies impacted by NotPetya (and WannaCry before it) had some degree of security protection in place—the usual stuff like firewalls, antivirus, and patch management. That defense obviously wasn’t perfect or the attack would have been thwarted, but a perfect defense costs $∞ and is therefore impractical. As we deal with the realities of an imperfect defense, it becomes necessary to choose between preventative and reactive measures. Security expert Bruce Schneier makes the point on his resilience tag: ‘Sometimes it makes more sense to spend money on mitigation than it does to spend it on prevention.’ An investment in mitigation can also pay off in all kinds of ways that have nothing to do with attacks: that change that was just accidentally made to production when it should have been in test—fixed in seconds, by reverting to the last snapshot. NotPetya is unlikely to keep its ‘most devastating cyber attack’ title for long. There will be another attack, and we should expect it to be worse. Moving away from a trusted network model to a zero-trust model is the most effective way to defend against such attacks. But, effort should also focus on measures that allow speedy recovery.


Managing Machine Learning Models The Uber Way

With access to the rich dataset coming from the cabs, drivers, and users, Uber has been investing in machine learning and artificial intelligence to enhance its business. Uber AI Labs consists of ML researchers and practitioners who translate the benefits of state-of-the-art machine learning techniques and advancements to Uber’s core business. From computer vision to conversational AI to sensing and perception, Uber has successfully infused ML and AI into its ride-sharing platform. Since 2017, Uber has been sharing the best practices of building, deploying, and managing machine learning models. Some of the internal tools and frameworks used at Uber are built on top of popular open source projects such as Spark, HDFS, Scikit-learn, NumPy, Pandas, TensorFlow and XGBoost. Let’s take a closer look at Uber’s projects in the ML domain. Michelangelo is a machine learning platform that standardized the workflows and tools across teams through an end-to-end system. It enabled developers and data scientists across the company to easily build and operate machine learning systems at scale.
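To make the "end-to-end" idea concrete, here is a deliberately tiny caricature of the train/register/serve loop such a platform standardizes. Everything below (the MeanModel class, the registry) is invented for illustration and bears no relation to Michelangelo's actual APIs:

```python
# Toy end-to-end flow in the spirit of an ML platform: train a model,
# register it under a version, then serve predictions from the registry.
# Real platforms add feature stores, monitoring, and scalable serving.

class MeanModel:
    """'Train' by memorizing the mean of the labels; 'predict' returns it."""
    def fit(self, labels):
        self.mean = sum(labels) / len(labels)
        return self

    def predict(self):
        return self.mean

registry = {}  # version string -> trained model

def deploy(version, model):
    # Registering under an explicit version makes rollbacks trivial.
    registry[version] = model

def serve(version):
    return registry[version].predict()

deploy("v1", MeanModel().fit([1.0, 2.0, 3.0]))
print(serve("v1"))  # 2.0
```

The value of a platform like Michelangelo is that every team shares this same shape of workflow, rather than each reinventing training, versioning, and serving separately.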


Are You Choosing Fintech—or Is Fintech Choosing You?


The type of financial technology solutions best suited for any particular institution varies tremendously. Some institutions may be looking to digitize or modernize processes from within; others may be looking to add on a single solution, such as mobile payments. Fintech solutions could also involve data aggregation or lead generation activities, as well as arrangements to buy assets, such as small business loans, from leading online lenders. Once the fintech solution is identified, each institution needs to identify the best strategy for itself to either compete or collaborate with emerging players—and capitalize on trends and capabilities to position itself with the most competitive advantage going forward. There are generally two broad strategies that financial institutions can pursue: invest in or build emerging technologies on your own, or buy, partner or network with fintech companies. ... Consider building in-house if there are sufficient internal resources, expertise and scale to innovate and customize unique capabilities. These strategies may be more appropriate for regional or larger banks than smaller community banks.



Quote for the day:


"There are some among the so-called elite who are overbearing and arrogant. I want to foster leaders, not elitists." - Daisaku Ikeda