September 30, 2016

Ransomware Spreads Through Weak Remote Desktop Credentials

According to Kaspersky Lab, the TeamXRat attackers perform brute-force attacks against internet-connected RDP servers and then manually install the Xpan ransomware on the hacked servers. "Connecting remote desktop servers directly to the Internet is not recommended and brute forcing them is nothing new; but without the proper controls in place to prevent or at least detect and respond to compromised machines, brute force RDP attacks are still relevant and something that cybercriminals enjoy," the Kaspersky researchers said in a blog post. "Once the server is compromised, the attacker manually disables the Antivirus product installed on the server and proceeds with the infection itself."


This Emerging Tech Company Has Put Asia's Tuna On The Blockchain

A British company has just completed a pilot in Indonesia, using blockchain to pioneer a new method of traceability in fishing, one that could stop many of the troubling practices within the illegal fishing industry, including slavery. Provenance used mobile technology, blockchain and smart tagging to track fish caught by fishermen with verified social sustainability claims. The pilot successfully tracked fish in Indonesia from January to June of 2016, and demonstrated not just another digital interface, but a way to track products and claims securely without the need for a centralized data management system.


Wealth and Asset Management Report Predicts Blockchain Use by 2021

The report predicts that by 2021, the convergence of smart technologies will have a huge impact on the wealth profession, unlocking the doors of global wealth across a diverse universe of investors. However, in a fast-paced marketplace, it is important for firms to understand their customers’ needs and behaviors, and make the necessary technology changes to meet their requirements. Bob Reynolds, President and CEO of Putnam Investments, commented in the report that “the business moves in cycles, and some are severe.” ... As a consequence, economist Dr. Nouriel Roubini said in the report that “mediocre growth and low interest rates have become the new normal.”


The Open Group Launches the O-BA Preliminary Standard Part I

Developed by The Open Group Governing Board Business Architecture Work Group, this is the first installment of a three-part standard. Combined, the three parts of the standard will explicitly address all aspects of a business architecture practice. Not only will it examine the holistic approach to modeling required, but also the way of working and thinking, as well as organizing and supporting. The standard clearly defines the systemic nature of transformations and the varying interests and goals of stakeholders, and prepares for consistent communication of business priorities and needs throughout the transformation lifecycle. It addresses a real need to solve structural challenges in enterprise and organizational transformations.


How is IoT Paving The Way for the Future?

As the tech world moves towards the cloud, it’s hard to imagine it functioning without IoT. And as we indulge ourselves in our devices and pour large amounts of data into this enormous mesh called IoT, it has swelled to gigantic proportions. Such a huge system demands an extensive amount of technology and skills in order to sustain itself. But do we have what it takes to monitor, maintain and secure IoT? According to Nick Jones, VP Distinguished Analyst at Gartner, “A recurring theme in the IoT space is the immaturity of technologies and services and of the vendors providing them. Architecting for this immaturity and managing the risk it creates will be a key challenge for organizations exploiting the IoT. In many technology areas, lack of skills will also pose significant challenges.”


Shutterstock CIO shares SDDC architecture lessons learned

With an SDDC, there are APIs for everything, so I can enable our software deployment for our product. They can have an API through Puppet and deploy through the infrastructure, and we can set up the key metrics, so if we're seeing load increase on our conservative platform, we can automatically expand that, or I can move that up to AWS. I've got some drivers from the leadership team: [They said] 'We want to move to AWS, we want to be faster.' Okay. I would argue that an SDDC makes you incredibly fast when you look at what we need to do as a company and how we need to service dev and products team -- it's that API-driven economy. They just want to be able to fire code out and know that that code gets deployed and we're operating and monitoring it and we're ensuring that stuff is staying up.


WhatsApp’s privacy U-turn on sharing data with Facebook draws more heat in Europe

In the PM interview, Denham was also pressed on whether the ICO is doing anything to stop data flowing now, while it probes the arrangement, but she said she thinks no data is yet flowing from UK WhatsApp users to Facebook. “We are told that data is not yet being shared — so I am hoping that there is a pause in the data-sharing, and some rethinking of the terms and the consent and what data is being shared,” she said. We’ve asked Facebook to confirm whether it is harvesting UK WhatsApp data at this point and will update this post with any response. Making a general statement about the data-sharing agreement earlier this month, Europe’s Article 29 Working Party ... asserted that: “Users should keep control of their data when Internet giants massively compile it.”


Why Automation Doubles IT Outsourcing Cost Savings

Automation is having the biggest impact on areas in which employees manage physical devices, such as network services. Most IT towers see an average 25 percent decrease in the number of resources required as a result of automation, but certain IT services experience a 50 percent headcount reduction, according to ISG. ISG found that network and voice costs are declining by 66 percent mostly due to the convergence of voice, video and data solutions built on highly standardized and virtualized capabilities, an environment ripe for leveraging automation. Service desk and end user support costs declined by 26 percent due to increased adoption of self help and remote support, the introduction of self-healing functionality, and significant automation of level one and two incidents.


The Top 10 AI And Machine Learning Use Cases Everyone Should Know About

Machine learning is a buzzword in the technology world right now, and for good reason: It represents a major step forward in how computers can learn. Very basically, a machine learning algorithm is given a “teaching set” of data, then asked to use that data to answer a question. For example, you might provide a computer a teaching set of photographs, some of which say, “this is a cat” and some of which say, “this is not a cat.” Then you could show the computer a series of new photos and it would begin to identify which photos were of cats. The program then continues to add to its teaching set. Every photo that it identifies — correctly or incorrectly — gets added to the teaching set, and the program effectively gets “smarter” and better at completing its task over time.
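The teach-classify-grow loop described above can be sketched as a toy nearest-neighbour classifier. The 2-D feature vectors and the habit of folding predictions straight back into the teaching set are illustrative simplifications, not how a production system would work:

```python
# Toy 1-nearest-neighbour "cat detector" over made-up 2-D feature vectors.
# Illustrative only: real systems extract features from pixels, and rarely
# fold unverified predictions straight back into the training data.

def distance(a, b):
    # Squared Euclidean distance between two feature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(teaching_set, features):
    # Answer with the label of the closest example in the teaching set.
    _, label = min(teaching_set, key=lambda ex: distance(ex[0], features))
    return label

teaching_set = [
    ((0.9, 0.8), "cat"),
    ((0.8, 0.9), "cat"),
    ((0.1, 0.2), "not a cat"),
]

new_photo = (0.85, 0.75)                 # features of an unlabelled photo
label = classify(teaching_set, new_photo)
teaching_set.append((new_photo, label))  # the teaching set keeps growing
print(label)                             # -> cat
```

Each classified item enlarging the teaching set is what makes the program "smarter" over time; in practice that growth is gated by human verification so mistakes do not compound.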


On Abstractions and For-Each Performance in C#

A common misconception is that the foreach loop in C# operates on IEnumerable. That is almost correct, but it actually operates on anything that looks like an IEnumerable. That means it must have a GetEnumerator method, and that method must return an object (or struct) with Current and MoveNext members, the latter of which returns a Boolean. This was necessary back in the .NET 1.x era, when we didn’t have generics or IEnumerable<T>. If you used a non-generic IEnumerable to loop over an integer array, it would have to allocate a new object for each item in the array (an operation known as boxing). As that would be ridiculously expensive, the designers decided C# would look for a custom enumerator first, and only if it couldn’t find one would it fall back on IEnumerable.GetEnumerator.
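The pattern-based lookup described above has a close analogue in Python, whose for loop also keys off a structural protocol rather than one specific interface. A rough sketch of the same duck-typing idea, with __iter__ playing the role of GetEnumerator and __next__ standing in for MoveNext plus Current:

```python
# A hand-rolled iterable: Python's for loop, like C#'s foreach, looks for
# a structural pattern (__iter__/__next__ here; GetEnumerator, MoveNext
# and Current in C#) rather than demanding one particular interface.

class CountdownEnumerator:
    def __init__(self, start):
        self.current = start

    def __iter__(self):
        return self

    def __next__(self):               # MoveNext + Current in one step
        if self.current == 0:
            raise StopIteration       # like MoveNext returning false
        value = self.current
        self.current -= 1
        return value

class Countdown:
    def __init__(self, start):
        self.start = start

    def __iter__(self):               # plays the role of GetEnumerator
        return CountdownEnumerator(self.start)

print(list(Countdown(3)))  # -> [3, 2, 1]
```

Countdown implements no standard abstract base class, yet for loops and list() accept it, just as C#'s foreach accepts any type exposing the enumerator pattern without implementing IEnumerable.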



Quote for the day:


"The Crystal Wind is the Storm, the Storm is Data, and the Data is Life." -- Daniel Keys Moran, The Players Litany


September 28, 2016

Data Governance: From Insight Comes Action

On average, a knowledge worker spends 36 percent of their time looking for information. If content creators can’t keep up with the amount of data across apps, how can we expect IT to protect it? This is the paradox of shadow IT: corporate intellectual property stored in many repositories must be secured and protected by IT professionals who were not consulted by users as they selected the apps they use to work and collaborate on this content. The conversation needs to shift from blocking unsanctioned productivity apps that manipulate this content to intelligently protecting the content at the source (i.e. the repository) itself, so users can leverage their favorite apps – even when not approved by IT – but only for the content they are allowed to access. Information governance is the industry term for having clear processes for users and IT on the secure handling of content.


The Role of Data in Digital Transformation

Disparate data sources are often a barrier to organizations looking to make use of their digital content to gain greater business insights. Only a third of those surveyed see themselves as extremely effective in managing and utilizing digital content and channels, and less than a third reported being “extremely confident” in their ability to integrate all data sources and applications. Many organizations have taken to storing data in data lakes, which, put simply, are archives that store a tremendous amount of raw data in its native format (whether structured, unstructured, or semi-structured) for as long as it needs to be held for analysis purposes. However, as the business ingests new types of poly-structured data, it can become increasingly difficult to make sense of it without accessing all of the data stored in the various sources.


73% of companies using vulnerable end-of-life networking devices

Old equipment that is no longer supported by the vendor who made it is vulnerable because newly discovered vulnerabilities and other problems are not being patched. That puts those companies at higher risk of security breaches, network outages and higher future replacement costs. "If it's an older device, there are vulnerabilities against it," he said. But companies often keep the older equipment around because it still works. "If something isn't having an issue, we tend to forget about it," Vigna said. "If there isn't pain, there isn't a reason to change a lot at companies." In addition, companies might not even be aware that some of their equipment is past its due date.


In The “Second Wave” Of Cloud Computing, Hybrid Cloud Is The Innovator’s Choice

Hybrid is the palette they’re painting with, best expressed by the analysts at Frost & Sullivan. “At their core, successful hybrid cloud strategies support the delivery of high-value applications and services to the business, while at the same time driving cost and inefficiency out of the IT infrastructure,” the study said. Fine, but how does adopting a hybrid cloud strategy support business success, particularly as we enter the era of cognitive computing? Successful organizations provide the answer. They aren’t adopting cloud technology for its own sake. Instead, they’re pursuing a business strategy that’s equally about transformation and industry disruption.


Improve application rollout planning with advanced options

One drawback of canary deployment to consider during application rollout planning is the time it takes to complete an update, as the new version is tested and phased gradually into production. This means the application owners must manage more than one version simultaneously, and it demands careful change and version management on the part of IT operations staff. The incremental increase in usage allows ample opportunity to gather load metrics, however, allowing production IT capacity planners to see how load demands change with the updated code. And the canary process provides a relatively safe and rapid rollback process if unintended consequences occur.
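One common way to implement the gradual phase-in described above is deterministic, percentage-based routing: hash each user into a stable bucket so the same user always lands on the same version, and widen (or roll back) the canary by changing a single number. A minimal sketch; the function and version names are hypothetical:

```python
# Minimal canary-routing sketch: route a fixed, deterministic percentage
# of users to the new version. The names and version labels are
# illustrative, not taken from any particular deployment tool.

import hashlib

def version_for(user_id: str, canary_percent: int) -> str:
    # Hash the user id into a stable bucket in [0, 100); users keep the
    # same bucket across requests, so rollout decisions are sticky.
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return "v2-canary" if bucket < canary_percent else "v1-stable"

# Widening the rollout (or rolling back) is just changing the percentage:
print(version_for("alice", 0))    # -> v1-stable (canary disabled)
print(version_for("alice", 100))  # -> v2-canary (fully rolled out)
```

Because the bucket is derived from the user id rather than chosen randomly per request, load metrics gathered at each percentage step reflect a consistent user population, which is what makes the incremental comparison meaningful.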


BTCPoint Creates 10,000 Bitcoin-Enabled ATMs Using Spanish Bank Network

To access the service, BTCPoint users enter the amount of money they'd like to withdraw from an ATM using the application and send bitcoin to a company address. Next, users receive an SMS and a PIN code, input the PIN code into an ATM on the network and withdraw their funds. The service today is one-directional, with users only being able to withdraw cash from units, though Lopera said BTCPoint is working on solutions that diversify its service. "We are focusing on changing bitcoin into cash, and we’re also talking with different credit card processors, who could enable the buy option so you can buy at a very low fee," he said. Lopera suggested BTCPoint is in talks with US and Latin American banks as a means to expand its service.


Enabling a digital future requires smart capital strategy

Executives recognize that digital transformation is impacting all aspects of their business — from the front end to the back. They also know that the competitive landscape is changing rapidly as barriers to entry are eroded. Digital is a continuous form of disruption to existing (or new) business models, products, services or experiences, enabled by data and technology across the enterprise. The key challenge for many companies will be a lack of sufficient capital to meet their digital ambition. Enabling a digital future requires smart capital allocation. Selecting the right strategic investments — organic or inorganic — offers routes to growth. The key question is: can companies build the capabilities required to succeed in this brave new world, or do they need to buy?


Why London will remain a global tech hub post-Brexit

London’s corporate tech base is impressive and should also help maintain its position. The presence of big global tech companies, such as Google, Amazon, Facebook, Microsoft and Yammer, provides a solid foundation for the tech sector, while fast-growing smaller companies, such as Skyscanner, Badoo, Hailo and Mind Candy, provide the drive for innovation for which London’s tech sector is renowned. Accelerator programmes that help the capital’s tech start-ups expand and succeed are another reason why the doomsayers over London’s tech future are wrong. Currently there are around 4,000 start-ups in the capital, and about 40 accelerator programmes, such as Seedcamp, TechStars, Wayra and Oxygen, help foster these young firms and buoy their growth.


3 Big Trends in Business Intelligence and Analytics

Lack of good, consistent quality data is cited as the number one challenge organizations face in realizing the full potential of analytics (A.T. Kearney’s “2015 LEAP Study - Leadership Excellence in Analytic Practice”). Excessive time and resources are needed to manipulate and “roll up” data before business analysts can start to use it for reports, analytics and insights. Often these challenges are compounded when analysts create workarounds that drive “shadow” databases and ad hoc data management processes that undermine confidence in the data. Strong business intelligence can become the data syndication traffic cop and data clearing house for enterprises that need to make better, faster decisions using good quality data and insightful analytics.


Traffic Data Monitoring Using IoT, Kafka and Spark Streaming

In order to process the data generated by IoT-connected vehicles, the data is streamed to big data processors located in the cloud or in data centres. An IoT-connected vehicle provides real-time information about the vehicle, such as its speed, fuel level, route name, latitude and longitude. This information can be analysed, and the data can be extracted and transformed into a final result that can be sent back to the vehicle or to a monitoring dashboard. For example, using the information collected for different vehicles, we can analyse and monitor the traffic on a particular route. In this article, we’ll use Apache Spark to analyse and process IoT-connected vehicles’ data and send the processed data to a real-time traffic monitoring dashboard.
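The per-route monitoring described above boils down to a group-and-aggregate step. A cluster-free sketch of that logic in plain Python, with illustrative record fields; in the article's pipeline the same reduction would run over each Spark Streaming micro-batch arriving from Kafka:

```python
# Stand-in for the streaming aggregation described above: group incoming
# vehicle records by route and compute vehicle counts and average speed.
# The record fields are illustrative; in the real pipeline each Kafka
# micro-batch would be reduced the same way by Spark Streaming.

from collections import defaultdict

def aggregate_by_route(records):
    totals = defaultdict(lambda: {"vehicles": 0, "speed_sum": 0.0})
    for rec in records:
        t = totals[rec["route"]]
        t["vehicles"] += 1
        t["speed_sum"] += rec["speed"]
    # Collapse running sums into dashboard-ready per-route summaries.
    return {
        route: {"vehicles": t["vehicles"],
                "avg_speed": t["speed_sum"] / t["vehicles"]}
        for route, t in totals.items()
    }

batch = [
    {"route": "Route-37", "speed": 60.0, "fuel": 0.8},
    {"route": "Route-37", "speed": 40.0, "fuel": 0.5},
    {"route": "Route-82", "speed": 80.0, "fuel": 0.9},
]
print(aggregate_by_route(batch))
# -> {'Route-37': {'vehicles': 2, 'avg_speed': 50.0},
#     'Route-82': {'vehicles': 1, 'avg_speed': 80.0}}
```

In Spark the same shape would be expressed as a keyed reduce over the stream (key by route, then count and average), with the results pushed to the monitoring dashboard at each batch interval.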



Quote for the day:


"Any sufficiently advanced technology is indistinguishable from magic." -- Arthur C. Clarke


September 27, 2016

Why Hire a Corporate Lawyer When a Robot Will Do?

Well-tuned search engines could save people a lot of time and suffering. Luminance promises to increase the efficiency of contract review by at least 50 percent. Kira Systems claims a time reduction of as much as 90 percent. If Bayer’s legal team had included robot lawyers, maybe they could have completed due diligence for the Monsanto deal in days. So will the associate attorney, among the least satisfying jobs in the U.S., become a thing of the past? Not necessarily. Even though automated-review tools are great for organizing documents into actionable information, intelligent humans are required to step in when the computer encounters ambiguous language or unexpected cases. It’s like how self-driving cars still have human supervisors in the vehicle to deal with rogue squirrels or trolley problems.


One Fantastic Keyboard For Your Computer, Phone And Tablet

This full-size, six row keyboard features a complete set of function keys and a number pad. Certain keys—such as Alt and Ctrl—will automatically change functions depending on which operating system you’re working in. Running the length of the keyboard is a rubberized tray that holds your mobile devices at the correct viewing angle. You can link up to three devices at a time to the K780 via Bluetooth. Those without Bluetooth can connect via Logitech’s Unifying USB dongle. At the top left of the keyboard are three white buttons used to pair your devices. Switching between paired gadgets is as simple as tapping the corresponding button.


How to succeed with hybrid cloud application integration

The biggest mistake you can make in hybrid cloud integration is overspecializing. You should establish a common network connection model across your entire hybrid cloud and then work to define a standardized hosting model to deploy applications/components. The connection model issue can only be addressed by creating a virtual private network that can host all of the applications and components. Enterprises are increasingly looking to adopt software-defined or virtual networks as their connectivity core, and if the proper software-defined network or software-defined wide area network model is adopted, it can connect everything, whether in the cloud or the data center. There's no substitute for open uniform connectivity, so it's critical to get this right, and enterprises are recognizing that the basic cloud networking tools are best used to supplement this enterprise virtual network, not create it.


Government lawyers don’t understand the Internet. That’s a problem.

Today, cyber, data and privacy questions lie at the core of numerous corporate and government cases, and there aren’t anywhere near enough practicing lawyers who can adequately understand the complex issues involved, let alone who can sufficiently explain them in court or advise investigators on how to build a successful case. “This is a problem that pervades all of the national security apparatus,” says Alvaro Bedoya, who previously worked as the chief counsel to the Senate Judiciary Committee’s subcommittee on privacy, technology and the law, and now leads Georgetown Law’s Center on Privacy & Technology. “You don’t have a pipeline of lawyers right now who can read code.”


Your users have porous passwords? Blame yourself, IT.

Maybe IT needs to tone down its security awareness efforts. New research by psychologists into password strength delivered the non-intuitive conclusion that users who are well briefed on the severity of security threats will not, as IT had hoped, create stronger passwords to better protect themselves. They actually tend to create much weaker passwords because the briefings make them feel helpless, as if any efforts to defend against these threats are pointless. The research, from a Montclair State University study — detailed here in a story from The Atlantic — suggests that IT staffers need to make sure that they emphasize how powerful a defense passwords, PINs and secure phrases can be in defending against threats, at least until we are able to deploy better authenticators.


Psychology Is the Key to Detecting Internal Cyberthreats

The key to identifying and addressing at-risk employees before a breach or incident occurs is to focus as much on understanding and anticipating human behavior as on shoring up technological defenses. The best way to do this systematically is by analyzing employees’ language continuously and in real time, in a way that still respects privacy. And the data is readily available to do so, because email, chat, and texts are now one of the most common methods of communication in business. ... The opportunity for using psychological content analysis in the corporate workplace is vast. Not only can leaders utilize this to intervene before a security breach, but leaders can also use insights to support other efforts to build a healthier culture and develop the organization’s talent.


Mood of the Boardroom: Hacking a serious business

The fact that cybersecurity now ranks alongside what have long been seen as the world's greatest challenges is telling. A real estate director said, "Both terrorism and cybersecurity are always cause for concern of the highest level, as we do not know when and where it will next hit." In light of the increasing acknowledgement of the risk, there are opportunities for the businesses that help address it. Kordia acquired Aura Information Security, a leading cybersecurity company, for just over $10m in late 2015. Bartlett sees addressing cybersecurity threats as a potential selling point for New Zealand. "We are small enough to make our little country a stand-out example of how to get it right," he said. "If we can, our cyber-safe brand will be as important as, and more credible than, 100 per cent Pure New Zealand."


Companies say IoT matters but vary on how to secure it

Overall, their biggest challenges in deploying IoT revolved around security and privacy. But most are taking an “ad hoc” approach to security, doing things like securing individual devices using firewalls. However, 23 percent said they are integrating security processes into their IoT workflow. No single approach has won out yet, MacGillivray said. Finding people with the right job skills is another thing that makes IoT difficult, respondents said. That's a pain point especially in terms of crunching all the data that flows in from the new systems. Also, most enterprises haven’t taken advantage of edge computing, which may be one of the most important parts of IoT, according to IDC. A majority of organizations that have deployed IoT devices just use them to collect data and send it to the cloud or a data center for processing.


How To Mitigate Hackers Who Farm Their Victims

The farming is more sophisticated now, with advanced Command and Control (C&C) servers that attackers use to make system changes remotely, multiple backdoors in multiple systems, bogus accounts they create to sell or reuse, and sensors they leave behind to identify and harvest specific data, says Inskeep. Command and control servers work by receiving communications from malware-infected systems that call out to the internet via outbound network traffic. This works because most network security is geared to defend against what is coming in, not what is going out. Hackers can spread large numbers of Trojans into different kinds of systems because they can pair these backdoors with many different kinds and pieces of legitimate software, from OS and application updates to games.


Is the internet of things the new DDoS attack weapon?

It’s been posited that attackers are leveraging internet of things (IoT) devices to grow their botnet capacity to this new level, which in itself is troublesome, but first, the backstory. Krebs is one of the most prolific cybersecurity-focused investigative journalists and has broken a number of high-profile stories and been responsible for numerous arrests over the years. As a result of his intrepid work, Krebs has come into direct contact with plenty of criminal gangs and met the perpetrators of many of the world’s most notorious cybercrime fraternities face to face. Speculation that this is why his site was attacked has stemmed from his recent coverage of an Israeli online DDoS attack service called vDOS – still available to read via Google’s webcache.



Quote for the day:


"Optimism is the faith that leads to achievement. Nothing can be done without hope or confidence." -- Helen Keller


September 26, 2016

Why CIOs are embracing SaaS ID management

Shelving several legacy ID management products with one single sign-on tool is a common business case for Okta, as well as rival solutions from Centrify, OneLogin and Ping Identity, says Gartner analyst Gregg Kreizman. Such solutions also compensate for companies' inability to retain skilled IT workers schooled in traditional ID management. In 2016, Experian CTO Joe Manna began testing Okta for a mobile app that enables consumers to access their credit reports. Manna told Libenson both the software and company were great to work with, so Libenson instructed his staff to use Okta to manage Experian identities worldwide across cloud, on-premises and mobile applications, including authentication into its core Oracle ERP system.


IT operations automation requires code-wielding sys admins

Once, IT delays were caused by waiting for deliveries and hardware installation; today, an administrator who is taking too much time to deploy VMs is the problem. Using a graphical user interface (GUI) for IT tasks simply takes too much time. Administrators are asked to manage hundreds to thousands of VMs thanks to the explosive growth in virtualization and the VM sprawl that accompanied it. This has led to growth in automation to help admins cope with these tasks and duties. While some level of IT automation has existed for years, it was often smaller scripts and batch jobs that took care of a few stand-alone tasks. Today, automation has become a critical part of data center operations as our applications scale out while staffing stays the same.


Blockchain-Based Smart Identity Will Free World of Paper ID’s

An interesting facet of the Deloitte project is that Smart Identity as a protocol is portable across different Blockchains, while the current version of the prototype has been using the Ethereum Blockchain. We also asked Deloitte who is going to hold the actual data, and we were told that there are a number of trusted data repositories available, but there is also scope for using a hybrid model with a network of trusted custodian services as well as distributed data services in the future. ... In order to migrate from the current system of paper-based identity that we have today, there will invariably be the need for all parties involved, like governments, corporations and individuals, to work in tandem. Cointelegraph asked Shelkovnikov about the issue of provenance of identification and how it would all work.


The Emergency Alert System: Failure IS an option

While many reported a complete failure of the EAS, the FCC issued a report showing that the failure rate was close to 18 percent. Certainly not perfect, but when coupled with social media and other forms of communication that would likely have been deployed on an individual basis, I believe it's safe to assume the word would have gotten out in a timely fashion. The biggest problem I saw with the test was significant technology inconsistencies with what people heard and saw. Some stations showed the textual message but no audio and no alert tones, certainly a problem for someone who was blind, while other stations broadcast the audio for the emergency messages but did not show the text for those messages, leaving a person who is deaf or hard of hearing completely unaware of the situation at hand.


Why Amazon can't possibly be the only cloud winner

Enterprises have many workloads. Some workloads run best on one specific public cloud or another. For example, we've found that Google has far and away the best internal network performance. So, a network-intensive workload should probably run there. Microsoft has GPUs available in Azure for video rendering and HPC workloads. Amazon does a very good job at storing files and objects and distributing them globally for fast access. Enterprises also have troves of legacy data and applications. The reality of IT is that 90% or more of the budget goes to maintenance. That leaves very little for new development. Enterprises choose between migrating legacy applications to the cloud and writing new applications.


Robotic process automation technology gets to work

RPA, or robotic process automation, has a sexy ring to it these days, especially in the C-suite and company boardrooms. And why not? There's a lot about this emerging technology to pique a boss' interest. Robotic process automation technology -- defined in simple terms as software that automates other software -- promises to improve efficiency, boost productivity and save money by helping with or entirely replacing the manual, routine and often error-prone digital processing jobs still done with human labor at many companies. ... The software robots of RPA ilk -- virtual workers, if you please -- interact with computer systems the way most employees do, at the presentation layer through the user interface, requiring minimal code-based programming or deep back-end systems integration.


802.11ad is the fastest Wi-Fi that you might not ever use

“To date, the Wi-Gig products that are shipping in the market have been largely confined to peer-to-peer applications. Once infrastructure mode is widely available on Wi-Gig capable clients, enterprise radio vendors will rapidly follow,” he said. And while Qualcomm’s Grodzinsky hinted at major product releases coming within the next couple of weeks, nobody is particularly clear on the timeframe for widespread 802.11ad adoption, whether as a traditional Wi-Fi technology or, as Forrester’s Kindness suggests, as a wireless backhaul carrier. For the enterprise IT department, Kindness argues, it’ll be three years before decision-makers really need to get their arms around 802.11ad. “It takes about a year to two years to become mainstream, because it doesn’t have product support, you have to understand where you’re going to use it,” he said.


Biometric Skimmers Pose Emerging Threat To ATMs

The devices apparently act just like regular skimmers do in stealing payment card data. They are designed to connect physically to a target ATM and to steal fingerprint data that users may be required to input while authenticating their identity with the device. The stolen data can then be used to authorize other fraudulent transactions, the researchers say. Available evidence suggests that the first wave of biometric skimmer machines, which surfaced last September, were buggy and had to contend with multiple issues during initial tests in the European Union. The biggest hurdle apparently was that the GSM modules the underground sellers used in their skimmers for transferring stolen biometric data were too slow to handle large data loads.


Data Interchange Flexibility

JSON and XML are two complementary standards, each suited to different situations. JSON’s popularity is in no small part owing to the fact that it is built into JavaScript. That is, JavaScript can read JSON directly without any additional parsing. This is a huge convenience for JavaScript developers. Given that it is also less verbose than XML, it is often the logical choice for sending transient data between the client and server layers within many web applications. Whilst more verbose, XML offers many other advantages. For example, XML schemas allow one to describe, extend, communicate and validate XML datasets. XSLT allows for easy transformation of XML from one format into another, and XPath/XFormat engines allow for deep querying of native XML files.
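A short illustration of the trade-off, using Python's standard library: the same record parsed from JSON (a direct mapping onto native structures) and queried from XML via ElementTree's XPath subset. The order/item fields are made up for the example:

```python
# The same record as JSON and as XML. json.loads maps JSON straight onto
# native dicts and lists, while ElementTree supports XPath-style queries
# over the XML form. The order/item structure is illustrative.

import json
import xml.etree.ElementTree as ET

json_doc = '{"order": {"id": 42, "items": [{"sku": "A1"}, {"sku": "B2"}]}}'
data = json.loads(json_doc)           # direct mapping, no schema needed
print(data["order"]["id"])            # -> 42

xml_doc = """
<order id="42">
  <item sku="A1"/>
  <item sku="B2"/>
</order>
"""
root = ET.fromstring(xml_doc)
skus = [item.get("sku") for item in root.findall("./item")]  # XPath subset
print(skus)                           # -> ['A1', 'B2']
```

The JSON side needs no navigation API at all, which is why it wins for transient client-server payloads; the XML side pays for its verbosity with richer tooling (schemas, XSLT, fuller XPath engines than the subset ElementTree implements).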


The Internet of Things is broken. We need better security to fix it

Even if individual devices are designed with device-level security, an interconnected architecture may still expose vulnerabilities. Electronic devices in general have accessible interfaces such as JTAG ports and MAC addresses that provide an increased 'attack surface' and make devices vulnerable to invasive attacks that reverse engineer security. Also, devices invariably share components and firmware across product lines, allowing a vulnerability detected in one system to be exploited in another one using the same chipset. Most IoT systems also have field sensors that can be subject to physical security issues: critical sensors can malfunction if subjected to higher operating temperatures or voltage ranges. They can simply be vandalised, or even replaced with rogue devices connected to a cybercriminal’s botnet.



Quote for the day:


"Give your past a Teflon coating. Be honest with yourself and others making sure you’ve fully let go of the past." -- Karen Keller


September 25, 2016

By starting with a contract or test case that is well understood to incrementally test a feature requirement, you ensure that as a small iterative unit of work completes, it meets that contract in such a way that it is releasable, deployable software. Exploratory testing tools for new feature development come into play, as do coverage tools that send data showing anomalies between releases back to the quality process. Coveralls.io is a great tool that’s easy to configure and has wonderful visualizations for the most popular languages, while Jenkins has a highly customizable dashboard. ... Technology can’t solve all problems, however, so developers and testers will need to change some of their workflows to master CT. These concepts are closely linked to the Agile and DevOps practices you are probably already using, so adapting testing in this way should not be a huge shift.


Google Allo: Don't use it, says Edward Snowden

Allo does support end-to-end encryption, which should make it difficult for anyone but the recipient and sender to view the contents of messages; however, Google was criticized by Snowden and other privacy advocates for leaving it off by default. Allo relies on the encryption protocol used by Signal, which Snowden has vouched for as a private messaging app, but in Allo it is only active when users are in Incognito Mode. "We've given users transparency and control over their data in Google Allo. And our approach is simple -- your chat history is saved for you until you choose to delete it. You can delete single messages or entire conversations in Allo," Google said in a statement to TechCrunch.


Investing in AI offers more rewards than risks

While some may argue it’s impossible to predict whether the risks of AI applications to business are greater than the rewards (or vice versa), analysts predict that by 2020, 5 percent of all economic transactions will be handled by autonomous software agents. The future of AI depends on companies willing to take the plunge and invest, no matter the challenge, to research the technology and fund its continued development. Some are even doing it by accident, like the company that paid a programmer more than half a million dollars over six years, only to learn he automated his own job. Many of the AI advancements are coming from the military. The U.S. government alone has requested $4.6 billion in drone funding for next year, as automated drones are set to replace the current manned drones used in the field.


Crowdsourcing Data Governance

This variation of context is why the right operating model setup is so important for any data governance initiative, especially the ones that are just getting started. A successful data governance initiative will bring change, and so time becomes yet another dimension for the context. I’ve seen it happen many times: organizations launch with a best-in-class operating model to drive their stewardship. They gain adoption, and the resulting change makes the original operating model obsolete, or rather stretches it to the limit. This is why I am absolutely convinced that a data governance platform that aims to be successful needs a capability for operating model configuration: your roles, responsibilities, workflows, dashboards, views, use cases, and more.


Bossie Awards 2016: The best open source application development tools

For years and years, we’ve been building applications that collect data from the users and serve it back to them. We’re finally starting to do something with that data. Along with the best open source tools for building web apps, native apps, native mobile apps, and robotics and IoT apps, this year’s Bossie winners in application development include top projects for data analysis, statistical computing, machine learning, and deep learning. After all, if our applications can be reactive, responsive, and even “ambitious,” they can also be intelligent.


Is this the age of Big OLAP?

What has dogged OLAP, though, is its scalability. Most OLAP servers run on single, albeit beefy, servers, which limits the parallelism that can be achieved and therefore imposes de facto limits on data volumes. Customers who hit these scalability ceilings may contemplate using Big Data technologies, like Hadoop and Spark, but those tend not to employ the dimensional paradigm to which OLAP users are accustomed. What to do? Well, a few vendors have decided to take Hadoop and Spark, and leverage them as platforms on which big OLAP cubes can be built and run. ... Their approach has been to let people in those enterprises work in the OLAP environments they are comfortable with and, at the same time, make use of their Hadoop clusters.


Deep Learning in a Nutshell: Reinforcement Learning

Reinforcement learning is about positive and negative rewards (punishment or pain) and learning to choose the actions which yield the best cumulative reward. To find these actions, it’s useful to first think about the most valuable states in our current environment. For example, on a racetrack the finish line is the most valuable state, that is, the state which is most rewarding, and the states which are on the racetrack are more valuable than states that are off-track. Once we have determined which states are valuable we can assign “rewards” to various states. For example, negative rewards for all states where the car’s position is off-track; a positive reward for completing a lap; a positive reward when the car beats its current best lap time; and so on.
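The racetrack example can be made concrete with a minimal sketch of reward assignment and discounted cumulative reward. The states, reward values and discount factor below are invented for illustration; a real RL setup would learn values rather than hard-code them.

```python
# Toy reward assignment for the racetrack example in the text.
REWARDS = {
    "off_track": -1.0,   # negative reward: car left the track
    "on_track":   0.0,   # neutral: still racing
    "lap_done":  +1.0,   # positive reward: completed a lap
}

def cumulative_reward(states, gamma=0.9):
    """Discounted sum of rewards along a trajectory of visited states."""
    return sum(REWARDS[s] * gamma**t for t, s in enumerate(states))

good_run = ["on_track", "on_track", "lap_done"]
bad_run = ["on_track", "off_track", "off_track"]

# An agent comparing trajectories would prefer the one with the higher
# cumulative reward.
print(cumulative_reward(good_run) > cumulative_reward(bad_run))  # True
```

The discount factor gamma makes rewards received sooner count for more, which is what pushes an agent toward the most valuable states quickly.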


NYDFS Proposed Cybersecurity Regulation for Financial Services Companies

The goal of the Proposed Regulation is to secure “Nonpublic Information” from misuse, disruption and unauthorized access, and as noted above, such information is defined broadly. It includes not only competitively sensitive information and intellectual property, but also numerous categories of information that a Covered Entity receives from or about consumers, including information considered nonpublic personal information under the GLBA Privacy Rule. ... When something goes wrong, the Covered Entity must report it to the Superintendent. Specifically, any attempt or attack “that has a reasonable likelihood of materially affecting the normal operation of the Covered Entity or that affects Nonpublic Information” must be reported to the Superintendent within 72 hours after the Covered Entity becomes aware of the event.


Big Data Processing with Apache Spark - Part 5: Spark ML Data Pipelines

Machine learning pipelines are used for the creation, tuning, and inspection of machine learning workflow programs. ML pipelines help us focus more on the big data requirements and machine learning tasks in our projects instead of spending time and effort on the infrastructure and distributed computing areas. They also help us with the exploratory stages of machine learning problems where we need to develop iterations of features and model combinations. Machine Learning (ML) workflows often involve a sequence of processing and learning stages. A machine learning data pipeline is specified as a sequence of stages where each stage is either a Transformer or an Estimator component. These stages are executed in order, and the input data is transformed as it passes through each stage in the pipeline.
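The Transformer/Estimator pattern can be sketched in plain Python. This mimics the shape of the abstraction only; real Spark ML pipelines use pyspark.ml.Pipeline over distributed DataFrames, and the class names below are simplified stand-ins.

```python
# Plain-Python sketch of the Transformer/Estimator pipeline pattern.
class Transformer:
    def transform(self, data):
        raise NotImplementedError

class Tokenizer(Transformer):
    def transform(self, data):
        return [row.lower().split() for row in data]

class Estimator:
    """An Estimator is fit on data and produces a Transformer (a model)."""
    def fit(self, data):
        vocab = sorted({tok for row in data for tok in row})
        return CountVectorizerModel(vocab)

class CountVectorizerModel(Transformer):
    def __init__(self, vocab):
        self.vocab = vocab
    def transform(self, data):
        return [[row.count(t) for t in self.vocab] for row in data]

# Stages run in order; each stage's output feeds the next.
raw = ["Spark ML pipelines", "ML pipelines chain stages"]
tokens = Tokenizer().transform(raw)
model = Estimator().fit(tokens)  # fitting an Estimator yields a model
vectors = model.transform(tokens)
print(vectors)
```

The key point carried over from Spark ML is the split: Transformers map data to data, Estimators map data to a fitted Transformer, and a pipeline is just an ordered chain of both.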


How fintech startups can disrupt the financial services industry

Successful fintech startups will embrace “co-opetition” and find ways to engage with the existing ecosystem of established players. For example, PayPal partners with Wells Fargo for merchant acquisition. Some business lending platforms enable banks to participate as credit providers on their platforms. Conversely, some banks partner with P2P lending platforms to provide credit to those borrowers who would ... Fintech startups are flying under the regulatory radar so far. However, that may change in the near future. Regulatory tolerance for lapses on issues such as know your customer, compliance, and credit-related disparate impact will be low. The experience of the microfinance industry in many developing countries in the past is a good indicator of the high impact of regulation on an unregulated industry.



Quote for the day:


"It is better to be defeated standing for a high principle than to run by committing subterfuge." -- Grover Cleveland


September 24, 2016

Implementing DevOps starts with rethinking deeply-rooted processes

DevOps expands enterprise agile from product management and development all the way to IT operations. We’ve done enterprise agile. We now create apps in a way that focuses on the client and maximizes throughput and quality. But that hits a barrier when you get to traditional IT operations. The changes that will be taking place over the next five years create the ability to take enterprise agile into IT operations. It means improved platforms, improved automation, improved collaboration across development and ops. It means a constant flow of value into production. ... Rather than throwing large releases over the wall to operations, how can we bring teams together to identify tooling, processes, and practices that can be deployed to automate provisioning and production deployment fully? This significantly improves time to market by increasing throughput and quality.


I got 99 data stores and integrating them ain't fun

The concept has been around for a while and is used by solutions like Oracle Big Data. Its biggest issues revolve around having to develop and/or rely on custom solutions for communication and data modeling, making it hard to scale beyond point-to-point integration. Could these issues be addressed? Data integration relies on mappings between a mediated schema and schemata of original sources, and transforming queries to match original sources schema. Mediated schemata don't have to be developed from scratch -- they can be readily reused from a pool of curated Linked Data vocabularies. Vocabularies can be mixed/matched and extended/modified to suit custom needs, balancing reuse and adaptability.
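The mediated-schema idea can be sketched in a few lines: each source keeps its own field names, and a mapping lifts every record into one shared vocabulary. The source names, field names and mappings below are invented for illustration.

```python
# Per-source mappings from local field names to a shared mediated vocabulary.
MAPPINGS = {
    "crm":   {"full_name": "name", "mail": "email"},
    "sales": {"customer": "name", "contact_email": "email"},
}

def to_mediated(source, record):
    """Rename a source record's fields into the mediated schema,
    dropping fields the mapping does not cover."""
    mapping = MAPPINGS[source]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

# Two differently-shaped records become comparable after mediation.
a = to_mediated("crm", {"full_name": "Ada", "mail": "ada@example.org"})
b = to_mediated("sales", {"customer": "Ada", "contact_email": "ada@example.org"})
print(a == b)  # True
```

In the Linked Data approach the text describes, the mediated vocabulary would be reused from a curated pool rather than written from scratch, but the mapping step is the same in spirit.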


The next target for phishing and fraud: ChatOps

Like many cloud platforms, chat tools allow external organizations to leverage internal APIs to extend functionality, ranging from scheduling assistants to travel booking tools to various engineering and product management systems. Overall, this extensibility represents a core strength of these systems. From a security perspective, however, they can represent data exfiltration opportunities that must be addressed. First, not every third party company is a good steward of the data they have access to; corporate policies for vendor review and acceptable use should apply to chat programs in the same way that they do for any system. As with the GSA example, relying on users to understand the technological limitations and risks around connecting technologies is not a strong strategy.


Why Fintech has made finance courses obsolete

Today, it’s a lot more complicated because we don’t know what will be the jobs of the next 10 or 20 years. So it’s a lot harder to be passive, and I think you have to be a lot more active. As a piece of advice, if I were 20, there are two things I would do. The first thing, from an education standpoint, you have to learn something, but you have to learn something you like. Just because you want to learn fintech doesn’t mean you have to code. So you really have to learn what you like on the education standpoint, and the second thing, I think it’s about the mindset. It means that to avoid being passive but be very active, for me the best quality to have today is being able to think like an entrepreneur. You don’t necessarily need to be an entrepreneur, you may work for a big company, but you have to think like an entrepreneur.


Computers could develop consciousness and may need 'human' rights, says Oxford professor

Advances in artificial intelligence could lead to computers and smartphones developing consciousness and they may need to be given ‘human’ rights, an expert has claimed. Marcus du Sautoy, who took over from Richard Dawkins as Professor for the Public Understanding of Science at Oxford University in 2008, said it was now possible to measure consciousness and, in the future, technology could be deemed to be ‘alive.’ Most scientists believe that computers are close to getting to a point where they begin to develop their own intelligence and no longer need to be programmed, an event dubbed the ‘technological singularity.’


Why CMOs need to care about security like CIOs

While some marketers will view consumer grade file and sync platforms such as Dropbox or WeTransfer as a swift business panacea, the risk that these platforms open up for data breaches with uncontrolled sharing are high. Today, the ‘data perimeter’ – the boundary that safeguards an organization’s sensitive data – has shifted considerably. This is a result of a more mobilized workforce and greater collaboration with external partners. In the past, when most workers only accessed company information from within the four walls of the business and data was saved on shared drives from PCs located in the enterprise, the perimeter was the firewall. Since the advent of cloud computing, this has changed. In today’s connected world, the data perimeter needs to reside within individual documents, in addition to within the IT infrastructure.


Three Industrial Internet of Things (IIoT) myths that need busting

Unlike consumer markets where standardisation - formal or by market dominance - is key to success, IIoT standardisation won’t be a concern for decades. Sure, there are multiple emerging standardisation initiatives in IIoT and yes, it’s not yet possible to know which will grow or be marginalised. But it doesn’t matter. Unlike consumer markets where new standards for, say, NFC chips in smartphones can roll out and get near full market presence in the few years it takes for people to replace their phones, industries are run on equipment that is anything from years to several decades old. This equipment has been provided by tens, or hundreds of different suppliers.


It’s time for drivers to learn new skills

Experts predict that 6% of all jobs in the US will be gone by 2021 due to automation. The former CEO of McDonald’s sees replacing the whole company’s restaurant workers as a simple question of economics. From software and legal help to sports reports and parcel delivery, there are few jobs that won’t see some sort of reduction in the workforce. But the world’s drivers could be most at risk. Despite continued fear from the public, the money men at taxi, logistics, and delivery companies will have no such fear about deploying autonomous vehicles on the road. No wages, greater fuel efficiency, no worries about shift work or rest stops. It might be cold, but in terms of business sense it’s hard logic to argue with.


How Li-Fi Will Disrupt Data Centres’ ‘Very Ugly Radio Environment’

“Most of that is using fiber optic cables and not free space transmission (which Li-Fi seems to be). The total capacity in a data centre far exceeds anything that could be done by a shared system (same is true of radio versus use of copper cables).” He continued to say that there is also a need for shared management communications within the data centre (things like DCIM where someone wants “out of band” communications with the hardware, for example). Giving an example of recently carried out research around the use of Li-Fi in the data centre, Christy mentioned Microsoft’s innovative work where the tech giant complemented the “wired” network with a broadcast network (in the data centre) that could be implemented either with radio or with light transmission bounced off the ceiling.


Industrial IoT is inching toward a consensus on security

Immature security is the biggest thing delaying adoption of industrial IoT, said Jesus Molina, co-chair of IIC’s security working group, in an interview. Components commonly used in enterprise IT security, like identity and root of trust, don't really exist yet in IoT, he said. There are several components to making anything in IoT trustworthy, the framework says: safety, reliability, resilience, security and privacy. These issues come up because industrial IoT connects so many components, including things like sensors and actuators at the edge of an enterprise, that didn’t exist or weren’t connected to the internet up until now. Those edge connections can open up dangerous vulnerabilities, because they’re often designed to carry some of the most sensitive information in an organization.



Quote for the day:


"He who rejects change is the architect of decay." -- Harold Wilson


September 23, 2016

Infrastructure as code: What does it mean and why does it matter?

Code forms the backbone of this approach, giving rise to the term infrastructure as code (IaC), which, in simple terms, means code that helps in provisioning systems out onto an IT platform.  Today, IaC has grown to be full-function and highly flexible, and there are several variants to consider, including declarative, imperative and intelligent IaC. The declarative approach creates a required state and adapts the target infrastructure to meet those conditions, while the imperative version creates a target environment based on hard definitions set out within the script. The intelligent state, meanwhile, takes into account other pre-existing workloads within the target environment, and reports back to a system administrator about any problems it encounters.
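The declarative variant can be illustrated with a small sketch: describe the desired state, observe the actual state, and derive only the actions needed to reconcile the two. The resource names and actions below are invented; real tools such as Terraform or Puppet do this with far more depth.

```python
# Desired state (what the IaC definition declares) versus observed state
# (what actually exists on the target platform). Illustrative names only.
desired = {"web-01": "running", "web-02": "running", "db-01": "running"}
observed = {"web-01": "running", "web-02": "stopped"}  # db-01 does not exist yet

def plan(desired, observed):
    """Compute the minimal set of actions to reach the desired state."""
    actions = []
    for name, state in desired.items():
        if name not in observed:
            actions.append(("create", name))
        elif observed[name] != state:
            actions.append(("restart", name))
    return actions

print(plan(desired, observed))  # [('restart', 'web-02'), ('create', 'db-01')]
```

An imperative script, by contrast, would hard-code the exact sequence of provisioning commands regardless of what already exists, which is the distinction the paragraph draws.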


Open source technology gains steam in data center, but challenges loom

When deploying or running open source technology, the lack of professional support can leave IT scrambling. Even after combing through search engine results and discussion boards, admins still might not have an answer for an urgent question. Professional support is lacking with open source tools, and although some vendors offer support services, it often comes at a cost. When a primary driver to switch to open source is the financial aspect, spending money on the necessary support can create a dilemma. Some larger companies have the resources -- both from a financial and staffing standpoint -- to support open source hardware and software in the data center, but smaller organizations often struggle to do so.


Can Armies Of Interns Close The Cybersecurity Skills Gap?

Since cybersecurity is a relatively new field, professionals in the sector tend to pick up expertise on the job. It's only more recently that universities have started seriously ramping up programs. But BullGuard finds that's been happening internationally, not just in the U.S., so it's making moves to tap into those talent pipelines pretty much as soon as they're constructed. With its new Romania-based internship program, Lipman explains, "We took computer science students with cyberexperience in their college studies, and put them into our more innovative projects over the summer. It’s been a real win-win. We get access to new blood [and] fresh thinking, the interns get valuable real-world experience, and we build a relationship with the university." Establishing this ability to "hire straight out of college,"


CQRS for Enterprise Web Development: What's in it for Business?

The CQRS pattern is widely acclaimed by advocates of Domain Driven Design. The approach emphasizes solving business problems first and foremost during the implementation of an application. It centers on thorough elaboration of a business domain and the context within which it will function. The possibility to focus on the business first rather than on the technical issues and work out all the nuances pertinent to a specific domain is achieved through the use of the Ubiquitous language – a single language understood by an implementation team, business analysts, domain experts and other parties involved. The language helps to share the effort among all team members – business and technical – who define and agree on the use of common business objects to describe the solution’s domain model and a certain context within such a model.
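Setting the DDD vocabulary aside, the command/query split at the heart of CQRS can be sketched in a few lines. The class names, the domain (a toy account balance) and the in-memory store are invented for illustration; real CQRS systems typically use separate write and read models, often with different storage.

```python
class AccountCommands:
    """Write side: commands change state and return nothing."""
    def __init__(self, store):
        self.store = store

    def deposit(self, account, amount):
        self.store[account] = self.store.get(account, 0) + amount

class AccountQueries:
    """Read side: queries report state and have no side effects."""
    def __init__(self, store):
        self.store = store

    def balance(self, account):
        return self.store.get(account, 0)

store = {}
commands = AccountCommands(store)
queries = AccountQueries(store)

commands.deposit("acc-1", 100)
print(queries.balance("acc-1"))  # 100
```

Keeping the two paths separate is what later allows each side to be modeled, scaled and optimized independently.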


The changing data protection paradigm

The amount of new data available is staggering. As the Harvard Business Review aptly put it, "More data cross the internet every second than were stored in the entire internet just 20 years ago." This data has varying degrees of value and sensitivity, and resides on a variety of systems, including endpoints, removable media, local servers, cloud servers, and cloud-based services like Box and Dropbox. This growth and spread of data has quickly exceeded the ability of most companies to keep track of it, let alone protect it. This massive influx of data, spread out among various locations, has naturally brought with it increasing security exposures, leading to an almost daily data breach crisis.


NIST launches self-assessment tool for cybersecurity

It's designed to walk organizations through the process of figuring out "how to integrate cybersecurity risk management ... into larger enterprise business practices and processes," Matthew Barrett explained to FedScoop. Barrett is the program manager for the NIST Cybersecurity Framework — a document that catalogues the five areas of cybersecurity every company needs to know: identify, protect, detect, respond and recover. ... "The self-assessment criteria are basic enough that they could apply to organizations of any size," said Barrett. But critics aren't so sure. Larry Clinton, founder and CEO of the alliance, called the excellence builder "a pretty sophisticated tool," but added that meant it was really most useful to larger enterprises.


Ideas for Filling the Cybersecurity Skills Gap

Studies over the years show the struggle in building an IT security staff. For example, a GAO survey earlier this year of federal agencies' CISOs reveals their difficulties in recruiting, hiring and retaining security personnel. Wilshusen says the problem of maintaining a sufficient security staff makes it more challenging for agencies to effectively carry out their responsibilities. In building the federal government's cybersecurity workforce, Pritzker suggests the commission consider recommending a centralized system to recruit, train and place federal cybersecurity personnel as well as creating specialized pay scales to compete with the private sector. "We need to rethink recruitment with bold ideas like debt forgiveness for graduates of certified programs, tuition-free community college in return for federal service and cybersecurity apprenticeships within civilian agencies," the Commerce secretary says.


This Is how you stop ignoring the employee voice

In case you forgot, your employees are human. They are all living, breathing, feeling beings who deserve a bit of human interaction. Take the time to meet regularly and face-to-face with your employees. This not only gives you and your team members a chance to catch up on their performance, but also allows employees to share opinions or issues they are facing. Airing those grievances face-to-face lets employees see their manager’s reaction, as well as have an immediate discussion about what can and will be done. Now you might be thinking, “But an email thread is sooo much easier!” It’s also lazier. And might be negatively affecting employee engagement.


Serverless Architectures: The Evolution of Cloud Computing

Serverless architectures are a natural extension of microservices. Similar to microservices, serverless architecture applications are broken down into specific core components. While microservices may group similar functionality into one service, serverless applications delineate functionality into finer grained components. Custom code is developed and executed as isolated, autonomous, granular functions that run in a stateless compute service. ... For a serverless architecture, the “User” service would be separated into more granular functions. In Figure 2, each API endpoint corresponds to a specific function and file. When a “create user” request is initiated by the client, the entire codebase of the “User” service does not have to run; instead only create_user.js will execute.
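The routing idea (one endpoint mapping to one small, stateless function) can be sketched as follows. This is a Python stand-in for the article's create_user.js example; the route table, handlers and event shapes are invented, and a real deployment would run each handler on a platform such as AWS Lambda behind a managed API gateway.

```python
def create_user(event):
    # Only this function runs for a "create user" request, not the whole
    # "User" service codebase.
    return {"status": 201, "body": {"id": 1, "name": event["name"]}}

def get_user(event):
    return {"status": 200, "body": {"id": event["id"]}}

# Each API endpoint corresponds to exactly one granular function.
ROUTES = {
    ("POST", "/users"): create_user,
    ("GET", "/users/{id}"): get_user,
}

def handle(method, path, event):
    return ROUTES[(method, path)](event)

resp = handle("POST", "/users", {"name": "Ada"})
print(resp["status"])  # 201
```

Because each handler is stateless and independently invocable, the platform can start, scale and bill them per request rather than keeping a whole service resident.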


Why Red Hat is misunderstood amid public cloud worries

Any customer that bets on a cloud stack and uses proprietary APIs is going to have some form of lock-in. That's why OpenStack is such a popular movement. Now let's take those nuances back to Red Hat. "What we're seeing is that large customers see value in running everywhere," said Whitehurst. "These customers want a standard operating environment and want to take Linux with them as they go cloud." Worrywarts about Red Hat will argue that a move to the public cloud means that AWS will get the Linux business. Not necessarily. "As more goes to the public cloud the more relevant we get," Whitehurst argued. "If you are moving to Amazon you have to architect it so you're not locked in. Large enterprises feel burned out about being locked in."



Quote for the day:


"Not all of us can do great things. But we can do small things with great love." -- Mother Teresa


September 22, 2016

Over 6,000 vulnerabilities went unassigned by MITRE's CVE project in 2015

Why does MITRE not have assignments for vulnerabilities identified via other sources? Why haven't the CNAs shared their own disclosures with MITRE so that CVE can reflect the information, instead of leaving entries in RESERVED status, which shows nothing? Why aren’t CNAs assigning IDs to all of the vulnerabilities they disclosed, since some of the unassigned vulnerabilities are in their products? VulnDB shows 14,914 vulnerabilities disclosed in 2015. Within that set, only 8,558 vulnerabilities have CVE-IDs assigned to them. That leaves 6,356 vulnerabilities with no CVE-ID, and likely no representation in a majority of security products. ... While these numbers are bad, what's worse is that the industry has already felt the impact of an attack against a vulnerability that wasn't assigned a CVE-ID.


EastWest Institute Launches Cybersecurity Guide for Technology Buyers

“As cybersecurity vulnerabilities continue to increase, every corporation and government needs guidance to better understand the impact of their purchasing decisions on the security and integrity of their enterprises,” said Steve Nunn, CEO and President, The Open Group. “Every organization should be questioning their suppliers concerning risk management, product development, cyber and supply chain security and best practices. This Buyers Guide supports conformance with international standards and, where appropriate, process-based certification programs that help answer some of these critical questions.”


Lockdown! Harden Windows 10 for maximum security

Windows 10 also introduces Device Guard, technology that flips traditional antivirus on its head. Device Guard locks down Windows 10 devices, relying on whitelists to let only trusted applications be installed. Programs aren’t allowed to run unless they are determined safe by checking the file’s cryptographic signature, which ensures all unsigned applications and malware cannot execute. Device Guard relies on Microsoft’s own Hyper-V virtualization technology to store its whitelists in a shielded virtual machine that system administrators can’t access or tamper with. To take advantage of Device Guard, machines must run Windows 10 Enterprise or Education and support TPM, hardware CPU virtualization, and I/O virtualization. Device Guard relies on Windows hardening such as Secure Boot.
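The whitelist model can be illustrated with a toy sketch. To be clear about the hedge: Device Guard itself validates cryptographic signatures and keeps its policy inside a shielded VM; the content-hash allowlist below only illustrates the "execute nothing that is not trusted" idea, not the actual mechanism.

```python
import hashlib

# Allowlist of trusted application fingerprints (SHA-256 of contents).
ALLOWLIST = set()

def register_trusted(content: bytes) -> None:
    """Add an application's fingerprint to the allowlist."""
    ALLOWLIST.add(hashlib.sha256(content).hexdigest())

def may_execute(content: bytes) -> bool:
    """Default-deny: run only applications whose fingerprint is trusted."""
    return hashlib.sha256(content).hexdigest() in ALLOWLIST

register_trusted(b"signed-application-v1")
print(may_execute(b"signed-application-v1"))  # True
print(may_execute(b"tampered-application"))   # False
```

The inversion the paragraph describes is visible here: instead of flagging known-bad binaries (the antivirus model), everything is denied unless it is on the trusted list.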


What do IT administrator skills mean now?

The role of the IT administrator will definitely need to change as data centers hybridize across multiple types of private and public clouds, stacks of infrastructure converge and hyper-converge, and systems management develops sentience. Of course, change is inevitable. But how can old-school IT administrators stay current and continue providing mastery-level value to their organizations? I'd recommend paying attention to current trends and emerging capabilities. Become an expert in how the organization can best use those trends. ... The future of IT is about creating higher-level value individually while leveraging core expertise widely -- developing the deepest insights, but sharing it as widely as needed to get an optimized return on the IT investment that businesses make.


IBM says: ‘Swift is now ready for the enterprise’

With Swift on the Cloud, enterprises will benefit from faster back-end API performance, safer and more reliable transaction and integration support, and the ability to re-purpose Swift developer skills on the client and server side. This integration delivers tangible benefits to enterprise IT. City Furniture was building an app to handle clearance furniture. They had intended to build their front-end apps in Swift, but were able to work with early versions of the tools IBM introduced today to build the back-end code in the same language. “They were able to build that in an incredibly short time, a few weeks,” he said. City Furniture is a perfect example of the kind of small, nimble development teams that will underpin the future of enterprise IT. “They had one developer and we helped them a bit. That one developer was also able to contribute to the project.”


9 Ways To Ensure Cloud Security

Whether you’ve migrated some or all of your infrastructure to the cloud, or are still considering the move, you should be thinking about security. Too often, organizations assume a certain level of protection from a cloud service provider and don’t take steps to ensure applications and data are just as safe as those housed in the data center. The sheer range of cloud technology has generated an array of new security challenges. From reconciling security policies across hybrid environments to keeping a wary eye on cloud co-tenants, there is no shortage of concerns. An increasingly complex attack landscape only complicates matters and requires security systems that are vigilant and able to adapt. Here are nine tips to consider before, during, and after a cloud migration to stay ahead of the curve when evaluating security solutions for your cloud service.


Cyber Security Threat Detection – The Case for Automation

The good news is that advances in threat detection technology have significantly improved the enterprise’s ability to detect and stop these threats and prevent extensive damage. The challenge, however, is that many of these technologies demand an army of human security analysts to interpret threat indicators and determine the appropriate course of action, including elimination and clean up. With hundreds, if not thousands, of varying levels of threat flags per day, this task is like holding back the tide; it is nearly impossible for security teams to keep up with the flow of information and still perform other ongoing responsibilities in prevention and analysis. Not surprisingly given their frequency, many of these alerts are often ignored.


Taking Risks To Manage Risk: The Life Of The Modern IT Security Executive

Risk isn’t something that many IT security professionals are comfortable with. After all, they’re often employed to reduce the risk of attacks on corporate IT. ... Doing things differently often comes with the risk of failure, which can have negative consequences to a company’s IT security. But the IT security space is dynamic; new technologies, solutions and strategies come out regularly and CISOs need to keep pace with these developments. “The biggest risk at the moment is doing nothing — you’re at risk of becoming irrelevant,” CSIRO CISO and lead architect Angus Vickery said at SINET61. “You have to do something to ensure you’re continually relevant because the horse will bolt without you anyway. ... Modern CISOs need to have an open mind.”


Security framework released for industrial Internet of Things

The security framework goes along with reference architecture, connectivity and other guides previously published by the consortium. This document separates security evaluation into endpoint, communications, monitoring and configuration building blocks, each with implementation best practices. It also breaks the industrial space down into three roles: component builders (who build hardware and software), system builders (better known to readers here as system integrators) and operational users. To ensure end-to-end security, the consortium notes industrial users must assess the level of trustworthiness of a complete system. As for the future, the concluding note in the framework points out that as the sheer volume of data required for managing devices increases, there’s a point where centralized security management ceases to be effective and efficient.
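The framework's point that end-to-end trustworthiness must be assessed across a complete system can be made concrete with a weakest-link model over its four building blocks. The block names follow the consortium's breakdown; the 0–5 scoring scale and the sample scores are illustrative assumptions.

```python
# Security assessment of a hypothetical industrial deployment, scored
# 0 (untrusted) to 5 (fully assured) per building block.
assessment = {
    "endpoint": 4,
    "communications": 3,
    "monitoring": 2,
    "configuration": 4,
}

def system_trustworthiness(blocks):
    """End-to-end trust is bounded by the weakest building block."""
    return min(blocks.values())

level = system_trustworthiness(assessment)
```

Under this model, hardening endpoints further buys nothing while monitoring lags at 2: investment should go to the weakest block first.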


Five Strategies For Creating a Culture of Data Security

When data protection is prioritized and done well, it provides more disciplined operations, increased customer and stakeholder trust, and minimized risk. One of the best ways to protect company information is to create a corporate culture that views information security as a shared responsibility among all employees. This can be done by implementing regular and comprehensive training programs for all employees on the right way to manage, store and destroy physical and digital data. ... Experts suggest that employees may forget 50 percent of training information within one hour of a presentation, 70 percent within 24 hours and an average of 90 percent within a week. When you consider this, it is clear that training once a year or on an ad-hoc basis is insufficient to ensure valuable customer, employee and business data is being protected.




Quote for the day:


"Relative to all the other risks companies face, the cyber risks often aren't as big a deal as we think. It may be bad for you if you are the victim, but it doesn't change the behavior or strategy of a company." -- Sasha Romanosky


September 21, 2016

Five Social Engineering Scams Employees Still Fall For

“Most people are not going to look really closely to know where that email came from, and they click on it and their machine may be taken over by somebody, or infected,” says Ronald Nutter, online security expert and author of The Hackers Are Coming, How to Safely Surf the Internet. “Especially when you’re exchanging files with subcontractors or partners on a project, you really should be using a secure file transfer system so you know where the file came from and that it’s been vetted.” He also cautions recipients to be wary of any file that asks the user to enable macros, which can lead to a system takeover.


How flexible should your infosec model be?

How often to adopt infosec policy changes is a conundrum. Companies need to come up with a way to remain flexible, to ensure that their policies and procedures reflect the current threat landscape, yet they can't hand down so many new rules and restrictions that they frustrate users and inadvertently compel them to consider bypassing corporate rules, explains Kelley Mak, an analyst at Forrester Research. At the same time, companies have to strike a balance between using firefighting tactics to address the most current threats and treating information security policy as a holistic strategy, Mak says. "It's not as simple as taking the data and making a new policy, because you have to make sure information workers aren't upset," he says. "The more restrictions you put in place, the more likely someone is to go around it."


Cybercrime Inc: How hacking gangs are modeling themselves on big business

Like the legitimate software market, cybercrime is now a huge economy in its own right, with people with a range of skillsets working together towards one goal: making money with illicit hacking schemes, malware, ransomware, and more. It's essentially an extension of 'real world' crime into cyberspace, and it's come a long way in recent years as groups have become bigger, more specialized, and more professional. "There's been a substantial amount of improvement and innovation in the way attackers go after networks and, as cybercrime has professionalized, you've seen individuals develop a particular set of skills which fit into a broader network," says Gleicher, now head of cybersecurity strategy at Illumio.


Picking up the pace: The intersection of strategy and agility

Organizational agility, not to be confused with the Agile methodology, is the ability to quickly identify and execute initiatives for opportunities and risks that align with overall strategy. This means that organizations have not only to stay aware of changes in their business environments, but also to be flexible enough to change direction and implement new initiatives quickly, both in order to avoid risks and to achieve competitive advantages. APQC and Strategic and Competitive Intelligence Professionals (SCIP) conducted a survey to look at organizational agility and understand what role strategy has in helping organizations be more agile. To that end, the survey investigated organizations’ agility, strategic planning, information assessment, and implementation practices.


Roundtable: What Experts Are Doing to Protect Against Ransomware

What’s different is that your user population needs to know what to do if a ransom message appears on their screen. Do they power off, disconnect from the network or do both? Your user community has to know exactly what to do. By the way, the right answer is to disconnect from the network and not power off—rely instead on whatever mechanism you have to trigger an incident response. Do not power off. So the users have to know that. Assuming that you have the basic hygiene—the incident response plans, the remediation, the patching, the hardening, the configurations—in place, then the only other additional consideration is that if you don’t have a fast, automatic way of detecting and responding to zero-day malware—either at the network level or at the end point level—you need to get one.
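The procedure above — disconnect from the network, never power off — is simple enough to encode in incident-response tooling. The sketch below plans Linux `ip link` commands for a suspected host; the interface names are assumptions, and `dry_run` defaults to true so the plan can be reviewed before anything executes.

```python
import subprocess

def isolation_plan(interfaces, dry_run=True):
    """Plan (or execute) network isolation for a suspected ransomware host.

    Deliberately contains no shutdown/poweroff step: powering off destroys
    in-memory forensic evidence and pre-empts the incident-response
    mechanism, while dropping the network links stops the spread.
    """
    # `ip link set <iface> down` is the standard Linux way to take an
    # interface offline; running it for real requires root.
    commands = [["ip", "link", "set", iface, "down"] for iface in interfaces]
    if not dry_run:
        for cmd in commands:
            subprocess.run(cmd, check=True)
    return commands

planned = isolation_plan(["eth0", "wlan0"])
```

Wiring a reviewed plan like this into the incident-response trigger keeps users from having to improvise at the keyboard when a ransom note appears.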


The Internet of Things, cyber-security and the role of the CIO

Basically we are inexperienced in creating large platforms with security in mind. This inexperience in deploying mass networks in a secure way could create a recipe for major breaches and security issues. The IoT is a relatively greenfield area in IT. It should offer the chance to design and architect solutions with security integrated right from the start, rather than an additional feature further down the road. Whilst CIOs need to be mindful of this issue for future planning, there is also the opportunity to make sure vendors are building this security into any IT expenditure that the organisation plans to make. Existing security controls may well be able to address these new concerns but they need to be implemented in an agile and effective way to enable them to adapt to the new attack vectors.


Navigating The Muddy Waters Of Enterprise Infosec

Many companies today hope to avoid similar high-profile wakeup calls. After years of news about disastrous breaches, information security has finally gotten the attention of upper management. Two-thirds of 287 U.S. respondents to a survey conducted by CSO, CIO and Computerworld said that senior business executives at their organizations are focusing more attention on infosec than they were in the past. And most of the respondents said they expect that focus to continue. Yet IT leaders still face challenges when it comes to aligning security goals with the needs of business, including justifying costs, defining risks, and clarifying roles and responsibilities.


Artificial intelligence, APIs and the transformation of computer science

Like yesterday’s code libraries, you could try to build A.I. platforms yourself -- if you had a few years and a dozen data scientists to throw at the problem. Or you can access A.I. engines like IBM’s Watson or Google’s TensorFlow “as-a-service,” taking advantage of the planet’s most advanced, fundamental CS work via an API call. When one looks at the world of software in this way, the choice for most companies today is straightforward: spend years of effort and millions of dollars in expense duplicating extremely important -- but ultimately commodity, especially once it’s open-sourced -- computer science work, or instead focus on leveraging that work to develop and improve their own products and intellectual property. For most businesses, the choice is simple.
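The "advanced CS work behind an API call" point is worth seeing at the code level: consuming a hosted A.I. engine reduces to one HTTP request. The endpoint, key, and payload shape below are hypothetical stand-ins, not the actual Watson or TensorFlow-serving interfaces.

```python
import json
import urllib.request

# Hypothetical hosted-inference endpoint and credentials.
ENDPOINT = "https://api.example.com/v1/classify"
API_KEY = "YOUR-KEY-HERE"

def build_inference_request(text):
    """Package a classification task as a single authenticated POST."""
    payload = json.dumps({"input": text}).encode("utf-8")
    return urllib.request.Request(
        ENDPOINT,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_inference_request("Is this invoice email a phishing attempt?")
# urllib.request.urlopen(req) would send it; years of fundamental CS
# research sit on the other side of that one call.
```

That asymmetry — millions of dollars of research consumed via a few lines of client code — is exactly why duplicating the underlying work rarely makes sense for most businesses.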


In a world of free operating systems, can Windows 10 survive?

We can all pretty much agree that Windows has some staying power. That said, when I asked our resident Windows soothsayer Ed Bott about actual numbers of users, he told me, "Given that PC sales are flat or down in recent years and are probably close to the replacement rate, it's likely that the very large Windows installed base is shrinking slowly." The operative word here isn't "shrinking," it's "slowly." There are millions of users out there who have good reason to stick with Windows. Many of them will continue using it because the learning curve for a different operating system is either too much work, or just simply unnecessary. Others will stay with it because Chromebooks, tablets, and other "appliance-like" machines just don't have enough power and flexibility.


How HR and IT departments can join forces to bolster security strategies

Working with IT, HR should establish processes to manage access rights to sensitive data – ensuring that appropriate controls are in place – and preventing employees from accessing data that they don’t need. HR can also support IT in identifying gaps in terms of departments or individuals, like contractors or temporary staff, with permissions that have not been withdrawn or privileges that may need to be re-defined. They can implement processes and technology for managing access rights and ensure that these are regularly audited to close any security gaps. Full co-operation between HR and IT is essential in projects of strategic importance such as IAM (Identity Access Management) deployments. A lack of such co-operation is a common pitfall: without it there can be misunderstandings, or at worst, projects can unravel entirely.
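The joint HR/IT audit described above — catching permissions that were never withdrawn and grants that exceed a role — is mechanical once roles and grants are recorded. The roles, users, and permission names in this sketch are all hypothetical.

```python
# Role definitions an HR/IT joint audit might maintain (illustrative).
ROLE_PERMISSIONS = {
    "engineer": {"repo", "staging"},
    "contractor": {"repo"},
    "hr": {"hr_records"},
}

# Account records combining HR status ("active") with IT grants.
accounts = [
    {"user": "alice", "role": "engineer", "granted": {"repo", "staging"}, "active": True},
    {"user": "bob", "role": "contractor", "granted": {"repo", "hr_records"}, "active": True},
    {"user": "carol", "role": "hr", "granted": {"hr_records"}, "active": False},
]

def audit(accounts, role_permissions):
    """Flag permissions beyond the role, and live grants for departed staff."""
    findings = []
    for acct in accounts:
        excess = acct["granted"] - role_permissions[acct["role"]]
        if excess:
            findings.append((acct["user"], "excess-permission", excess))
        if not acct["active"] and acct["granted"]:
            findings.append((acct["user"], "stale-account", acct["granted"]))
    return findings

issues = audit(accounts, ROLE_PERMISSIONS)
```

Here the audit surfaces exactly the two gaps the article warns about: a contractor holding HR data he doesn't need, and a departed employee whose access was never withdrawn.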



Quote for the day:


"Negativity will derail you from pursuing success, and like attracts like." -- Kathleen Elkins