Daily Tech Digest - November 02, 2016

Tech Bytes - Daily Digest: November 02, 2016

The biggest cyber security threat is right under our noses, Will digital economy create a developer shortage, What big data is doing for shipping on a global scale, Defending against insider data breaches, Bitcoin isn't anonymous enough to be a currency and more.

The Biggest Cybersecurity Threat is Right Under Our Noses

Technology is advancing at a rate where the convergence of progress in multiple areas is finally making it possible to detect malicious insiders. The cost of storing data continues to fall. The processing capability of servers to sift through data keeps marching forward. And advances in machine learning (artificial intelligence) make it possible to make sense of the data in meaningful ways. It is this confluence of massive, secure, scalable computing at low cost, combined with exponential advances in algorithms, that has made a breakthrough AI cybersecurity solution like Cognetyx possible. Take one of the toughest scenarios as an example: say an employee of a hospital decides, for whatever reason, to steal patient data. Maybe they hold a grudge against their boss. Perhaps they are going to sell the data.
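The kind of insider detection described above comes down to spotting behaviour that deviates from a user's own baseline. A minimal sketch of the idea in Python — the per-user statistics and the three-sigma threshold are illustrative assumptions, not Cognetyx's actual algorithm:

```python
import statistics

def flag_anomalous_access(baseline, today, threshold=3.0):
    """Flag users whose record-access count today sits more than `threshold`
    standard deviations above their own historical mean."""
    flagged = []
    for user, history in baseline.items():
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history) or 1.0   # guard against zero spread
        if (today.get(user, 0) - mean) / stdev > threshold:
            flagged.append(user)
    return flagged

baseline = {
    "nurse_a": [20, 22, 19, 21, 20],   # typical daily patient-record lookups
    "clerk_b": [5, 6, 5, 4, 5],
}
today = {"nurse_a": 21, "clerk_b": 250}   # clerk_b suddenly pulls 250 records
print(flag_anomalous_access(baseline, today))  # ['clerk_b']
```

A production system would of course model far richer signals (time of day, record sensitivity, peer-group behaviour), but the shape of the computation is the same.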


6 trends that will shape cloud computing in 2017

The global public cloud market will top $146 billion in 2017, up from just $87 billion in 2015, and is growing at a 22 percent compound annual growth rate. The lion’s share of this growth will come from Amazon.com, Microsoft, Google and IBM, which have emerged as “mega-cloud providers,” Bartoletti says. They are opening new data centers and making concessions, such as Microsoft’s agreement to have T-Systems manage its cloud in Germany to meet data localization requirements. But the big players won’t be able to service every unique request, which means smaller regional players will see an uptick in adoption in 2017. Bartoletti recommends: “Keep your options open and don’t be afraid to use multiple providers.”


Will Digital Economy Create A Developer Shortage?

According to Sam Ramji, CEO of the Cloud Foundry Foundation, the companies that don't see a talent gap are not those furthest behind in the move to the digital economy. They are the ones that have been functioning as part of it for several years (companies such as Amazon, eBay, Google, Apple, and Netflix) and are attracting talent because of their position in the economy. The gap shows up more clearly in companies that are still dominated by their legacy systems, he noted. He said his impression that this might be the case was confirmed when Netflix made some of its internal code for managing AWS cloud operations available as open source. When he asked Netflix officials why they released home-grown code, they explained that it would make knowledge of what they were doing more widespread.


What Big Data is Doing for Shipping on a Global Scale

Another important way that Big Data can help optimize shipping on a global scale is by providing the information companies need to better manage multi-stop routing, a logistical nightmare for any industry, and shipping is no exception. Big Data allows shipping companies to take a mathematical approach to determining where containers should be placed on the ship. By using data to place containers where they can be reached at the proper time, the entire process can be streamlined to run more effectively and efficiently, not only making the company more productive but also saving the shipping company money.
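The container-placement idea can be sketched as a simple greedy rule: since a stack unloads top-down, cargo for earlier ports should be loaded last. A hypothetical Python sketch (the field names and the single-stack model are illustrative simplifications):

```python
def loading_order(containers, port_order):
    """Load cargo for later ports first, so the first port's cargo is on top.
    Python's sort is stable, so ties keep their original order."""
    rank = {port: i for i, port in enumerate(port_order)}
    return sorted(containers, key=lambda c: rank[c["dest"]], reverse=True)

containers = [
    {"id": "C1", "dest": "Rotterdam"},
    {"id": "C2", "dest": "Singapore"},
    {"id": "C3", "dest": "Rotterdam"},
    {"id": "C4", "dest": "Dubai"},
]
route = ["Singapore", "Dubai", "Rotterdam"]   # ports visited in this order
order = loading_order(containers, route)
print([c["id"] for c in order])  # ['C1', 'C3', 'C4', 'C2'] - C2 ends up on top
```

Real stowage planning is a constrained optimization problem (weight distribution, hazardous-cargo rules, crane scheduling), but this captures the core "reachable at the proper time" intuition.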


ALM and DevOps tooling still a critical part of orchestration

With continuous development and DevOps integration, the DevOps models or recipes associated with each ALM phase have to be designed following the ALM processes. Then it can be codified in DevOps language -- declarative or imperative, as appropriate. When development changes are made to an application, component or service, the changes not only have to be tested in terms of application functionality, security and compliance, but also in how they impact the integration between ALM and DevOps. The tight coupling of development, ALM and DevOps demanded by continuous delivery has changed DevOps already. The two most popular tools, the imperative Chef and declarative Puppet, have both evolved to support modular declarations of resources.


Defending against insider data breaches

Internal data breaches have the potential to damage reputation and incur significant financial loss – not only for law firms but for clients too. As highlighted by the Panama Papers, the impact of an insider should not be underestimated. An anonymous source within the Panamanian law firm Mossack Fonseca was able to leak an unprecedented 11.5 million documents over the course of a year, with consequences that reverberated across the globe. Of course, this is an extreme example, but it does serve to highlight the danger posed by an insider who can go undetected for long periods of time. While there is no silver bullet, there are steps that every law firm can take to reduce the risk of internal data leakage – and these aren’t constrained to the IT department.


Bitcoin Isn’t Anonymous Enough to Be a Currency

On the surface, privacy-preserving cryptocurrencies seem designed precisely to undermine such controls. Monero mixes multiple transactions together so that a source cannot be directly linked to a destination. Zcash creates shielded transactions where everything is hidden except for a string of data that proves the transaction is valid. Bitcoin also plans to add some of these features in the near future. As bad as it looks, though, developers aren’t creating anonymous payment systems because they want to help criminals evade the law. They're doing it because that’s the only way a decentralized currency can work. If, say, users have to evaluate the acceptability of each bitcoin based on its transaction history, then one coin can be worth more than another and the currency loses its reason for existence.


How Advanced Technology Can Save Us From Future Internet Shutdowns

“One self-help mechanism would be for a ‘good’ hacker to write a virus that finds insecure devices and simply disables them. This would remove insecure devices from the pool of computers that could be used as bots,” Eli Dourado, technology policy director at the Mercatus Center, told TheDCNF. “It would be inconvenient to consumers whose devices suddenly stopped working, but that inconvenience may be necessary to prevent more serious attacks in the future.” There are also network security services available, like Cloudflare and Akamai, but they can be expensive, said Ryan Hagemann, technology and civil liberties policy analyst at the Niskanen Center. “As with any decision, a company or individual will need to assess whether the benefits of employing such a service outweigh the costs.”


How To Find The Best Wi-Fi Router For A Home Office

Before you rush out to buy an expensive Wi-Fi router with MIMO, you should know that to utilize that speedy wireless, your Wi-Fi devices must also support the tech. Unfortunately, the majority of today's Wi-Fi devices, including smartphones and tablets, only support one or two spatial streams, and they won't be able to take full advantage of Wi-Fi routers with more streams. The same thing applies to MU-MIMO routers, because only a handful of mobile devices available today support the tech. In some cases, it may make sense to buy a more affordable Wi-Fi router that delivers optimal performance with your existing devices, and then later opt for a more advanced router when you upgrade to phones, tablets or computers that support MIMO.


Cisco says it'll make IoT safe because it owns the network

Within the next year, Cisco will launch a program to certify IoT devices as compatible with its network-based software. Among other things, the software should be able to automatically authorize these devices on a “white-list” basis, allowing only endpoints that are safe instead of trying to find and block those that are not. Devices themselves will play a role here, telling the network what kinds of things they should be able to do, such as only connecting to the home server for the service it provides. This approach might help to prevent devastating events like the recent Mirai botnet attack that employed thousands of insecure internet-connected cameras. But the IoT onboarding and management capabilities go beyond security to include automation of other tasks like network configuration that administrators would otherwise have to do.
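The white-list model described here can be sketched in a few lines: the network consults a profile of what each certified device model is allowed to do (in the spirit of the IETF's Manufacturer Usage Description work) and denies everything else. The device names and profiles below are invented for illustration, not Cisco's actual API:

```python
# Hypothetical allow-list: each certified device model declares the only
# endpoints it should ever contact; everything else is denied by default.
DEVICE_PROFILES = {
    "acme-camera-v2": {"video.acme.example", "fw-update.acme.example"},
}

def admit(device_model, destination):
    """Permit traffic only for certified models talking to declared endpoints."""
    allowed = DEVICE_PROFILES.get(device_model)
    return allowed is not None and destination in allowed

print(admit("acme-camera-v2", "video.acme.example"))  # True
print(admit("acme-camera-v2", "botnet-c2.example"))   # False
print(admit("unknown-widget", "video.acme.example"))  # False
```

The default-deny stance is the point: a compromised camera trying to join a Mirai-style botnet simply has nowhere to connect.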



Quote for the day:


"The aim of education should be to teach the child to think, not what to think." -- Indira Gandhi


Daily Tech Digest - November 01, 2016

Tech Bytes - Daily Digest: November 01, 2016

Devops engineer skills needed for continuous deployment, Digital radically disrupts HR, Microservices governance requires standards, security & scrutiny, Why don't all businesses have a good continuity strategy, Red Hat EMEA chief sees opportunities in shifting markets and more.

17 essential tools to protect your online identity, privacy

Most users know the basics of computer privacy and safety when using the internet, including running HTTPS and two-factor authentication whenever possible, and checking haveibeenpwned.com to verify whether their email addresses or user names and passwords have been compromised by a known attack. But these days, computer users should go well beyond tightening their social media account settings. The security elite run a variety of programs, tools, and specialized hardware to ensure their privacy and security are as strong as they can be. Here, we take a look at this set of tools, beginning with those that provide the broadest security coverage down to each specific application for a particular purpose. Use any, or all, of these tools to protect your privacy and have the best computer security possible.
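One of the basics mentioned above is two-factor authentication; the most common second factor is a TOTP code, computed from a shared secret and the current time as specified in RFC 6238. A self-contained sketch:

```python
import hashlib, hmac, struct, time

def totp(secret: bytes, for_time=None, digits=6, step=30):
    """RFC 6238 TOTP: HMAC-SHA1 over the number of 30-second steps elapsed."""
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 Appendix B test vector: ASCII secret, time = 59 seconds
print(totp(b"12345678901234567890", for_time=59, digits=8))  # 94287082
```

The final line reproduces the published RFC 6238 test vector, which is a handy sanity check for any TOTP implementation.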


DevOps engineer skills needed for continuous deployment

Speed and fluidity are the hallmarks of a DevOps culture -- code is always changing, and it takes sound collaboration and version management skills to assemble the correct components and craft a release that runs. DevOps engineers work with tools such as Git, Perforce and Apache Subversion for version and revision control. To better deploy this ever-changing code, many DevOps engineers embrace configuration management, which is almost always automated to accelerate the pace of new version releases. Many DevOps engineers are experts with tools such as Puppet, Chef and Vagrant. DevOps engineers don't just shepherd code through development; they also provide the bridge needed to facilitate those new releases on the operations side.


Digital Radically Disrupts HR

Technology advances are enabling HR to put the “human” back into human resources, and helping give people management back to the people. This could include involving employees and managers in high-impact talent processes—including recruiting, hiring, succession planning, learning and shaping career paths. ... Just as digital changed marketing by enabling customization of products and messages, digital is similarly transforming HR. Digital can now be used to push out customized offerings, including learning and job opportunities, targeted, personalized messages, or personalized information based on an analysis of an individual’s social media digital trail and artificial intelligence that predict what an individual needs and values based on their unique employee segment.


Data Science Predictions for 2017

Data is now creating opportunities for business growth and profit like never before. In the last decade, the emergence of advanced data technologies and superior analytics tools has made it possible for business operators to reap numerous benefits from their data assets, yet most have only just scratched the surface of data’s potential. Data Science is allowing enterprises to successfully leverage that potential like never before. A particular McKinsey report published in 2013 predicted that the global business community would feel the pinch of an acute shortage of Data Science professionals for the next decade, specifically a shortage of “1.5 million analysts” skilled at deriving competitive intelligence from the vast amounts of static and dynamic (real-time) data.


Microservices governance requires standards, security and scrutiny

"It is important to look at governance holistically as not only microservices management during runtime, but also as an inculcation of best behavior within domain teams during design and development," Kohli said. While the first part can be addressed through APIs, best practices can be more difficult since they deal with the human element. Things like posting microservices on a collaboration hub and encouraging merit-based reuse with reviews and ratings can help, he said. Ultimately, the popularity of microservices will require standards, which will likely stem from collaboration between companies in the cloud computing space. Until then, products do exist to help shore up security issues and ensure that the microservices are flexible enough to meet the needs of the company.


The Shifting Cyber Attack Target Set And Why It Matters To The Mid-Sized Business

What is troubling about this is that smaller firms have a much harder time recovering from an attack. A breach requires cleanup, forensics and notification of employees, customers and clients; it causes costly damage to the brand, may diminish goodwill, and can result in direct financial loss. In fact, according to surveys by the National Cyber Security Alliance, approximately 60% of small businesses that fall victim to a cybercrime go out of business within six months of the attack. Digital life is unfair. The big guys like Target, Home Depot and eBay all recovered from massive cyber attacks with no noticeable impact on share price a year after their attacks. But mid-sized businesses may well be driven to bankruptcy.


Why don’t all businesses have a good continuity strategy?

Most businesses are familiar with the idea of data backup, but a proper disaster recovery strategy goes beyond data. Businesses often have data backed up, but don’t consider the systems that rely on that data. What use is data if a disaster renders the IT infrastructure inaccessible? While data backup is essential, it serves little purpose when all of your applications and systems are out of commission. Disaster recovery is usually a manual process, in which IT teams are on-call and recovery time is dependent upon how quickly they can restore service. A more effective continuity strategy takes the full implications of downtime into account. Downtime means a hit to the bottom line. It means employees getting paid to wait for crucial systems to come back online. It means your customers going elsewhere.


Here's How Businesses Can Prevent Point-of-Sale Attacks

Typically, point-of-sale malware works by reading payment data the moment the card is swiped through the retail checkout machine. It does this by scraping the RAM of the point-of-sale terminal, where the payment data can be unencrypted. "The malware techniques are evolving all the time," Rice said. Criminals also understand that retailers are continually updating their point-of-sale machines for pricing or inventory reasons. "So they (the hackers) are using a variety of vulnerabilities to insert the malware into the system," he added. However, businesses are far less vulnerable to any data breach if they move to end-to-end encryption, according to Rice. That means encrypting the customer's data throughout the entire payment process, including the moment the credit card is swiped.
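A common complement to end-to-end encryption is tokenization: the card number is replaced at the reader with a random token, so the terminal's RAM never holds usable card data for malware to scrape. A toy sketch (a real vault is a hardened, off-terminal service; the class below is purely illustrative):

```python
import secrets

class TokenVault:
    """Toy token vault: the PAN is swapped for a random token at the reader,
    so RAM-scraping the POS terminal yields nothing chargeable."""
    def __init__(self):
        self._vault = {}            # token -> PAN, kept off the terminal
    def tokenize(self, pan):
        token = secrets.token_hex(8)
        self._vault[token] = pan
        return token
    def detokenize(self, token):
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")   # a well-known test PAN
print(token != "4111111111111111")           # True: memory never holds the PAN
```

In practice the tokenization (or encryption) happens inside the tamper-resistant card reader itself, before the data ever reaches the terminal's general-purpose memory.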


Zcash, a Harder-to-Trace Virtual Currency, Generates Price Frenzy

The privacy features of Zcash could make it harder for the currency to win support from regulators and bankers. Investigators have used Bitcoin’s ledger, known as the blockchain, to track down some people selling drugs for Bitcoins on black market websites. Such websites have proliferated since the first popular black market site, the Silk Road, was taken down in late 2013. Since the demise of the Silk Road, mainstream financial institutions have shown significant interest in virtual currencies and particularly in the blockchain technology, which provides a new decentralized way to keep financial records and to power transactions of all sorts. Major central banks have recently been talking about using the technology for their own currencies.


Red Hat EMEA chief sees opportunities in shifting markets

OpenShift is key here and Knoblich nominates it as “becoming the hottest product for Red Hat, not necessarily for revenue but in terms of interest, proofs of concept and net new customers.” Here, Knoblich sees Linux becoming the common denominator underlying physical servers, virtual servers, private clouds and public clouds. There is a different buying audience for OpenShift, where DevOps is at the heart of activity, and it is one the company is quite comfortable with. A related opportunity lies in JBoss middleware, where Knoblich says some firms are swapping out BEA WebLogic for JBoss. In telecoms, Red Hat is helping carriers virtualise their networks, having created a unit focused purely on telco, and banking is another market where restructuring of the sector will lead to a requirement for agility.



Quote for the day:

"The basic story is that we have been gradually losing our privacy in a whole bunch of ways that people don't appreciate." -- Matthew Green

Daily Tech Digest - October 31, 2016

Tech Bytes - Daily Digest: October 31, 2016

Algorithms built on digital footprints will revolutionize lending, Will the fourth industrial revolution have a human heart, Why big data leaders must worry about IoT security, 5 mistakes to avoid when building the business case for IT, Financial sector urged to strengthen governance and more.

Algorithms built on digital footprints will revolutionize lending

Fintech will also be very influential in the area of payment banks. One of the reasons for the slow growth of bank lending is that there are very few banks that have licenses. The deposit rate and lending rate are quite high, so there’s a lot of sticky money in the economy. Until now, there were no competitive dynamics to upset the apple cart. Now with the emergence of fintech firms on the credit side and the rise of payment banks on the deposit side, the established banks will face pressure on both the credit and deposit sides. This will also mean that the customer will benefit both ways, from a higher deposit rate and from lower interest rates on loans, and at the same time from better access to both. So, fintech will change the nature of existing businesses in areas of transaction, credit, and deposits.


5 Lessons We Learned on Our Way to Centralized Authentication

In many startups, centralized authentication is a "future us" problem. Setting up centralized auth is useful for managing your network, but requires time, domain knowledge, and patience to get many of the technical solutions working. Compare this with the ease of user management via configuration management (CM) tools that your DevOps teams are already using — they work well enough (and did we mention that they're already in place?) — so it makes total sense that many organizations "punt" on this issue. However, once your organization grows to a certain size, managing users through CM can be a hassle. For one thing, not all systems are going to rely on UNIX authentication (such as Jenkins, Grafana, etc.), so you’ll need to start configuring those separately and possibly outside of your CM platform.


Will the Fourth Industrial Revolution Have a Human Heart?

In this new industrial revolution, it is believed that robots and humans could be living and working together a lot more. This raises questions of trust. A good example: if a person faced with an illness were prescribed different drugs and care strategies by a robot doctor and a human doctor, it would be hard to know whom to trust. Another example: if you were arrested for a crime you didn’t commit, would you rather be tried by a robot or a human judge? These are questions we may face sooner than you think. In fact, in some cases it is already happening. Some believe that there could end up being conflict between people and robots. This could have two potential outcomes. One would be an economic struggle in which humanity is destroyed at its core.


Getting data privacy and security right is 'paramount' to success of open banking, says regulator

"To ensure that enough time is available to work through the important details of this remedy, particularly those that ensure that customers’ data is secure at all times, we are requiring that the release of information under this remedy takes place in stages," Smith said. "The least sensitive information – for example about banks’ prices, terms and conditions and branch location – will be made available by the end of March 2017. We expect that all aspects of an open banking standard will be up and running in early 2018 to coincide with the implementation of the second Payment Systems Directive (PSD2)." Smith described the CMA's open banking plans as "the most fundamental" of its remedies from its market review and said open APIs have the potential to "transform the financial services sector".


An absolute beginner’s guide to machine learning, deep learning, and AI

Here’s a simplistic breakdown: a neural network consists of several layers of neurons. Inputs are passed into the first layer. Individual neurons receive the inputs, give each of them a weight, and produce an output based on those weights. The outputs from the first layer are then passed into the second layer to be processed, and so on, until the final output is produced. Then the magic happens. Whoever runs the network defines what the “correct” final output should be. Each time data is passed through the network, the end result is compared with the “correct” one, and tweaks are made to the weights until the network produces the correct final output each time. The network, in effect, trains itself. This artificial brain can learn how to identify chairs from photos, for example. Over time, it will learn what the characteristics of chairs are and increase its probability of identifying them.
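The training loop described above can be shown concretely with a single sigmoid neuron learning an OR gate; the "tweaks to the weights" are plain gradient-descent updates (the learning rate and epoch count are arbitrary choices for this sketch):

```python
import math, random

# A single sigmoid neuron learning an OR gate. Each pass compares the output
# with the "correct" answer and nudges the weights: gradient descent.
random.seed(0)
w = [random.uniform(-1, 1), random.uniform(-1, 1)]
b = 0.0
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

def forward(x):
    return 1 / (1 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))

for _ in range(5000):
    for x, target in data:
        out = forward(x)
        grad = (out - target) * out * (1 - out)   # error * sigmoid derivative
        w[0] -= 0.5 * grad * x[0]                 # tweak the weights...
        w[1] -= 0.5 * grad * x[1]
        b -= 0.5 * grad                           # ...until outputs are correct

print([round(forward(x)) for x, _ in data])  # [0, 1, 1, 1]
```

A deep network is this same update rule applied layer by layer (backpropagation), over millions of weights instead of three.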


Why big data leaders must worry about IoT security

One problem facing companies that use or are planning to use IoT with their big data plans is that there is currently no consensus on how to implement security on an IoT device. This lack of consensus is an issue for standards committees to resolve, not for corporate IT to address. So what do you do if your company is using or planning to use IoT? Follow these steps. First, identify all of your IoT exposure points for hacks and breaches, and write and enact a plan for regularly monitoring them. This monitoring should occur at two levels: regular physical inspections of devices, and continuous software-based monitoring and logging of emissions from these devices, conducted by a network-based system. If unusual activity from a device is detected at any time, there should be a way to immediately shut down that device.
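The monitoring-and-shutdown step can be sketched as a network-side check against an assumed per-device traffic baseline, quarantining anything unknown or wildly deviating (the device names, byte counts and 10x factor below are made up for illustration):

```python
# Network-side monitor sketch: log each device's outbound volume and shut
# down (quarantine) anything unknown or far above its assumed baseline.
BASELINES = {"thermostat-01": 2_000, "camera-07": 50_000}   # bytes/minute

quarantined = set()

def observe(device, bytes_per_min, factor=10):
    baseline = BASELINES.get(device)
    if baseline is None or bytes_per_min > factor * baseline:
        quarantined.add(device)     # unusual activity: shut the device down
        return "quarantined"
    return "ok"

print(observe("thermostat-01", 1_500))   # ok
print(observe("camera-07", 900_000))     # quarantined - a DDoS-style burst
```

A real deployment would hang this logic off flow logs or a network access control system, but the pattern of baseline, deviation check and automatic isolation is the same.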


5 mistakes to avoid when building the business case for IT

Now, more than ever before, companies are looking closely at the impact of IT spending on their bottom line. Economic pressures, coupled with years of heavy IT spending without clear returns, have driven corporate demands for a tighter rein on IT expenditures and clear justification of every dollar being spent. Technology and finance decision makers need metrics and measures they can trust to ensure that they are making IT decisions that will have a positive impact on the corporate bottom line. Although the buzzwords may have changed and the expectations for payback and risk have become more precise, the path to a credible business case hasn’t changed. Building a business case for a tech investment isn’t difficult; it’s just structured: identify the top areas of benefit, quantify the costs and benefits, and calculate the metrics.
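The closing recipe (quantify costs and benefits, then calculate the metrics) maps directly onto a few standard formulas: net present value, simple ROI and payback period. A toy calculation with invented figures:

```python
def business_case(initial_cost, annual_benefit, annual_cost, years, discount=0.10):
    """Toy IT business-case metrics: NPV, simple ROI and payback period."""
    net = annual_benefit - annual_cost
    npv = -initial_cost + sum(net / (1 + discount) ** t for t in range(1, years + 1))
    roi = (net * years - initial_cost) / initial_cost
    payback_years = initial_cost / net
    return round(npv, 2), round(roi, 2), round(payback_years, 2)

# e.g. a $500k project returning $250k/yr in benefit against $50k/yr run cost
print(business_case(500_000, 250_000, 50_000, years=5))  # (258157.35, 1.0, 2.5)
```

The hard part in practice is not the arithmetic but defending the benefit estimates that go into it.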


Financial sector urged to strengthen governance

According to Rwangombwa, banks can promote corporate governance by harnessing the relationship between management, shareholders and other stakeholders. He added that the structures through which a firm’s objectives are set, and the means of attaining those goals, and constant performance monitoring play a critical role in strengthening governance. “It is, therefore, important to ensure timely and accurate disclosure on all matters, including your financial health, performance, ownership and governance,” he said. He argued that financial institutions are unique and should uphold public trust to succeed. Rwangombwa noted that the concept of corporate governance is relatively new, adding that even some directors do not understand the ‘heavy’ responsibilities of a director.


Visa Taps Blockchain for Cross-Border Payment Plan

Visa and Chain’s system represents a brand-new effort to challenge the Swift electronic messaging network as the dominant method for moving large sums of money across borders between banks on behalf of companies. Swift has been the subject of recent high-profile hacks and is under intensive regulatory scrutiny. But cross-border payments are still a lucrative business for banks. Visa, which is trying to become a more relevant alternative in the space, will offer the product to its member banks starting next year as a tool to provide to their business customers. The California-based network operator is best known for enabling credit and debit card payments.


What Really Happens When You Run IT Like a Business?

As an Enterprise Architect working in IT management, you have the heavy task of aligning different organisational silos with architectural frameworks, industry standards and best practices. While each existing framework and standard has its own intended points of focus, they all share the same restrictive principle: they take a “toolbox” approach, where the more content you have in your framework, the more value you provide to architecture practitioners – but only within the particular framework, without taking into account the need to provide insight into how it connects to the broader environment. It is therefore difficult for practitioners to implement the frameworks, understand how to integrate multiple frameworks, or decide what to prioritize for the benefit of the organisation.



Quote for the day:


"If someone likes your idea the first time you explain it, your idea isn't risky enough." -- Nicolas Cole


Daily Tech Digest - October 30, 2016

Tech Bytes - Daily Digest: October 30, 2016

Shareholders sue companies for lying about cyber security, Can anyone keep us safe from a weaponized IoT, The top reason digital transformations fail, Experts share their cybersecurity horror stories, Important tips for updating your breach response plan, Actionable agile tools and more.

Shareholders Sue Companies For Lying About Cyber Security

Directors owe fiduciary duties to their shareholders and have an important role in overseeing corporate risk management, which is now understood to include cyber security risk. There are two ways that breaches can give rise to suits in this context. The first involves a board making an affirmative decision regarding cyber security that permitted a breach – say, putting a woefully inadequate security system in place, or just delegating the whole issue to IT. The second scenario would involve the failure to take any precautions at all. Because it is established that a board has a duty with respect to cybersecurity, doing nothing about the risk would land you in trouble.


Can anyone keep us safe from a weaponized ‘Internet of Things?’

The big problem is that too many of those connected products come with lax security features that make them juicy targets for hackers, according to Herzberg. For instance, cheap Internet of Things devices are often secured with default passwords and may lack support for security updates. And the rapid expansion of the Internet of Things market means even more vulnerable devices are likely to be in use soon: By 2020, there will be over 20 billion Internet of Things devices online, according to one estimate from analysis firm Gartner. ... “It would be great if we could say, 'If you want to produce a device connected to the Internet you must go through basic security checks,’ but we don’t have that right now,” he said.


Synchronous vs. asynchronous communication: The differences

A key challenge in asynchronous execution is ensuring that the clocks of all participants and constituent components or modules remain synchronized. For human interaction, such as a live chat session, such skew is not important. However, in synchronous execution, read-and-write storage operations are likely to occur milliseconds (or less) apart, so proper clock synchronization is essential in guaranteeing that I/O operations occur in the correct order. Another challenge is the need to correlate multiple data streams that encompass both synchronous and asynchronous collection methods. Especially acute in the area of data mining and streaming analytics, dealing with this issue through the technique of singular value decomposition was examined in research first published in 2002.
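When wall clocks cannot be trusted to order operations, a logical clock is the classic alternative: Lamport timestamps order events by causality rather than by skew-prone wall time. A minimal sketch:

```python
class LamportClock:
    """Orders events causally without relying on synchronized wall clocks."""
    def __init__(self):
        self.time = 0
    def tick(self):                  # local event, e.g. a write
        self.time += 1
        return self.time
    def receive(self, sent_time):    # event observed from another component
        self.time = max(self.time, sent_time) + 1
        return self.time

a, b = LamportClock(), LamportClock()
t_write = a.tick()             # component A performs a write
t_read = b.receive(t_write)    # component B sees the write before reading
print(t_write, t_read)         # 1 2 -> the read is ordered after the write
```

This guarantees that causally related I/O operations get increasing timestamps, which is exactly the ordering property the paragraph says millisecond-apart storage operations require.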


Why and How to Test Logging

We should not spend time testing the logging subsystem itself, such as log4net, log4j, etc.; we should assume that the mechanics of logging (writing to disk, rotating log files, flushing buffers, etc.) are already handled. Instead, we should concentrate on ensuring three separate but related things ... Of course, by checking for these things, we exercise the logging subsystem and implicitly test that too. By addressing logging as a testable system component, we also tend to reduce the ‘time-to-detect’ for problems, increase team engagement, enhance collaboration, and increase software operability. We need to define a set of event type IDs that correspond to useful and interesting actions or execution points in our software. Exactly how many of these IDs you use depends on your software.
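Treating logging as a testable component is straightforward with Python's stdlib: capture records with an in-memory handler and assert on event-type IDs rather than on message text. The IDs and the place_order function below are invented for illustration:

```python
import logging

EVT_ORDER_PLACED = 1001     # assumed event-type IDs for interesting actions
EVT_PAYMENT_FAILED = 2002

def place_order(log, ok=True):
    log.info("order placed", extra={"event_id": EVT_ORDER_PLACED})
    if not ok:
        log.error("payment failed", extra={"event_id": EVT_PAYMENT_FAILED})

# Capture records in memory instead of exercising the logging backend itself.
records = []
handler = logging.Handler()
handler.emit = records.append
log = logging.getLogger("shop")
log.setLevel(logging.INFO)
log.propagate = False        # keep the test self-contained
log.addHandler(handler)

place_order(log, ok=False)
ids = [r.event_id for r in records]
print(ids)  # [1001, 2002]
```

Asserting on stable event IDs instead of message strings means wording changes do not break the tests, while missing or misordered events still fail loudly.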


The Top Reason Digital Transformations Fail

Despite awareness of the importance of digital technology and business models, we continue to see that most leaders don’t know how to lead a digital transformation. Many work to enable others in their organizations, but this often results in disjointed, independent, tactical initiatives, which are costly and go nowhere, creating bad blood inside and outside the organization. To be successful, digital platforms need to be unified across the organization, spanning every division, product, service and supplier. Doing this takes real leadership and board support. That doesn’t mean that it always looks the same, though. If you look at how big players are approaching digital transformation, you can see different approaches playing out.


Apollo Hospitals uses big data analytics to control Hospital Acquired Infections

The process involved understanding the various infection patterns that affect an inpatient. Since this is a multi-disciplinary clinical activity, such a project entails the involvement of microbiologists, lab teams, doctors from various clinical specialties and pharmacologists. “These stakeholders play a key role in promoting appropriate practice for prevention of such infections. So we wanted to equip these multiple stakeholders with powerful big data analytics to enhance their ability to define both preventive as well as prescriptive treatment patterns and ensure that the patient’s well-being is maintained,” states Sivaramakrishnan. The hospital took all the diagnostics results, patient conditions and other relevant clinical information and created analytics models out of it.


Experts share their cybersecurity horror stories

Cybersecurity experts warn that large-scale, coordinated cyber-strikes targeted at essential infrastructure, like last week's Dyn DDoS attack, could cost the economy billions of dollars in lost productivity and potentially harm individuals. ... When companies are attacked, TechRepublic ordinarily advises them to follow damage-mitigation best practices. In the spirit of Halloween, however, let your fears run wild with these hacking horror stories. ... Car hacking has been demonstrated. Shutting down power to a hospital can threaten lives. Network-connected healthcare devices can be misused. IoT is a new frontier with new risks - the things we're putting on the internet range from convenience devices for comfort and lighting to life-sustaining devices like pacemakers and other medical implants.


The Internet of Things Industry Failed Us

The most frustrating part of the recent DDoS attack is that IoT manufacturers only needed to look at 30 years of consumer technology to see the proverbial writing on the wall. And if they couldn't do that, they could have heeded the warnings spouted by security researchers (corporate and hobbyist hacker alike). These people have told anyone who would listen how putting billions more devices on the Internet without careful consideration of how they will be used is a bad idea. In 2014, Dan Geer opened the Black Hat conference by saying that the IoT is already upon us and could lead to trouble.


Important Tips for Updating Your Breach Response Plan

The "set it and forget it" approach may be great for a thermostat, but breach response plans should never be left on autopilot. Modern hackers are often highly educated with extensive experience and top-notch skills. Furthermore, many hackers work for their governments or corporations, giving them access to the latest technologies. Hackers have become increasingly adept at finding vulnerabilities that they can exploit, the Heartbleed vulnerability being just one example. Given that payouts are huge, cyber-criminals are extremely persistent at finding a way into secure networks. With the growing threat level, increasing regulations, evolving technologies and changing motives, it has become increasingly important to update breach response plans frequently.


Actionable Agile Tools

Do you often hear things like “That is a typical (insert person's name here) job” or “only (insert person's name here) knows about (insert subsystem or component name here)”? This is an all-too-common issue in IT companies, and a seriously dangerous situation to be in, especially in the modern age, when people no longer stay at companies for long. All companies know this, and all talk about how they need to start doing something about it. But very few ever actually do until that person finally announces they are leaving; then they make do with a brief handover period and muddle through without them until the next person becomes an expert in that area, and the process repeats all over again.



Quote for the day:


"Present solutions. Minimize waste. Engage willingly." -- S. Chris Edmonds

October 29, 2016

Tech Bytes - Daily Digest: October 29, 2016

Former NSA leader talks Snowden & the future of infosec, How big data can improve student performance and learning approaches, What is data quality & how do you measure it for best results, How economists view the rise of Artificial Intelligence, Companies complacent about data breach preparedness, How much does a data breach cost and more.

19 psychological tricks that will help you ace a job interview

So if the hiring manager offers you some flexibility in choosing an interview time, ask if you could come in around 10:30 a.m. on a Tuesday. That's likely when your interviewer is relatively relaxed. In general, you should avoid early-morning meetings because your interviewer may still be preoccupied with everything they need to get done that day. You'll also want to avoid being the last meeting of the workday, as your interviewer may already be thinking about what they need to accomplish at home. ... Twenty-three percent of interviewers recommended wearing blue, which suggests that the candidate is a team player, while 15% recommended black, which suggests leadership potential. Meanwhile, 25% said orange is the worst color to wear, as it suggests that the candidate is unprofessional.


Former NSA leader talks Snowden and the future of infosec

When it was put to Inglis that Snowden might be viewed as a whistle-blower acting with the intent to take a stand on the right of citizens to data privacy, Inglis said: “I don’t think he thought that. Whistle-blowers should be formally supported and within the US system they are. You have the right and authority to take [your concerns] to some other places … Snowden did none of that – he made no complaints to anyone ... [He] recklessly released information that had nothing to do with the protection of privacy.” Snowden helped to “fill the vacuum of information [about how the NSA works],” he said, and “a lot of the cost was a vilification of the NSA”.


Uber’s New Goal: Flying Cars in Less Than a Decade

In fact, Uber reckons that the technology for these kinds of vehicles will mature within five years. Google cofounder Larry Page seems to agree: earlier this year he invested in two flying-car companies. But there are still some significant wrinkles that need to be ironed out before that happens, which make the five-year time frame seem overly optimistic. To be fair, Uber realizes there are hurdles. In its white paper, Uber lists a number of issues it’s worried about (deep breath): battery technology, vehicle efficiency, vehicle performance and reliability, cost and affordability, safety, aircraft noise, emissions, takeoff and landing infrastructure, pilot training, air traffic control, and the certification process.


How Big Data Can Improve Student Performance And Learning Approaches

The data-based approach periodically tracks an individual student’s performance by using indicators such as: prior knowledge, level of academic ability, and individual interests. What this approach achieves is that it allows for personalized learning where the students can actively learn at their own pace. Furthermore, educators can provide their support, tools, and assistance to those students who need their attention in the classroom. One of the highly regarded platforms that features personalized courses and exercises is Khan Academy. Aside from the fact that it can be used by students and parents, this platform also allows teachers to provide individualized video tutorials and practices in many different subjects, predominantly math. Specifically, teachers can modify tutorials and playlists and recommend certain videos and exercises to students.


China’s Baidu to open-source its deep learning AI platform

The announcement follows the open-sourcing in the last two years of other machine intelligence and deep learning tools such as Torch and machine-vision technology from Facebook, TensorFlow from Google, Computation Network Tool Kit (CNTK) from Microsoft and DSSTNE from Amazon.com, as well as independent open source frameworks such as Caffe. Baidu also has open-sourced other pieces of its AI code. But Xu Wei, the Baidu distinguished scientist who led PaddlePaddle’s development, said this software is intended for broader use even by programmers who aren’t experts in deep learning, which involves painstaking training of software models. “You don’t need to be an expert to quickly apply this to your project,” Xu said in an interview. “You don’t worry about writing math formulas or how to handle data tasks.”


What is Data Quality and How Do You Measure It for Best Results?

What do we do when we find errors or issues? Typically, you can do one of four things: Accept the Error – If it falls within an acceptable standard (i.e. Main Street instead of Main St) you can decide to accept it and move on to the next entry; Reject the Error – Sometimes, particularly with data imports, the information is so severely damaged or incorrect that it would be better to simply delete the entry altogether than try to correct it; Correct the Error – Misspellings of customer names are a common error that can easily be corrected. If there are variations on a name, you can set one as the “Master” and keep the data consolidated and correct across all the databases; and Create a Default Value – If you don’t know the value, it can be better to have something there than nothing at all.
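Those four options map naturally onto a small cleaning routine. A minimal sketch, assuming invented lookup tables for accepted variants and known corrections:

```python
# Hypothetical lookup tables for illustration only.
ACCEPTED_VARIANTS = {"Main St": "Main Street"}   # tolerated as-is
CORRECTIONS = {"Jon Smyth": "John Smith"}        # mapped to a master value
DEFAULT_COUNTRY = "Unknown"

def clean_record(record):
    """Return (record, action), or (None, 'reject') for an unusable row."""
    # Reject the error: severely damaged entries are dropped entirely.
    if not record.get("name"):
        return None, "reject"
    # Correct the error: map known misspellings to the master value.
    if record["name"] in CORRECTIONS:
        record["name"] = CORRECTIONS[record["name"]]
        return record, "correct"
    # Accept the error: a tolerated variant passes through unchanged.
    if record.get("street") in ACCEPTED_VARIANTS:
        return record, "accept"
    # Create a default value: fill a missing field rather than leave a hole.
    if not record.get("country"):
        record["country"] = DEFAULT_COUNTRY
        return record, "default"
    return record, "ok"
```

In practice the action taken per field would be driven by your data quality standard, not hard-coded as here.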


How Economists View the Rise of Artificial Intelligence

“Economists think of technology as drops in the cost of particular things,” Agrawal said. Likewise, the advent of calculators or rudimentary computers lowered the cost for people to perform basic arithmetic, which aided workers at the census bureau who previously slaved away for hours manually crunching data without the help of those tools. Similarly, with the rise of digital cameras, improvements in software and hardware helped manufacturers run better internal calculations within the device that could help users capture and improve their digital photos. Researchers essentially applied calculations to the old-school field of photography, something previous generations probably never believed would be touched by math, he explained.


Companies complacent about data breach preparedness

Even as organizations are paying more attention to data breach preparedness, most aren't giving it the attention needed to execute their plans successfully when the time comes. Ponemon found that 38 percent of organizations have no set time period for reviewing and updating their plan and 29 percent have not reviewed or updated their plan since it was first put in place. Only 27 percent of organizations surveyed felt confident in their ability to minimize the financial and reputational consequences of a breach, and 31 percent lacked confidence in dealing with an international incident. For instance, in April, Symantec released its 2016 Internet Security Threat Report, which found that ransomware increased by 35 percent in 2015.


Protection is dead. Long live detection.

All too often, detection is an afterthought. A lot of planning and money go toward hardening protections, and then an intrusion detection system or a security information and event monitoring system is tacked on. It’s not enough. Detection strategy and architecture have to be the equal of protection strategy and architecture. If most organizations were already treating protection and detection equally, attackers would not be spending an average of 200 days inside target systems or networks before being detected. More than six months is plenty of time for adversaries to fully achieve their goals, plus explore, define new goals and find new targets. Don’t misunderstand. As essential as detection is, it is not necessarily a fail-safe. But the sooner a breach is detected, the sooner you can mount a defense and stop adversaries from achieving their goal, or at least minimize the damage.


How much does a data breach actually cost?

Knowing these numbers gives one a sense of how to measure their relative size. Because when it comes to measuring the cost of a data breach, size matters. It’s intuitive and true—the more records lost, the higher the cost. According to the same Ponemon study, the average cost of a data breach involving fewer than 10,000 records was nearly $5 million, while a breach of more than 50,000 records had an average cost of $13 million. Reviewing the numbers, it’s clear data breaches are a real and growing financial threat to businesses. The good news is it is a cost that can be avoided with a proactive investment in cybersecurity measures. Knowing the potential and average cost also gives business owners an idea of how much to budget to secure their information.



Quote for the day:


"How we think shows through in how we act. Attitudes are mirrors of the mind. They reflect thinking." -- David Joseph Schwartz


October 28, 2016

Tech Bytes - Daily Digest: October 28, 2016

Businesses shouldn't let security scares put them off IoT, Align intelligence UX & data to power exceptional customer moments, Focus your agile retrospectives on the learnings, IoT growing faster than the ability to defend it, The limits of encryption, Using smart city technology to power local economic development and more.

Businesses shouldn’t let security scares put them off IoT

Often, the hardware involved has more in common with a PC than with a cheap and cheerful consumer device, such as Dell’s Edge Gateway products, for example. The upshot of this is that they can be managed by the IT department using similar admin tools to the rest of the IT infrastructure, and also support many of the same security and monitoring tools. This is not to say that enterprises should be complacent about security, but that there are other things that should be of greater concern than worries about an IoT deployment introducing new security vulnerabilities to the corporate network. If anything, last week’s attack should have been a wake-up call to how exposed businesses might become if they rely heavily on internet-based services such as those delivered from public clouds.


Align intelligence, UX and data to power exceptional customer moments

Today, our most advanced applications are intelligent. Look no further than IBM Watson or Salesforce Einstein A.I. Bluewolf's recent The State of Salesforce Report showed that over half of companies surveyed described their most essential applications as at least somewhat intelligent already, able to anticipate and either take or suggest the next action. Increasing investments in intelligent applications is one key element to driving business results, but that alone is not enough. Companies must also invest in their employee and customer experience, and focus on translating their overwhelming collections of data into intuitive, automated employee experiences that, in turn, can power incredible customer moments.


Focus your Agile Retrospectives on the Learnings

Agile Retrospectives are the cornerstones of any inspect and adapt cycle. Even though teams should not limit their learning to Agile Retrospectives, they are quite commonly the place where most of the learning happens. This is because they are a common place for data mining, whereby the team collects information about what happened during the sprint and is able to identify challenges. As a result of all of the learning that takes place during these sessions, teams arrange new ways of working in order to avoid default thinking patterns. Last week, while working with a Scrum Master and helping her with an Agile Retrospective, I realized something interesting: if we focus on the learning instead of the outcome, Agile Retrospectives will always be successful.


When IoTs Become BOTs, The Dark Side of Connectedness

The compromised IoT devices all appear to be built using the Swiss Army knife of Embedded Linux, BusyBox, and as such might not be readily patchable. Most of these IoT devices are webcams, smart DVRs, and home routers, but they are just the tip of the 1.2 million device iceberg that is the Mirai Botnet. To put this number in perspective, the current active duty strength of the US Armed Forces is nearly the same, 1.28 million. Imagine all of our active duty military sitting at keyboards running programs to attack a single website; that’s the power that “Anna_Senpai”, the single person behind Mirai, wields. By contrast, Mirai isn’t the largest botnet we’ve ever seen, as others like Conficker or Cutwail were larger, but it is the first one built entirely of IoT devices.


IoT Growing Faster Than the Ability to Defend It

The IoT is expanding faster than device makers’ interest in cybersecurity. In a report released Monday by the National Cyber Security Alliance and ESET, only half of the 15,527 consumers surveyed said that concerns about the cybersecurity of an IoT device have discouraged them from buying one. Slightly more than half of those surveyed said they own up to three devices—in addition to their computers and smartphones—that connect to their home routers, with another 22 percent having between four and 10 additional connected devices. Yet 43 percent of respondents reported either not having changed their default router passwords or not being sure if they had. Also, some devices’ passwords are difficult to change and others have permanent passwords coded in.


The Limits Of Encryption

It’s a simple point that many people haven’t grasped. Encryption can protect the contents of an email message, but it can’t hide who sent the message and who received it. That can be valuable information. Say that law enforcement officials are interested in a particular encrypted email that a suspect sent. If they can learn from the suspect’s carrier who the recipient was, they might be able to seize that person’s phone and read the message free of encryption. No muss and no fuss. As for metadata, it can show times, dates and even location. So, despite Apple proudly declaring that it protects its customers’ data no matter what, it is still giving the government a lot of information “thousands of times every month.”


7 Tech Nightmares Haunting IT Pros This Halloween

"The challenges these IT decision makers face each day are truly daunting," said Sabrina Horn, managing partner and technology practice lead at Finn Partners, in a prepared statement. "From aging technology infrastructures, to cybersecurity threats, to the need to keep up with the latest innovations, it's no wonder we received a lot of scary, uncertain opinions about what lies ahead. But these findings also highlight the need for technology providers to better communicate the business outcomes they deliver, making it a little less uncertain for everyone." Finn Partners surveyed 511 US-based IT decision-makers between Sept. 6 and Sept. 13, 2016. Respondents to the survey identified themselves as senior employees with decision-making influence in one or more of the following areas:


Using smart city technology to power local economic development

Fundamentally, smart cities use technology and process innovation to improve the quality of life for all stakeholders within a community. One could make the case that thanks to broad adoption of smartphone technology and broadband wireless, most cities are already ‘smart.’ However, so far, it is the private sector that is leading. City management is responding rather than proactively initiating a coherent strategy for harnessing smart technology in a way that improves quality of life for residents and visitors. It is an incredibly exciting time as a number of social, cultural, geo-political and technological factors are converging to drive a tremendous amount of innovation in this space.


Experts on AI: Robotics Professor from Carnegie Mellon

The problem is not with AI but with humans who may misuse or abuse the technology. We’ve already seen the situation where AI has given the NSA and others the power to monitor and analyse our communications. You could say this invades our privacy and violates the Constitution, or you could say it protects us from terrorists. It’s up to us to decide how to use that power. Another ethical issue we should be thinking about is how computational biology is using AI to create designer babies; AI techniques are helping create the tools to make this happen. Who wouldn’t opt for a perfect, healthy child? But if you eliminate naturally occurring diversity, what might the consequences be?


Internet Providers Could Be the Key to Securing All the IoT Devices Already out There

There are two main ways that ISPs could contribute to IoT security. The first is by blocking or filtering malicious traffic driven by malware in known patterns. For example, some ISPs use a standard called BCP38 to reduce spoofing, the process used by attackers to transmit network packets with fake sender addresses. Protecting against spoofing can negate many of the strategies that allow for assaults like the one on Dyn, but it’s taken years to get the majority of ISPs to adopt the standard—and some still don’t because of the cost of installing and maintaining the filters. The second thing ISPs could do is notify customers—whether big corporate clients or individuals—if a device on their network is sending or receiving malicious traffic.
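The core idea of BCP38-style ingress filtering can be shown in a few lines: drop any packet whose source address falls outside the prefixes assigned to the link it arrived on. A sketch using made-up customer prefixes (real filtering happens in router hardware, not application code):

```python
import ipaddress

# Hypothetical prefixes assigned to one customer link. Under ingress
# filtering, traffic entering from that link must carry a source
# address inside one of these prefixes.
CUSTOMER_PREFIXES = [
    ipaddress.ip_network("203.0.113.0/24"),
    ipaddress.ip_network("198.51.100.0/25"),
]

def permit_source(src_ip):
    """Return True if the packet's source address could legitimately
    originate from this link; False means it is likely spoofed."""
    addr = ipaddress.ip_address(src_ip)
    return any(addr in net for net in CUSTOMER_PREFIXES)

print(permit_source("203.0.113.7"))  # inside the assigned space
print(permit_source("192.0.2.99"))   # outside: dropped as spoofed
```

Because a spoofed packet claiming an outside source address never leaves the provider's edge, reflection and amplification attacks that rely on fake sender addresses lose their ammunition.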



Quote for the day:


"Great thoughts speak only to the thoughtful mind, but great actions speak to all mankind." -- Theodore Roosevelt


October 27, 2016

Tech Bytes - Daily Digest: October 27, 2016

Dealing with multiple service providers - A necessary evil, Can fintech prevent the next financial crisis, The difference between open source & open governance, 5 strategies to reboot your IT career, A quick primer on isolation levels & dirty reads, Residential routers easy to hack and more.

How IoT technologies are disrupting the aerospace and defence status quo

While current solutions only permit the airborne transfer of data for key vital parameters to maintenance crews, expanding this remit would allow them to determine the continual status and performance of individual parts and components within the engines, systems, and subsystems across the wider aircraft. This continuous visibility of the aircraft’s performance is crucial. If, for example, one of the engine vitals fails mid-air, a standby system would kick in and run all of the necessary functions to enable it to complete its journey safely. An alert would then be sent to the ground staff, who could use the real-time information to determine the cause of the failure, before engaging the necessary personnel and sourcing the components required to get the aircraft back up and running as soon as it lands.


Dealing with multiple service providers: A necessary evil

If dealing with an ever-expanding IT ecosystem is a mandate for enterprises, then developing the organizational maturity and capability of integrating and managing services purchased from disparate and specialized vendors is a necessary part of it. This means automating multi-vendor governance capabilities and leveraging tools and processes that help integrate the delivery and management of services from an end-to-end perspective. The fast-developing ecosystem proffers a strategic choice: to buy services (outsource to a third party) or to build services (develop in-house capability and implement within the enterprise). And, at the risk of stating the obvious, there’s no one-size-fits-all answer.


Can Fintech Prevent The Next Financial Crisis?

Under the current system, bankers do not risk their own money; rather, the risk is entirely on their savers aka the bank’s depositors. Under extreme circumstances, the government may be required to foot the bill if and when things turn sour at the bank. As for the bankers themselves, they have very little at stake; in fact, their willingness to take risks (with their depositors’ funds, of course) often leads to lucrative bonuses. At no time do bankers risk their own savings or pensions. And that’s the real problem; how can professionals be expected to take low risk on behalf of others when they have so much to gain and so little to lose? We can’t expect them to take the high road; indeed, the sub-prime crisis proves that. So how exactly will P2P lending make a difference?


The difference between open source and open governance

In the open source domain, the only two non-functional things that matter in the long term are whether the software is open source and whether it has attained momentum in the community and industry. None of this relates to how the software is being written, but that is exactly what open governance is concerned with: the how. Open source governance is the policy that promotes a democratic approach to participating in the development and strategic direction of a specific open source project. It is an effective strategy to attract developers and IT industry players to a single open source project with the objective of attaining momentum faster. It looks to avoid community fragmentation and ensure the commitment of IT industry players.


Ransomware: The Next Big Automotive Cybersecurity Threat?

“The current ransomware business model works well because the attackers ensure that the price paid is well worth the data restored,” explained Tony Lee, technical director at security research firm FireEye. “Can home users put a price on precious family photos or financial documents? Can organizations put a price on critical information necessary to conduct business? If that answer is yes and the price is low enough, the ransom will be paid.” The same rationale can be extended to vehicles. Approximately 250 million connected cars are expected to be on roads worldwide by 2020, according to a 2015 analysis by technology consulting firm Gartner, making connected cars the next potential market for hackers. These attacks could range from simply locking motorists out of their vehicles to locking them inside; a more ominous scenario would allow hackers to freeze the ignition, essentially “bricking” the car and making it completely unusable.


5 strategies to reboot your IT career

Technology changes faster than many of us can keep up with it. New paradigms like software-defined networks and the cloud emerge, and the old ones continue to hang around. But while the hotshot programmers and big data geeks get to play with the shiny new toys, you're busy waiting for the robots to come and take away your job. ... It doesn't have to be that way. Whether you cut your teeth on Unix and AIX or you tire of doing the necessary but thankless tasks that come with keeping the lights on and the datacenter humming, there's still time to reinvent yourself. It won't be fast or easy. It will mean investing a lot of time and possibly some money, taking risks, and hacking code. But it can turn into a much greater reward, both financially and psychically.


A Quick Primer on Isolation Levels and Dirty Reads

If you need to repeat the same read multiple times during a transaction, and want to be reasonably certain that it always returns the same value, you need to hold a read lock for the entire duration. This is automatically done for you when using the Repeatable Reads isolation level. We say “reasonably certain” for Repeatable Reads because of the possibility of “phantom reads”. A phantom read can occur when you perform a query using a where clause such as “WHERE Status = 1”. Those rows will be locked, but nothing prevents a new row matching the criteria from being added. The term "phantom" applies to the rows that appear the second time the query is executed. To be absolutely certain that two reads in the same transaction return the same data, you can use the Serializable isolation level.


Residential routers easy to hack

Weak passwords can be easily exploited: fourteen percent of simulated attacks on the routers were, in fact, successful. The probing attack methodology was simply to use common default usernames and passwords, along with some frequently used combinations. Telnet was left open on 20 percent of the routers, and command injection vulnerabilities were also caught. Telnet, as an unsecured service, shouldn’t be openly available even to a local network, ESET explains. Command injection vulnerabilities “aim for the execution of arbitrary commands on the host operating system” through a vulnerable application, the security company says; proper input validation fixes the deficiency. Of the 7 percent of these now-common household devices with software vulnerabilities, about half (53 percent) had “bad access rights vulnerabilities”, or permissions problems, in other words.
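The command-injection class of flaw, and the input-validation fix, can be illustrated with a hypothetical ping wrapper; this is an invented example, not code from any actual router firmware:

```python
import re

def build_ping_command_unsafe(host):
    # VULNERABLE pattern: interpolating user input into a shell string lets
    # an input like "8.8.8.8; rm -rf /" smuggle in a second command.
    return "ping -c 1 " + host

HOSTNAME_RE = re.compile(r"[A-Za-z0-9.\-]{1,253}")

def build_ping_command_safe(host):
    """Validate the input, then return an argument list (no shell involved)."""
    if not HOSTNAME_RE.fullmatch(host):
        raise ValueError("invalid hostname")
    # Passing a list (not a string) to subprocess.run() bypasses the shell
    # entirely, so metacharacters in `host` are never interpreted as commands.
    return ["ping", "-c", "1", host]
```

The two defenses are independent and complementary: the allow-list regex rejects shell metacharacters outright, and the argument-list form ensures that even an unexpected input is only ever treated as data.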


Can government-funded innovation solve the cyber security threat?

Expecting the federal government to produce solutions is hopeful at best and woefully naive at worst, though that isn’t to say that it can’t somehow play a part. Even if it can’t actually develop the technologies necessary to compete in this new battle arena, it can still fund innovative R&D that can be developed into the next generation of defense infrastructure. This can be achieved through the Small Business Innovation Research (SBIR) program, a highly competitive research initiative through which domestic small businesses respond to federally specified R&D requirements with commercial applications. Awards are distributed in two phases, first for feasibility and proof of concept of the product, and then for further development and commercialization.


Five Questions General Counsels Should Ask About Privacy and Cybersecurity in Third-Party Contracts

Regulators are cultivating an ever-increasing patchwork of data protection laws and regulations. Because third parties may host and process data in various locations around the world, companies must keep abreast of constantly evolving developments in global data protection laws and regulations, including data localization laws and data transfer regulations. Compliance failures may subject a company to considerable fines and penalties (e.g., the EU General Data Protection Regulation, effective in May 2018, will allow penalties of up to four percent of worldwide revenues for compliance failures). In addition, data localization laws, which require that data must remain in the country, are emerging. For example, Russia has such a law, and others have been proposed in Indonesia and China.



Quote for the day:


"Without Simplicity and Transparency, you could become a Happy Underachiever." -- @GordenTredgold


October 26, 2016

Tech Bytes - Daily Digest: October 26, 2016

Advanced use cases for repository pattern in .NET, Everything we know about the great Indian debit card hacking, Integrating hotel systems can create hacking liabilities, Best practices for securing your data in motion, Cyber security staffing issues may be putting you at risk and more.

7 Deadly Sins of Project Management You Should Never Commit

The biggest blunder that can derail your project is selecting the wrong person as your Project Manager. According to American Eagle Group data, around 80% of Project Managers lack formal training, which is one of the major reasons why 55% of projects fail. On the other hand, a Standish Group CHAOS report revealed that Project Managers equipped with formal training have a success rate of more than 70%. This goes to show the importance of trained Project Managers and how it could increase your chances of completing your projects on time and within the budget. Select a Project Manager whose experience and skills coincide with your project management requirements.


Advanced Use Cases for the Repository Pattern in .NET

When designing a repository, you should be thinking in terms of “what must happen”. For example, let us say you have a rule that whenever a record is updated, its “LastModifiedBy” column must be set to the current user. Rather than trying to remember to update the LastModifiedBy in application code before every save, you can bake that functionality right into the repository. ...  Normally repositories are context free, meaning they have no information other than what’s absolutely necessary to connect to the database. When correctly designed, the repository can be entirely stateless, allowing you to share one instance across the whole application. Context aware repositories are a bit more complex. They cannot be constructed until you know the context, which at the very least includes the currently active user’s id or key. For some applications, this is enough.
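Although the article discusses .NET, the "bake it into the repository" rule is language-agnostic. A minimal Python sketch of a context-aware repository (the record shape and user id are invented), where every save stamps last_modified_by so application code cannot forget:

```python
class InMemoryRepository:
    """Context-aware repository: constructed with the current user's id,
    which it stamps onto every saved record."""

    def __init__(self, current_user_id):
        self._current_user_id = current_user_id
        self._rows = {}

    def save(self, record_id, record):
        # The "what must happen" rule lives here, not in application code:
        # every write gets last_modified_by set, with no way to skip it.
        record = dict(record, last_modified_by=self._current_user_id)
        self._rows[record_id] = record
        return record

    def get(self, record_id):
        return self._rows.get(record_id)

repo = InMemoryRepository(current_user_id="alice")
repo.save(1, {"title": "Q4 report"})
print(repo.get(1))  # {'title': 'Q4 report', 'last_modified_by': 'alice'}
```

Because the user id is supplied at construction time, this is the context-aware variant the article describes: one instance per request or session, rather than a single stateless instance shared application-wide.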


Everything we know about the great Indian debit card hacking

The data breach happened in August and September, according to the Mint newspaper. But the banks apparently weren’t aware, several bankers told Mint. Those involved include bank customers, 19 Indian banks, the NPCI, Hitachi Payment Services, Mastercard, Visa, and RuPay, yet all of them are shirking responsibility for the mess. Most banks, including SBI, HDFC Bank, and ICICI Bank, have said their systems are safe. The platforms these banks use for debit cards, Mastercard, Visa, and RuPay, have also washed their hands of the crisis. Hitachi Payment Services, which managed Yes Bank’s ATMs, said that an initial review “does not suggest any breach/compromise.”


Integrating hotel systems can create hacking liabilities

Integration. It’s one of the industry’s biggest buzzwords for streamlining operations. With everything on property collecting data and providing options for interaction, wouldn’t it be nice if every device collaborated? It’s the dream of many operators to have a property that is running fully in sync, but Shaun Murphy, communications security expert, inventor, CEO and co-founder of communications app SNDR, said the persistent threat of data breaches may be reason enough to question which devices on property are working in tandem. “During a breach, the worst-case scenario is that all your systems are integrated,” Murphy said. “From your point of sale to your soda machine, at that point you are losing not only financial information, which you have to disclose, but other confidential information as well.”


How Big Data Is Changing Recruitment Forever

Dana Landis, vice president of global talent assessment and analytics at Korn Ferry, said “When you’re talking about big data you’re talking assessing millions of people all over the world, so you need self-assessment. We’ve designed our tools to take out a lot of the problematic aspects of that – instead of being able to rate yourself high on all the good things and low on all the things that sound bad, you’re forced to make really difficult decisions based on ranking and prioritizing your skills.” Moving their assessment process to an online, self-assessment model has greatly increased the volume of candidates that Korn Ferry has been able to assess. This further increases the size of the dataset used to measure candidates’ suitability. By comparing their individual profiles against amalgamated profile data from people who have proven themselves successful in similar job roles, a more accurate picture of the skills a person will need to succeed in a particular role emerges.


Best practices for securing your data in-motion

Data in-motion has to contend with human error, network failures, insecure file sharing, malicious actions and more. In today’s economy, almost every business has data that needs to be transferred outside protected business applications and systems to enable collaboration between co-workers, users, systems, partners and more – so simply not letting data be shared is not an option. To remediate the security risk inherent in sending data outside of your walls, companies must accept the reality of data insecurity in-motion and take proactive steps to prevent an expensive and embarrassing data breach. The first step is to accept that your company data, including sensitive data, is being sent insecurely via shadow IT. When IT isn’t involved with how data is being transferred, there are critical disadvantages, which often trigger other serious issues.
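One concrete proactive step the article alludes to is encrypting payloads before they leave the application boundary. The sketch below is a minimal, hedged illustration using AES-GCM from the standard Java crypto API; it assumes the sender and receiver share a symmetric key out of band, which in a real deployment would instead come from TLS or a managed key service. The class and method names are invented for this example.

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

// Illustrative sketch: authenticated encryption for data in motion.
class SecureTransfer {
    private static final int GCM_TAG_BITS = 128; // authentication tag length
    private static final int IV_BYTES = 12;      // recommended GCM nonce size

    static byte[] encrypt(SecretKey key, byte[] iv, byte[] plaintext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
        return cipher.doFinal(plaintext); // ciphertext with appended auth tag
    }

    static byte[] decrypt(SecretKey key, byte[] iv, byte[] ciphertext) throws Exception {
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
        return cipher.doFinal(ciphertext); // throws if the data was tampered with
    }

    // Simulates sender-side encryption and receiver-side decryption.
    static String roundTrip(String message) throws Exception {
        SecretKey key = KeyGenerator.getInstance("AES").generateKey();
        byte[] iv = new byte[IV_BYTES];
        new SecureRandom().nextBytes(iv); // a fresh nonce per message
        byte[] wire = encrypt(key, iv, message.getBytes(StandardCharsets.UTF_8));
        return new String(decrypt(key, iv, wire), StandardCharsets.UTF_8);
    }
}
```

Because GCM is authenticated encryption, tampering with the bytes in transit makes decryption fail outright, which addresses both the confidentiality and the integrity half of the data-in-motion risk.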


Intel wants to make its IoT chips see, think, and act

Intel is working to help machines evolve from accurately sensing what’s going on around them to acting on those senses. For example, if a device can see defective parts going through an assembly line, it can alert someone or even stop the line. Cameras in cars could see that the driver is drowsy and set off an alarm in the car, and ones pointed in front of the vehicle could tell a pedestrian from a shadow and stop the car – if their vision were accurate enough. ... The new chips are also better at capturing and processing images. They have four vector image processing units to perform video noise reduction, improve low-light image quality, and preserve more color and detail. In a networked video recorder, an E3900 could take 1080p video streams from 15 cameras and display their feeds simultaneously at 30 frames per second on a video wall, Caviasca said.


Agile Manufacturing: Not the Oxymoron you Might Think

Industry 4.0, digital manufacturing, agile manufacturing, “digital thread”—these are all terms that describe the way we are making some things now and will make almost everything in the future. ... Digital manufacturers are organizing from an outside-in mindset that starts with the customer, and looks to deliver creatively on market opportunities, whatever they happen to be, however they will be delivered, and whoever will deliver them. Profits are seen as the consequence of providing value to customers, not the goal of the firm.  Soon, when you walk into your mechanic’s shop to replace a broken fender, he will not need to order the replacement part from overseas and call you back in three weeks. He will take some measurements, step to an attached room with a 3D printer and make your new fender on the spot, revised to attach more firmly and with accent trim to update the style.


Cybersecurity staffing issues may be putting you at risk

Chances are you already have future security pros within your own ranks -- it would stand to reason that businesses have turned to internal talent to find cybersecurity experts. But, according to the data from Spiceworks, that's not necessarily the case. When asked how willing they would be to invest in IT training for 2016, 57 percent said they were "somewhat open, but it would take some convincing," while only 6 percent said they were "extremely open" and had already made investments in training. "Smart people within your own ranks have the huge advantage of already knowing the context of the enterprise to be protected. By using in-house staff, you can save on the time it takes to teach them the context of the enterprise," says Ryan Hohimer, co-founder and CTO of DarkLight Cyber.


The QA Success Story: Where Business and Technology meet

Technology is playing an ever-increasing role in the business cycle – influencing buying decisions, transacting through online platforms, integrating with payment channels, collaborating with partners in co-creating and delivering products / services, and being evaluated by the customer across multiple touch points. The exceptional visibility of technology across customers, partners and stakeholders has brought greater focus onto non-functional user experience dimensions – usability, performance, security, interoperability, and response times. The ability of technology to disintermediate and bring businesses closer to the customer is seeing an explosion in platforms targeting the Cloud, leveraging Social Media and Analytics and delivering services on the Mobile.



Quote for the day:


"Cyber criminals are getting more sophisticated and realizing that small businesses are easy targets." -- Mark Berven