Daily Tech Digest - April 26, 2017

Does the IT Industry Need Better Naming?

Almost every software team has members titled quality assurance (QA) engineers. Their role is mainly to understand the specifications and, based on them, define a set of test cases to validate the product and detect possible flaws. If we look up what QA and QC actually mean, Merriam-Webster defines quality control (QC) as "an aggregate of activities (such as design analysis and inspection for defects) designed to ensure adequate quality especially in manufactured products", whereas quality assurance (QA) is "a program for the systematic monitoring and evaluation of the various aspects of a project, service, or facility to ensure that standards of quality are being met". By these definitions, the people embedded in software development teams who define test cases and validate the product are really QC engineers. This mismatch can cause problems.


7 Patch Management Practices Guaranteed To Help Protect Your Data

You can’t secure what you don’t know about. The only way to know whether a breach or vulnerability exists is to employ broad discovery capabilities. A proper discovery service combines active and passive discovery features with the ability to identify physical, virtual, and on- and off-premises systems that access your network. Maintaining a current inventory of production systems, including everything from IP addresses to OS types, versions, and physical locations, keeps your patch management efforts up to date, so it’s important to inventory your network on a regular basis. If one computer in the environment misses a patch, it can threaten the stability of them all, even curbing normal functionality.
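The inventory the article describes can start as simply as one structured record per asset plus a staleness check. A minimal, hypothetical sketch in Python (the `Asset` fields and the 30-day window are illustrative assumptions, not anything the article prescribes):

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical minimal inventory record; a real discovery service would
# populate these fields from active scans and passive network monitoring.
@dataclass
class Asset:
    ip: str
    os_name: str
    os_version: str
    last_patched: date

def stale_assets(inventory, max_age_days=30, today=None):
    """Return assets whose last patch date is older than the allowed window."""
    today = today or date.today()
    cutoff = today - timedelta(days=max_age_days)
    return [a for a in inventory if a.last_patched < cutoff]

inventory = [
    Asset("10.0.0.5", "Windows Server", "2016", date(2017, 4, 1)),
    Asset("10.0.0.9", "Ubuntu", "16.04", date(2017, 1, 15)),
]
overdue = stale_assets(inventory, max_age_days=30, today=date(2017, 4, 26))
for a in overdue:
    print(f"{a.ip} ({a.os_name} {a.os_version}) last patched {a.last_patched}")
```

Run regularly, even a toy report like this surfaces the one machine whose missed patch could threaten the rest.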


Building An App-Centric Infrastructure Performance Monitoring System

Achieving synergy between applications and infrastructure is more than just blending disparate management regimes. A functioning application-centric environment requires enterprise executives to make changes to their current ecosystem on both a systems and an operational level. This can be difficult for organizations that maintain substantial legacy infrastructure geared toward conventional data workloads. One of the first things to do is to stop depending on silo-specific tools. When application requirements were fairly predictable, it was common for organizations to provision infrastructure to support the most demanding circumstances, even if that resulted in over-provisioned resources that would sit idle for long periods. This also often led to isolated application and infrastructure environments within the datacenter ecosystem as solutions were crafted to solve unique challenges at particular times.


Lessons from the Field: The Adaptable Business Architect

Business architects (BAs) have to interact with so many different stakeholders that staying out of turf wars can be difficult. Strategy development teams may question why you want to hear about their strategies. Business process teams may push back against capability modeling as being redundant with process optimization efforts. Other architecture teams may be challenged by your very existence. And of course consulting firms will pop up everywhere and claim they can do everything. Avoid turf wars at all costs and stay away from decision rights conversations. (See my previous post on being politically savvy.) In most companies, there is plenty of work to get done, so leveraging the time and talents of other teams is crucial to making progress. Get these teams involved, make them part of what you are doing, and help them to see the business outcome you are striving for.


What life after the smartphone will look like

Life after the smartphone will be wondrous. We’ll be amazingly productive. Our faces won’t be lit by glowing screens, our fingers won’t tap out a chaotic symphony. We won’t be strangled by USB charging cables. We'll never have nomophobia. If you’ve read this column lately, you can probably guess that smartphones will be replaced by artificially intelligent bots. They already live among us. Soon, they won’t run on our phones or laptops. They will just run. They will exist in the cloud, at your office, in your car, and everywhere you need help staying productive. First, though, they need to get a lot smarter. A companion bot will follow you constantly -- sometimes literally. You’ll talk to the bot, but simple tasks like asking about the weather or the Golden State Warriors playoff schedule will seem trite.


Top 3 CIO priorities for addressing today’s data deluge

With greater amounts of data come larger challenges in understanding the lineage, quality, and relationships between data from multiple sources and of different types. And CIOs arguably struggle more than ever to effectively manage and analyse data to make it actionable. At one company, the hype about machine learning had executives excited about using proprietary algorithms to gain competitive advantage. A data scientist was hired, told there were years’ worth of data stored in Amazon S3, and tasked with figuring out how to drive innovation with it. Unfortunately, there was no metadata to show where the data came from or how the data lake integrated with the rest of the company’s data. There was also no infrastructure for data analysis, forcing the data scientist to try to find tools compatible with the technology stack and install them.


A Tutorial For Enhancing Your Home DNS Protection

Traditional DNS has inherent weaknesses. With certain types of DNS attacks, an adversary can make you think you are going to a favorite website while redirecting you to a malicious one, perhaps to steal your login info or deliver malicious code. This is another very important reason to use a managed DNS service. There are cautions to consider when selecting a DNS provider, though. Some DNS providers collect information from you in ways that may creep you out. For example, if you select the free DNS service from Google, although there are privacy protections, Google will be aggregating even more data on you and your browsing habits. It is free, offers protection, and is backed by a company with incredible engineers, but you will give up some info you might want kept private.
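To see why the resolver you choose matters, remember that every lookup is just a small packet sent to whatever server you point at, and that server decides the answer. Below is a minimal sketch that hand-builds a DNS A-record query in the RFC 1035 wire format using only the Python standard library; the hostname and resolver address are illustrative:

```python
import secrets
import socket
import struct

def build_dns_query(hostname):
    """Build a minimal DNS A-record query in RFC 1035 wire format."""
    txid = secrets.randbits(16)
    # Header: id, flags (recursion desired), 1 question, 0 other records
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.split(".")
    ) + b"\x00"
    question = qname + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

def resolve(hostname, resolver_ip, timeout=3.0):
    """Send the query over UDP port 53 to an explicitly chosen resolver."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.settimeout(timeout)
        s.sendto(build_dns_query(hostname), (resolver_ip, 53))
        response, _ = s.recvfrom(512)
        return response

pkt = build_dns_query("example.com")
print(f"{len(pkt)}-byte query; whoever receives it decides the answer")
# resolve("example.com", "8.8.8.8") would send it to Google Public DNS
```

Whoever operates the server you send that packet to sees every name you look up, which is exactly the privacy trade-off described above.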


Continuous Integration & Collaboration in Code Repositories for REST API Docs

Writing documentation can be downright boring sometimes, but great documentation is an excellent precursor to increased adoption of an API. Writing excellent documentation is as exacting as writing the code itself. There are syntax errors and unwanted whitespace that you can introduce. Sometimes your ideas simply stop flowing, but you still need to fill in the blanks to make sure your documentation is complete. With the growth of APIs as products, your documentation is more important than ever to creating a successful API. API definitions and documentation go together, yet while API specifications today are increasingly managed as code in GitHub, API docs are not. Let’s change this and make it standard practice to write and manage API documentation, including related websites, in GitHub and similar code repositories.
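A docs-as-code workflow makes the "unwanted whitespace" problem machine-checkable: once docs live in a repository, a CI step can lint every commit. A hypothetical minimal linter (the checks and sample text are illustrative):

```python
def lint_doc(text):
    """Return (line_number, message) pairs for common documentation nits."""
    problems = []
    for n, line in enumerate(text.splitlines(), start=1):
        if line != line.rstrip():
            problems.append((n, "trailing whitespace"))
        if "\t" in line:
            problems.append((n, "tab character (prefer spaces)"))
    return problems

sample = "# Users API\n\nGET /users returns a list. \n\tIndented with a tab\n"
for lineno, msg in lint_doc(sample):
    print(f"line {lineno}: {msg}")
```

Wired into CI (for example, exiting non-zero when `problems` is non-empty), a script like this rejects a pull request before a human reviewer ever sees the whitespace.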


6 Things To Look For In A Cloud Consultant

We all know personal hygiene habits that we’re supposed to have but probably don’t practice consistently (did you really floss three times yesterday?). And there are social behaviors that we really look out for – and probably even judge people on. But when it comes to IT habits, most organizations don’t seem to be screening consultants for key behaviors and policies. This is not a good state of affairs, because IT habits and internal policies make a material difference to the likelihood of project success. The short list of policy issues below should be part of any screening criteria for cloud consultants in general, and Salesforce consultants in particular. Now, it’s not essential that a consultant comply with every item on the checklist below, but wherever policies diverge from these, it’s an opportunity to engage in a healthy conversation … before you sign.


Russian Hackers Use OAuth, Fake Google Apps To Phish Users

Victims who fall for the scheme will be redirected to an actual Google page, which can authorize the hacking group's app to view and manage their email. Users who click “allow” will be handing over what’s known as an OAuth token. Although the OAuth protocol doesn't transfer any password information, it is designed to grant third-party applications access to internet accounts through the use of special tokens. The OAuth protocol may have been designed for convenience, but security experts have warned it can be used to malicious effect. In the case of Fancy Bear, the hacking group has leveraged the protocol to build fake applications that can fool victims into handing over account access, Trend Micro said.
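One practical defense is to actually read the consent URL before clicking "allow": the query string names the application requesting access, the scopes it wants, and where the authorization will be sent. A sketch of pulling those fields out with Python's standard library (the URL is a made-up example modeled on the OAuth 2.0 authorization-request pattern, not the actual Fancy Bear campaign):

```python
from urllib.parse import urlparse, parse_qs

def consent_summary(auth_url):
    """Extract the fields a user should inspect on an OAuth consent screen."""
    query = parse_qs(urlparse(auth_url).query)
    return {
        "client_id": query.get("client_id", ["?"])[0],
        "scopes": query.get("scope", [""])[0].split(),
        "redirect_uri": query.get("redirect_uri", ["?"])[0],
    }

# Hypothetical authorization URL modeled on the OAuth 2.0 pattern
url = ("https://accounts.example.com/o/oauth2/auth"
       "?client_id=totally-not-google-defender"
       "&scope=email%20https://mail.example.com/read"
       "&redirect_uri=https://attacker.example.net/cb"
       "&response_type=code")
print(consent_summary(url))
```

A mail-reading scope combined with a redirect URI on an unfamiliar domain is exactly the pattern a fake app relies on victims not noticing.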



Quote for the day:


"Life is a mystery. You never know which small decision will make the biggest difference." -- @Leadershipfreak


Daily Tech Digest - April 24, 2017

Why Your HR Department Should Embrace Design Thinking

Design succeeds when it finds ideal solutions based on the real needs of real people. In a recent Harvard Business Review article on the evolution of design thinking, Jon Kolko noted, "People need their interactions with technologies and other complex systems to be simple, intuitive, and pleasurable. Design is empathic, and thus implicitly drives a more thoughtful, human approach to business." When done well, human-centered design enhances the user experience at every touch point and fuels the creation of products and services that deeply resonate with customers. Human-centered design is foundational to the success of companies like SAP, Warby Parker, and AirBnB. ... To delight employees, Cisco has identified "moments that matter" -- such as joining the organization, changing jobs, and managing family emergencies -- and redesigned its employee services around these moments.


Healthcare Records For Sale On Dark Web

In addition, the majority of software developers and system administrators are not accustomed to working in an environment containing federally regulated information such as ePHI, Copolitco wrote. Security controls may chafe developers as they have to adjust how they do things. “All companies who have a compliance obligation must remember that the point of HIPAA compliance is to impose a certain level of security,” said Reed. “Security is the ultimate goal, not necessarily compliance. Compliance comes as a result of having a good security program. Being compliant does not mean you are secure; it merely means you have 'checked the boxes.'” An HHS Office for Civil Rights official stated at the recent HIMSS and Healthcare IT News Privacy & Security Forum in Boston that the organization will be conducting on-site audits of hospitals in 2017 and that OCR is engaged in over 200 audits at the moment.


Coffee With a Data Scientist: Avkash Chauhan

No matter what I have done throughout my career "data" has always played a very important part. Over the years, I've recognized how data has transformed the business and engineering part of development. Machine learning is no longer limited to large enterprises, and smaller companies are ready to get involved and take advantage of its benefits. Also, with the proven results from deep neural networks in various fields, it is clear that this is the time when machine learning and deep neural networks will play a very important role in technology going forward. I suppose my interests in data science are very well timed for the rise of machine learning. It is certain that technology changes everything time and time again, and for every programmer, self-transformation is an important step to keep relevant and competent in an ever-changing field.


The Hardest Thing About Doing a Startup

I have found the ‘N+1 Syndrome’ to be the most common reason, especially among accomplished professionals who are doing well in their current gig. The thinking goes like this: You are earning well, you have a good name at work, your families are comfortable, and most importantly, you get that nice paycheck at the end of the month! Yes, you have this exciting idea, the thought of not reporting to a stupid boss is enticing, the lure of hitting that IPO jackpot, becoming famous, and retiring by the time you are 40 is tantalizing! You are going to do it, yes! No-one is better qualified! You will just get this one little thing out of the way, and then you are set! Most even have excellent ideas for the new business, but somehow they keep pushing the start date back by a year, then another year, and another.


Google Says Machine Learning Chips Make AI Faster and More Efficient

For context, CPUs, or central processing units, are the processors that have been at the heart of most computers since the 1960s. But they are not well-suited to the incredibly high computational requirements of modern machine learning approaches, in particular deep learning. In the late 2000s, researchers discovered that graphics cards were better suited for the highly parallel nature of these tasks, and GPUs, or graphics processing units, became the de facto technology for implementing neural networks. But as Google’s use of machine learning continued to expand, they wanted something custom built for their needs. “The need for TPUs really emerged about six years ago, when we started using computationally expensive deep learning models in more and more places throughout our products.”


Why There Is No API Security

To understand why APIs inherently lack security, you must understand that API exploits attempt to compromise the application in one of two ways. The first is through application programming errors that attempt to reveal data or impair the operation of the application. These exploits manifest themselves through malicious inputs like SQL injection, cross-site scripting, and other such attempts at exposing data. Generally, applications can be secured against programming errors. This is often an iterative approach that can take months to years of use, testing, patching, and retesting, but it can be done. The second avenue is through attempts to exploit the business logic of the application to create unauthorized access or fraudulent transactions. Business logic exploits are the harder ones to identify and stop. Applications are increasingly designed to deliver microservices, which expose a large number of interfaces to the Internet.
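The SQL injection case mentioned above is the classic example of a programming error that iterative testing can eliminate. A small self-contained illustration with Python's built-in sqlite3 module, contrasting string concatenation with a parameterized query:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

user_input = "alice' OR '1'='1"  # classic injection payload

# Vulnerable: string concatenation lets the payload rewrite the query
vulnerable = conn.execute(
    "SELECT email FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: a parameterized query treats the payload as an ordinary value
safe = conn.execute(
    "SELECT email FROM users WHERE name = ?", (user_input,)
).fetchall()

print("vulnerable query returned:", vulnerable)  # leaks the row
print("parameterized query returned:", safe)     # returns nothing
```

Note that no amount of input handling like this helps with the second avenue: a business logic flaw is a perfectly well-formed request doing something the designer never intended.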


An untold cost of ransomware: It will change how you operate

Even if the backup looks promising, there is no easy button. The people creating ransomware know that backups can stand between them and their payday. There are a lot of cases where Microsoft Volume Shadow Copies have been destroyed by ransomware. If you leave your backups online so you can have quick recovery, you may find that ransomware can actually delete or corrupt your backups. This is not uncommon; read the user groups from various backup companies and you’ll see the sad tales of woe. If that isn't concerning enough, there are other potential dangers to your backups. They need to be air-gapped from systems your users have access to. Before you bring your backups online, make sure the affected computers are off the network. You need to be absolutely certain that those systems can’t access the backup.
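Part of "no easy button" is that an unverified backup is only a hope. One common approach, sketched hypothetically below, is to keep a manifest of known-good checksums and compare the backup set against it from an isolated machine before any production system reconnects:

```python
import hashlib
import os
import tempfile

def file_digest(path):
    """SHA-256 of a file, read in chunks so large backups don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(manifest, backup_dir):
    """Compare files in backup_dir against a manifest of known-good digests."""
    failures = []
    for name, expected in manifest.items():
        path = os.path.join(backup_dir, name)
        if not os.path.exists(path) or file_digest(path) != expected:
            failures.append(name)
    return failures

# Demo with a throwaway directory standing in for the backup target
backup_dir = tempfile.mkdtemp()
with open(os.path.join(backup_dir, "db.dump"), "wb") as f:
    f.write(b"backup contents")
manifest = {
    "db.dump": file_digest(os.path.join(backup_dir, "db.dump")),
    "missing.dump": "0" * 64,  # simulates a file ransomware deleted
}
print(verify_backup(manifest, backup_dir))  # ['missing.dump']
```

The manifest itself must live somewhere the ransomware can never reach, for the same reason the backups must.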


Researchers build a microprocessor from flexible materials

TMDs are compounds composed of a transition metal such as molybdenum or tungsten and a chalcogen (typically sulfur, selenium or tellurium, although oxygen is also a chalcogen). Like graphene, they form into layers. But unlike graphene, which conducts electricity like a metal, they are semiconductors, which is great news for flexible chip designers. Stefan Wachter, Dmitry Polyushkin and Thomas Mueller of the Institute of Photonics, working with Ole Bethge of the Institute of Solid State Electronics in Vienna, decided to use molybdenum disulfide to build their microprocessor. They deposited two molecule-thick layers of it on a silicon substrate, etched with their circuit design and separated by a layer of aluminium oxide. "The substrate fulfills no other function than acting as a carrier medium and could thus be replaced by glass or any other material, including flexible substrates," they wrote.


CIO Jury: 50% of tech leaders are implementing DevOps

DevOps implementations also vary from company to company. At business law firm Benesch, Friedlander, Coplan & Aronoff LLP, "I think the real focus is on agile communication and client outcomes, versus delivery," said CIO Jerry Justice. "[It's about] creating a solid feedback loop so you can adjust targets and timings." However, not all companies are ready to fully jump on board the new workflow. While Simon Johns, IT director at Sheppard Robson Architects LLP, said the firm has yet to implement DevOps, he also said that "there are elements of the 'philosophy' I would like to introduce into our workflows—build fast, fail fast type of situations." David Wilson, director of IT services at VectorCSP, said he doesn't plan to implement the workflow. "After nearly 30 years of IT experience, I doubt any of those large software companies are really investing in this," Wilson said.


Securing Risky Network Ports

While some network ports make good entry points for attackers, others make good escape routes. TCP/UDP port 53 for DNS offers an exit strategy. Once criminal hackers inside the network have their prize, all they need to do to get it out the door is use readily available software that turns data into DNS traffic. “DNS is rarely monitored and even more rarely filtered,” says Norby. Once the attackers safely escort the data beyond the enterprise, they simply send it through their DNS server, which they have uniquely designed to translate it back into its original form. The more commonly used a port is, the easier it can be to sneak attacks in with all the other packets. TCP port 80 for HTTP supports the web traffic that web browsers receive. According to Norby, attacks on web clients that travel over port 80 include SQL injections, cross-site request forgeries, cross-site scripting, and buffer overruns.
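The DNS exfiltration technique Norby describes is alarmingly simple: any data can be chunked into DNS-legal labels and sent out as lookups for subdomains of a server the attacker controls. A defensive illustration of the encoding (the collector domain is made up), useful for understanding what monitoring on port 53 should look for, such as long, high-entropy subdomains:

```python
import base64

EXFIL_DOMAIN = "collector.example.net"  # made-up attacker-controlled domain

def to_dns_labels(data, domain=EXFIL_DOMAIN):
    """Encode bytes as DNS-safe labels (max 63 chars each), as tooling does."""
    encoded = base64.b32encode(data).decode("ascii").rstrip("=").lower()
    labels = [encoded[i:i + 63] for i in range(0, len(encoded), 63)]
    return ".".join(labels + [domain])

def from_dns_labels(name, domain=EXFIL_DOMAIN):
    """Recover the original bytes from a captured query name."""
    payload = name[: -len(domain) - 1].replace(".", "").upper()
    payload += "=" * (-len(payload) % 8)
    return base64.b32decode(payload)

query = to_dns_labels(b"customer-records-batch-0001")
print(query)  # looks like an ordinary (if noisy) DNS lookup
```

To a resolver that is "rarely monitored and even more rarely filtered", each such query is just another lookup; the attacker's authoritative server reverses the encoding on the other end.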



Quote for the day:


Leadership: "If you are not building for the long term you are doing the wrong thing." --@Bill_George


Daily Tech Digest - April 23, 2017

Delaware Law Amendments Would Facilitate Blockchain Maintenance of Corporate Records

According to Vice Chancellor J. Travis Laster, the blockchain could help to remove the middleman when it comes to how shares are held and voted, as at present they are operating on an outdated system that is too complex to determine who owns a share and how it’s used in decision making. Delaware is just one state in the U.S. that is showing an increased interest in the distributed ledger. Only recently, the Senate in the state of New Hampshire considered a blockchain bill that would deregulate digital currency transactions such as bitcoin from money transmitter regulations in the state. By doing so, the bill is designed to protect consumers when using digital currencies such as bitcoin instead of making them register with money transmitter regulators.


Do Collaboration Tools Create Security Risks For Your Business?

The business world is abuzz with the benefits of collaboration tools: less reliance on email, more organic collaboration on projects and better communication and relationships between teams. Collaboration tools encompass many solutions, including video conferencing, VoIP, document sharing and instant messaging. However, it is also important to think about the security risks that are inherent in tools such as document collaboration platforms, presentation software, remote support tools and virtual events. Each of these can create potential security threats, and evaluating vulnerabilities – and viable solutions – should be a sustainable part of your tool-selection process.


What’s To Do Before Ethereum Enters Its Third 'Metropolis' Stage?

One tricky part is making changes to all ethereum clients, no matter what programming language they're written in, in lockstep. Ethereum Foundation's Khokhlov has been writing tests using a tool called Hive to ensure not only that the clients implement the changes correctly, but that all clients agree on consensus-level changes. That's because if all clients don’t follow the same rules, there could be an accidental split into different networks (as happened briefly in November). Just like former phase changes Frontier and Homestead, the shift to Metropolis requires a 'hard fork' – meaning nodes or miners that fail to upgrade to the new blockchain will be left behind. Because of the possibility of an inadvertent split, hard forks are controversial and taken very seriously.


How IoT and Big Data are tackling Africa’s problems

“All solar systems are monitored in real time through the cloud,” Fruhen announced at a recent tech event in Nairobi. “Five years [ago] when we were founded nobody was thinking about IoT or Big Data but now we collect over 30 million payment notifications every year.” He added that they have more than one million device readings every day, from battery levels to the temperature of the devices and sensors. Additionally, they have geographical data on where the devices are located. They also have 450,000 rooftop sunshine readings every day. They have calculated that they have saved their users US$338 million since they started, five years ago. “Cloud is the enabler for all these,” he reiterated. “We have 680 terabits of data on our platform.” The company has used its data to offer upgraded devices to users who have paid off their solar loan.


5 Ways Cloud Vendors are Dealing with Data Privacy Concerns

Today, cloud vendors are designing managed cloud services from the ground up to meet the most advanced data security requirements, giving current and prospective customers the peace of mind that their data is private and secure. They should also deliver across-the-board support for every aspect of cloud security, including physical security, network security, data protection, monitoring, and access controls. Encryption of data in flight and at rest, along with tokenisation of sensitive data items, are strategies that can improve data security and meet the most stringent data privacy requirements. Cloud vendors understand that any successful cloud security solution requires close collaboration between you and your cloud service provider, knowing that it’s critical that your organisation has a programme that covers everything from data governance and compliance to cloud user access.
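Tokenisation differs from encryption in that the token has no mathematical relationship to the original value; the mapping lives only in a vault. A deliberately minimal, in-memory sketch of the idea (a real vault would use hardened, persistent, encrypted storage):

```python
import secrets

class TokenVault:
    """Minimal tokenization sketch: swap sensitive values for random tokens.

    A production vault would persist the mapping in hardened, encrypted
    storage; this in-memory dict only illustrates the idea.
    """

    def __init__(self):
        self._forward = {}  # sensitive value -> token
        self._reverse = {}  # token -> sensitive value

    def tokenize(self, value):
        if value not in self._forward:
            token = "tok_" + secrets.token_hex(8)
            self._forward[value] = token
            self._reverse[token] = value
        return self._forward[value]

    def detokenize(self, token):
        return self._reverse[token]

vault = TokenVault()
card = "4111-1111-1111-1111"
token = vault.tokenize(card)
print(token)  # a random surrogate; downstream systems store only this
```

Because downstream systems only ever see the surrogate, a breach of those systems reveals nothing about the original value without also compromising the vault.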


9 Essential #EdTech Ideas to Share With Your Team

To deny that tech will be important to students' futures seems unthinkable. But it's not enough to recognize students will need tech to be successful. Your students also need to see you as a willing learner of technology. They need to see you as a learner period. And it's a shame if you aren't leveraging your skills as a teacher because you aren't willing to learn technology. All of your teacher skills are priceless, but they can be even more relevant and powerful if you know how to effectively use technology for learning, too. ... Lots of kids like to use technology. But using tech because it is engaging isn't as important as using it because your students are engaged. If your students are curious and motivated learners, they will have questions that need answers. They will want to create and share new knowledge.


Learning to Think Like a Computer

Computational thinking is not new. Seymour Papert, a pioneer in artificial intelligence and an M.I.T. professor, used the term in 1980 to envision how children could use computers to learn. But Jeannette M. Wing, in charge of basic research at Microsoft and former professor at Carnegie Mellon, gets credit for making it fashionable. In 2006, on the heels of the dot-com bust and plunging computer science enrollments, Dr. Wing wrote a trade journal piece, “Computational Thinking.” It was intended as a salve for a struggling field. “Things were so bad that some universities were thinking of closing down computer science departments,” she recalled. Some now consider her article a manifesto for embracing a computing mind-set. Like any big idea, there is disagreement about computational thinking — its broad usefulness as well as what fits in the circle.


Regression: A Professional Analyst Should Be Able to Answer These Three Questions

For a regression analysis to support inferences that are trustworthy and can be justified, the estimated coefficients must be reliable. In statistical terms, the method should produce the best linear unbiased estimator, abbreviated BLUE. For this to hold, the data being processed must meet certain requirements, known in statistics as the classical assumptions and checked with classical assumption tests (for example, tests for homoscedasticity and for the absence of multicollinearity and autocorrelation). When the classical assumptions are met, the resulting coefficients are accurate estimators of the underlying parameters, and the conclusions drawn from them can be justified ... These adjustments, an attempt to fulfill the classical assumptions, make regression analysis a defensible simplification in the application of modern economics, which is an empirical science.
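Under the classical (Gauss-Markov) assumptions, ordinary least squares is exactly the BLUE the passage refers to. A small worked example: generate data from a known linear model with well-behaved errors, fit it with the closed-form OLS formulas, and check that the estimates land near the true coefficients (the model and noise level are illustrative):

```python
import random

def ols_simple(xs, ys):
    """Closed-form OLS estimates for the model y = b0 + b1*x."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    b1 = sxy / sxx                 # slope estimate
    b0 = mean_y - b1 * mean_x      # intercept estimate
    return b0, b1

# Simulate data from y = 2 + 3x with independent, homoscedastic errors,
# then recover the coefficients.
random.seed(42)
xs = [i / 10 for i in range(100)]
ys = [2 + 3 * x + random.gauss(0, 0.5) for x in xs]
b0, b1 = ols_simple(xs, ys)
print(f"intercept estimate {b0:.2f} (true 2), slope estimate {b1:.2f} (true 3)")
```

If the error terms instead violated the classical assumptions, say by growing with x, the formulas would still return numbers, but they would no longer be the best unbiased estimates, which is why the assumption tests matter.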


Google’s New Chip Is a Stepping Stone to Quantum Computing Supremacy

The six-qubit chip is also a test of a manufacturing method in which the qubits and the conventional wiring that controls them are made on separate chips later “bump bonded” together. That approach, a major focus of Google’s team since it was established just over two years ago, is intended to eliminate the extra control lines needed in a larger chip, which can interfere with how qubits function. “That process is all working,” says Martinis. “Now we’re ready to kind of move fast.” Designs for devices with 30 to 50 qubits are already in progress, he says. He briefly flashed up images of the six-qubit chip at the recent IEEE TechIgnite conference in San Bruno, California, but his group has yet to formally disclose technical details.


Data Transformation is the New Digital Transformation

What is data worth? In my 2010 predictions I expected to see “datasets increasingly recognized as a serious, balance sheet-worthy asset”. I was a bit early there. Data is clearly still not a well understood or significant investment category – brand “goodwill” is better accounted for – but there is no doubt that markets value companies perceived to be data rich with higher valuations than other companies. Data is a moat. IBM acquired the Weather Company for around $2bn according to the Financial Times, and promptly put CEO David Kenny in charge of a swathe of its Watson and Cloud units. Uber now has a business selling data to companies including Starwood, and is leveraging data to make deals with public sector organisations such as the city of Boston. But taking advantage of data is hard – requiring entirely new skill sets. Valuing it is hard. Cleaning it is hard. Querying it is hard. Managing and maintaining it is hard.



Quote for the day:


"A problem is only a problem when viewed as a problem." -- Robin Sharma


Daily Tech Digest - April 22, 2017

How Indonesia is preparing its fintech ecosystem

“Trust in online payments and consistent education to accept new ways to pay are the two major challenges that we are currently facing,” explains Doku chief operating officer, Nabilah Alsagoff. “Most Indonesians are still comfortable and pretty much rely on bank transfers and cash on delivery (COD) as their preferred methods of payment.” One of Doku’s main aims is to make e-commerce systems easier to navigate for both customers and merchants, she says. The ultimate goal is to be a part of Indonesians’ daily payment habits via e-money, especially for the unbanked in a country of over 250 million where only 65 million are bank account holders. But not only is access to customers a barrier; so too are laws and regulations. Most fintech players feel that the regulation in Indonesia is still in [a] grey area.


Indian techies, IT firms fret as Donald Trump orders US visa review

More broadly, uncertainty over the review announced this week has unsettled Grishma and many others like her. She will have to wait until at least around August to learn her fate, but having accepted the US job offer she is not in a position to apply for positions elsewhere, including in Europe. "It's pretty debilitating," Grishma told Reuters. "I'd like to start work to mitigate the financial damage." Trump's decision was not a huge surprise, given his election campaign pledge to put American jobs first. But the executive order he signed, though vague in many areas, has prompted thousands of foreign workers already in the United States or applying for visas to work there to re-think their plans. Companies who send them also face huge uncertainty.


How one company uses big data to maximize yields and minimize impact

The systems Vegis and her team have built are hosted on Bluemix, IBM's data storage, processing, and analytics cloud. "IBM's tools have enabled us to save both time and money on programming and development," Vegis said. With the initial hurdle of developing machine learning systems and processing data already accomplished, Foris.io has been able to actually gather data instead of just planning for it. According to Vegis, cognitive computing platforms like Watson allow them to "take concept to prototype in a shorter period of time, which we know will improve our chances of securing funding." That doesn't just apply to her and Foris.io—it's a huge benefit for all tech innovators. With a probe installed, data gathering begins. The devices, capable of transmitting data several kilometers, measure moisture, pH level, salinity, temperature, and other factors.


25 Predictions About The Future Of Big Data

A flexible structure is just as important today, as business needs are changing at an accelerating pace and flexibility allows IT to be responsive in meeting new business requirements; hence the need for an information architecture for the ingestion, storage, and consumption of data sources. One of the challenges facing enterprises today is that they have an ERP (like SAP, Oracle, etc.), internal data sources, and external data sources, and what ends up happening is that “spread-marts” (commonly referred to as Excel spreadsheets) start proliferating data. Different resources download data from differing (and sometimes the same) sources, creating dissimilar answers to the same question. This proliferation of data within the enterprise consumes precious storage that is already overflowing, causing duplication and wasted resources without standardized or common business rules.


Introducing ‘Operator 4.0,’ a tech-augmented human worker

Human work will become more versatile and creative. Robots and people will work more closely together than ever before. People will use their unique abilities to innovate, collaborate and adapt to new situations. They will handle challenging tasks with knowledge-based reasoning. Machines enabled by the technologies that are now becoming commonplace – virtual assistants like Siri and Alexa, wearable sensors like FitBits and smart watches – will take care of tedious work details. People will still be essential on the factory floors, even as robots become more common. Future operators will have technical support and be super-strong, super-informed, super-safe and constantly connected. We call this new generation of tech-augmented human workers, both on factory floors and in offices, “Operator 4.0.”


Fintech CEO Talks Cross-Border Pain Point Removal

Looking at payments through a global (rather than U.S.-based) lens, 2017 is not going to be a year of leap-frog innovations, but rather a year of incremental improvements focused on country-by-country wins. As mobile infrastructure continues to expand and the Internet reaches an additional two billion people in markets where access was previously nonexistent, we’re bound to see a spike in demand for online and mobile purchases. At the same time, the payment methods landscape will only become more fragmented, requiring payment platforms to optimize between multiple payment options, acquirers and processors, handle currency conversions cost-effectively and transparently, and account for numerous legislative nuances across multiple markets. Decades-old payments systems won’t cut it here.


People Re-engineering How-to’s: Mentoring As A Service

The mess lies in what the older cohort in the business sees as a lack of self-organizing ability and discipline in the personalities of the newcomers. I personally disagree with this 'mess theory' and see it as a normal difference in perspective between generations that were professionally formed in different ecosystems, with sharp differences in tempo and culture. It's actually our role (as veterans of the craft) to extend a helping hand and get the newcomers professionally in shape, seamlessly and gracefully. So what's the problem, then? Well, it becomes an issue when there aren't enough resources to coach these hordes of not-yet-matured practitioners, especially when we remember the sometimes insane pressure on teams and leaders to meet their schedules, leaving very little space for helping juniors beyond what's barely needed to get them 'technically productive'.


Legal impact of data protection and management in the digital age

Regardless of the cause, the threat of data breaches is imminent and can have severe repercussions for organizations, especially if they are found guilty of failing to take sufficient measures to secure their data. Singapore's data protection law has one of the highest fines in Asia, with each breach subject to a potential fine of S$1 million. Similarly, breaching Europe’s new General Data Protection Regulation can result in a fine of the larger of either 20 million Euro or 4 per cent of the organization’s global annual turnover. Beyond financial penalties, a data breach can cause irreversible damage to a company’s reputation as well as potentially significant damages payable in civil liability to third parties, not to mention possible personal criminal liability for senior management. Organizations should be well aware of the prevailing legal regulations that govern increasingly popular technology solutions such as cloud storage and the collection, analysis, and offshore storage of customer data.
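To make the scale of the GDPR exposure concrete, the "larger of EUR 20 million or 4 per cent of global annual turnover" cap can be expressed as a simple calculation (an illustrative sketch only, not legal advice; the function name and figures are hypothetical):

```python
def gdpr_max_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine: the larger of EUR 20 million
    or 4% of global annual turnover."""
    return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

# Below EUR 500 million turnover, the flat EUR 20 million cap dominates;
# above it, the 4% term takes over.
print(gdpr_max_fine(100_000_000))    # 20000000.0
print(gdpr_max_fine(1_000_000_000))  # 40000000.0
```

The crossover point at EUR 500 million turnover explains why the regulation is felt very differently by small firms and multinationals.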


Huawei’s CEO Eric Xu talks wearables, Cloud, AI, and more

AI will be everywhere in our products, in our technologies, and in our operations. And I believe AI can bring value in each and every one of those aspects. In the Telco markets, we've been talking about using AI to build what we call a network brain. The whole notion of this network brain is to help telecom operators be more intelligent as they build, run, and manage their networks. We have also tried to bring artificial intelligence into smartphones. Last year we launched Huawei Magic, a concept phone with AI capabilities built into it. The idea was to show how phones would evolve from smartphones to intelligent phones. And then in our network and cloud services, whether Public Cloud or Private Cloud, we also inject AI capabilities into the Cloud platform to better enable enterprises.


Why You Must Build Cybersecurity Into Your Applications

“Companies face a terrible choice: either they turn their business into software and they accept the fact that they’re going to have rampant vulnerabilities and breaches or they let their competition win the innovation race. And everyone chooses software,” said Williams. “But as a result, we’re going to have 111 billion new lines of code in 2017. And the problem is that these legacy tools, dynamic analysis tools, static analysis tools and web application firewalls, were invented in the early 2000s. They’re absolutely incapable of scaling to the level of modern software.” This requires an approach that uses automation. Every business that has been around for more than five years will have legacy software integration challenges, which requires developing new code. Companies are constantly integrating new software platforms with older systems and a cybersecurity platform has to be able to protect all of these assets.



Quote for the day:


"Sometimes the questions are complicated and the answers are simple." -- Dr. Seuss


Daily Tech Digest - April 21, 2017

A Vigilante Hacker May Have Built A Computer Worm To Protect The IoT

Symantec has found some possible proof. The company noticed that the computer worm has been leaving a message over infected devices since at least March, Grange said. That message has been digitally signed and fetched in a way that leaves little doubt it comes from Hajime's developer. The short message doesn't reveal anything about the Hajime developer's identity. But the vigilante hacker is aware the security community has been studying the Hajime worm. One clue: The mysterious developer refers to himself or herself as the "Hajime author" in the message the worm has been leaving behind. However, it was actually security researchers at Rapidity Networks that came up with the name Hajime, which is Japanese for the term "beginning."


Australia's bold plan for cybersecurity growth

The SCP is intended to "identify the challenges Australian organisations face when competing in local and international cyber security markets". "The SCP provides a roadmap to strengthen Australia's cyber security industry and pave the way for a vibrant and innovative ecosystem. It articulates the steps and actions required to help Australia become a global leader in cyber security solutions, with the aim of generating increased investment and jobs for the Australian economy," it says. The SCP was launched by Senator Arthur Sinodinos, Minister for Industry, Innovation and Science. "The aspiration, and it's set out here in this plan so clearly, is to be a global leader in this space," Sinodinos said.


Even small firms can tap into value through data wrangling

Chances are, small businesses already have a fairly large amount of data collected, particularly if they have been in business for at least a year. Even if the business is older, began before the digital age, and does not have many electronic records, the paper records still contain data. Sales slips, time cards, order forms: all of these hold data worth analyzing. Perhaps the records are a mix of paper and electronic. Maybe more recent inventory records are recorded in a spreadsheet, while the older information is kept in a hand-written ledger. It would be worth the business owner’s while to digitize the paper records. This will require an initial outlay of resources, but the time spent scanning images or entering data into a database program will be paid back in the time staff save by not having to dig through paper files looking for information, in addition to gaining the ability to query these records.
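As a hedged sketch of the payoff: once paper sales slips are entered into even the simplest database, a question that once meant an afternoon in the filing cabinet becomes a one-line query. The schema and figures below are entirely made up for illustration:

```python
import sqlite3

# Hypothetical digitized sales slips loaded into an in-memory SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE sales (slip_no INTEGER, item TEXT, amount REAL, sold_on TEXT)"
)
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?, ?)",
    [(1, "widget", 19.99, "2017-03-01"),
     (2, "gadget", 34.50, "2017-03-15"),
     (3, "widget", 19.99, "2017-04-02")],
)

# Total widget revenue across every slip -- answered in seconds.
total = conn.execute(
    "SELECT SUM(amount) FROM sales WHERE item = ?", ("widget",)
).fetchone()[0]
print(total)
```

SQLite ships with Python and stores everything in a single file, which keeps the initial outlay for a small firm close to zero.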


Microsoft launches 'IoT-as-a-service' offering for enterprises

The Microsoft spokesperson added that the new offering will help IoT product manufacturers "that value time to market with technical stack prescribed and managed for them". "It is designed to enable the rapid innovation, design, configuration, and integration of smart products with enterprise-grade systems and applications to reduce product manufacturers' go-to-market cycle and increase the speed at which they can innovate so they can stay ahead of their competition and deliver smart products that delight their customers," the spokesperson told ZDNet. IoT Central is vertically and horizontally agnostic, though the spokesperson said its early adopters happen to operate in the manufacturing and engineering industries, and include ThyssenKrupp Elevator, Sandvik Coromant, and Rolls-Royce.


Artificial intelligence: fulfilling the failed promise of big data

According to Forrester’s Business Technographics survey of over 3,000 global technology and business decision makers from last year, 41 percent of global firms are already investing in AI and another 20 percent are planning to invest in the next year. Most large enterprises’ first foray into AI is with chatbots for customer service, what we call “conversational service solutions.” These run the gamut from hard-coded, rules-based chatbots, which aren’t artificially intelligent, to very sophisticated engines using a combination of NLP, NLG, and deep learning. From a customer insights perspective, many companies are starting to use some of the “sensory” components of AI, such as image, video, and speech analytics, to unlock insights from unstructured data.


How the Internet of Things Puts SCADA Systems at Risk

Since OT is technology that was built pre-Internet and is goal-oriented, its security is not always a top priority, Brown said. Others agreed. "I think it's still sort of a nascent field which is ironic because industrial systems, operational systems are from a past era," said Alex Eisen, a security researcher for ForeScout. Eisen later continued, "Think about trains, iron, mechanical engineering, electrical engineering and now we find ourselves in this modern world, information age, where a lot of these hard skills and experience is sort of tucked away." The panel discussed the risks of assuming OT and IT systems are not connected. Brown went on to describe multiple attacks that have happened because of unknown entanglement between the two systems. The panelists — which included representatives from SMUD, the Sacramento Regional County Sanitation District, security companies, and others — discussed how OT systems can be protected:


How To Run Your Small Business With Free Open Source Software

Even if you want to stick with a closed source operating system (or, in the case of macOS, a partially closed source one), your business can still take advantage of a vast amount of open source software. The most attractive benefit of doing so: it's generally available to download and run for nothing. While support usually isn't available for such free software, it's frequently offered at an additional cost by the author or a third party. It may be included in a low-cost commercially licensed version as well. Is it possible, then, to run a business entirely on software that can be downloaded for free? There certainly are many options that make it possible — and many more that aren't included in this guide.


Five emerging technologies for rapid digital transformation

To get a sense of what pressures IT leaders were under and how they were dealing with them, I recently sampled just over 50 of the top practitioners in the space, focusing on what I regarded as leading organizations in their industries — mostly large enterprise CIOs, as well as a few CTOs, CDOs, and EVPs of IT who I knew were pushing the envelope — to better understand the IT initiatives they are pursuing to become more agile. By picking cutting-edge leaders at top organizations, the intent was that the data would show what they're facing and how they're dealing with it this year, giving more typical organizations time to prepare for what they'll likely face next year and beyond. Unsurprisingly, the data clearly shows that top IT leaders are feeling much more pressure for their teams to move quickly than they ever have in the past.


Surveys show high hopes, deep concerns about IoT

While many have high hopes for IoT, few are on their way to full deployment. The survey found 41 percent of respondents expect IoT to have a big impact on their industries within three years, affecting things like efficiency and differentiated products and services. But only 7 percent said they have a clear vision with implementation well under way. Most companies don't have everything they need to succeed in IoT, with many saying they'll need new technical skills, data integration and analytics capabilities, or even a rethinking of their business model. Thirty-one percent of the executives said their organizations face a "major skills gap" in industrial IoT. The annual developer survey, co-sponsored by the open-source Eclipse IoT Working Group, IEEE IoT, Agile IoT and the IoT Council, also found growing adoption along with continuing concerns.


The Value of Exploratory Data Analysis

EDA is valuable to the data scientist for making certain that the results they produce are valid, correctly interpreted, and applicable to the desired business contexts. Beyond ensuring the delivery of technically sound results, EDA also benefits business stakeholders by confirming they are asking the right questions and not biasing the investigation with their assumptions, as well as by providing the context around the problem to make sure the potential value of the data scientist’s output can be maximized. As a bonus, EDA often leads to insights that neither the business stakeholder nor the data scientist would have thought to investigate, but that can be hugely informative about the business. In this post, we will give a high-level overview of what EDA typically entails and then describe three of the major ways EDA is critical to successful modeling and correct interpretation of results.
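As a minimal sketch of what such a first EDA pass might look like (the data here is entirely made up), checking sample size, missingness, central tendency, and obvious outliers can surface exactly the kind of question worth taking back to the business stakeholder before any model is built:

```python
import statistics

# Hypothetical order values, including missing entries and one suspect point.
raw = [120.0, 95.5, None, 110.0, 102.5, 4999.0, 98.0, None, 105.0]

values = [v for v in raw if v is not None]
n_missing = len(raw) - len(values)
mean = statistics.mean(values)
median = statistics.median(values)

# Tukey's IQR fence: points far outside the middle 50% of the data are
# worth questioning with the stakeholder, not silently fed to a model.
q1, _, q3 = statistics.quantiles(values, n=4)
iqr = q3 - q1
outliers = [v for v in values if v < q1 - 1.5 * iqr or v > q3 + 1.5 * iqr]

print(f"n={len(values)}, missing={n_missing}, mean={mean:.1f}, median={median:.1f}")
print("possible outliers:", outliers)
```

Here the mean sits far above the median, and the fence flags the 4999.0 entry; whether that is a data-entry error or a genuine bulk order is precisely the context a stakeholder conversation provides.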



Quote for the day:


"Our leadership style is defined by who we are and what we do, not by what we say." -- Gordon Tredgold