Daily Tech Digest - March 03, 2018

New Cyber Security Style Guide helps bridge the communication gap

Security without communication is worthless. You can scream yourself blue in the face, but if no one groks what you're saying, then you're wasting your time. Information security is an unintuitive discipline, in many ways backwards from how we think about security and power and threats in meatspace. Worse, the security community has developed its own slang over the years that deliberately excludes outsiders. All fields do this, of course, and if infosec were metalworking or plumbing or air traffic control, that would be fine and dandy. Ordinary people don't have a pressing need to understand the inner workings of those fields. But the human race has moved online, and information security affects everyone now. It used to be we lived in the "real world" and "went online." Now we live online and visit the "real world." Soon even that will fade, until the only "real world" left will be quaint amusement parks that offer the unplugged experience.



Companies ready to spend on IT hardware again

While undoubtedly enterprises are moving software applications from “on-premises data centers to the cloud,” that’s not the whole story, Huberty says. Currently, 21 percent of computing is accomplished in the cloud. That number will indeed rise, and is expected to reach 44 percent by 2021. However, because enterprise cloud plans are beginning to solidify, firms are now ready to upgrade the IT gear they are retaining or think they’ll need. “They aren't abandoning on-premises computing. Instead, many are adopting a hybrid IT model in which applications move between a public cloud and their own internal data centers,” she explains. Other factors contributing to the optimism, according to Morgan Stanley, include more cash being available because of tax law changes in the U.S., the ability to depreciate equipment costs in the first year, and economic growth. A weak dollar and lower memory costs are also helping the shift.


How digital service providers should prepare for the NIS Directive

Last year, the European Commission published a draft implementation regulation for DSPs, which Elizabeth Denham, the UK’s information commissioner, commented on. She criticised “the overly rigid parameters” of the regulation, which “may be undesirable and may lead to a failure to report incidents which nevertheless have a substantial impact on the users of the service and which should, by the nature of the impact, be considered for regulatory action”. The European Commission has since approved the final draft, and the UK government has released the findings of a public consultation on how it should implement and regulate the NIS Directive. IT Governance has also published a compliance guide. Each of these documents will help you understand where the NIS Directive fits into the cyber security landscape. DSPs will have to be particularly organised, as they are expected to define their own information security measures proportionate and appropriate to the potential risks they face.


CIOs ill-prepared for IT changes to enable digital business transformation


The Hackett Group reported that 64% of respondents lack confidence in their IT organisation’s capability to support transformation execution. This is all the more worrying given Hackett’s analysis, which predicted that in 2018, IT’s workload will grow faster than its full-time headcount. The Hackett Group suggested this would mean that IT needs a 2% productivity boost, on average, just to keep pace. However, it said the largest percentage increases in workload (5%) and IT staff (4.2%) are happening outside corporate IT. Instead, business groups appear to be investing in their own IT capabilities. Hackett’s benchmark study said: “Digital transformation goals are at least partial motivators for this, in that IT needs to help business units transform and differentiate customer experiences, locating IT resources closer to the end-customer’s facilities.”


The Irrational Exuberance That is Blockchain

In 2017, we saw some evolution on that front as blockchain platforms such as Hyperledger Fabric announced new versions closer to enterprise use and Ethereum progressed towards making these solutions perform and scale to suit enterprise needs. However, the exuberance has also led to new levels of hucksterism. For example, we have seen companies with dubious blockchain abilities add blockchain to their name or business to try to increase their stock price. In response, the U.S. Securities and Exchange Commission (SEC) said it will crack down on such companies. It is critical at this stage in blockchain’s evolution that hype is recognized, and the emergent nature of the technology and its capabilities are clearly understood. ... Gartner does not expect large returns on blockchain until 2025, which means that today companies will have to try different blockchain projects to determine if there is value for them in blockchain — that is, whether there will be new revenue possibilities, cost savings or improvements in their customers’ user experience.


AI Is Now Analyzing Candidates' Facial Expressions During Video Job Interviews


Have you ever lied during a job interview? Most of us have, at least a little. But next time around, artificial intelligence may be watching your face's every move, assessing the honesty of your answers, as well as your emotions in general. It may also try to determine whether your personality is a good fit for the job. ... Applicants, who often find the company's job opportunities through Facebook or LinkedIn, can skip uploading their resumés and simply use their LinkedIn profiles if they wish. They then spend about 20 minutes playing a dozen neuroscience-based games intended to evaluate their personalities for such things as embracing or avoiding risk, to see if their personalities are a good fit for the particular job.  Then they perform a video interview, with preset questions, which they can do on a smartphone or tablet as well as a computer. That's where AI comes in, measuring their facial expressions to capture their moods and further assess their personality traits.


6 Experts Discuss How AI Will Change The Future Of Wall Street (Part 1)

The technology behind AI has been around for more than 40 years, but for AI to work one needs two other ingredients: massive computing power at a reasonable price and massive amounts of data to train the AI. ... The biggest issue is the aversion of asset owners to “black box” strategies. Many consider AI as another version of algorithmic trading (to some extent this is true), and algorithmic strategies have not performed well in the past. While investors are comfortable with having AI playing an important role in many parts of their lives, they seem to prefer human judgment to AI when it comes to the investment process. Another potential obstacle is that an AI approach to trading requires a whole new organizational structure for trading operations. While it is desirable to put discretionary traders in silos to reduce groupthink and correlations among traders, this approach will backfire when applied to AI trading, which requires a team effort to test thousands of strategies in order to pick the best.


HSBC ready to do live trade finance transactions on blockchain

It’s worth noting, however, that the technology is still a long way from commercial use, for HSBC at least. As well as developing the platform and the solution, a network must be in place so that the full transaction can be completed on the blockchain, which means on-boarding other banks, regulators, customs and all parts of the trade cycle. “We see that developing throughout the year so that in 2019, around the same time, we should be in a position to have both the network of banks, corporates and others, and the app ready to use on a wider scale,” Kroeker said. Meanwhile, the bank is hoping that its adventures in blockchain will leave it well-placed to cater for the “digital natives” in Asean, which is projected to be one of the world’s growth hubs for digital services over the coming years. The press conference was called to discuss the bank’s digital agenda in the region, which is shaping up to be an online battleground in the years to come.


10 Common Mistakes To Avoid In Fintech Software Development

Financial Technology, or FinTech, is a relatively new aspect of the financial industry, which focuses on applying technology to improve financial activities. This has the potential to open the doors to new kinds of applications and services for customers, as well as more competitive financial technology. However, like all new technologies, there are mistakes lurking. In contrast to software domains like end-user web apps or mobile application development, a software bug in FinTech may not just lead to annoyed users. In the wrong piece of software, bugs can result in hundreds of millions of dollars lost. The list below covers some of the most common mistakes we see in software projects in general—and FinTech software development in particular—that you should watch out for when launching into the FinTech sector.


The future of IoT device management

One potential vision for the future of consumer IoT – one which might be a lot more appealing to consumers - involves IoT devices whose identity and firmware are managed using a standardized process and entirely independently from the application layer service. When you buy a connected consumer IoT device, you should be able to securely associate that device’s identity with your personal identity and securely manage its software and firmware using a familiar, standardized workflow supported by all device vendors. This means that any consumer IoT device should be easily associated with any consumer IoT gateway that supports its protocols and be able to get to the device vendor’s management service. You then need a way to associate that device with any provider of application layer services that you choose. When you sign up for an application layer service, you should be able to easily allow the application to discover relevant IoT devices associated with this identity and provision them for use.
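The workflow described above can be sketched as a small registry. This is purely conceptual and the names and API are ours, not any real standard: bind a device's identity to a consumer's identity, then let a chosen application-layer service discover only the devices that consumer has granted it.

```python
# Conceptual sketch only: a registry that associates device identities with
# consumer identities and mediates application-layer discovery.
class Registry:
    def __init__(self):
        self.owner_of = {}   # device_id -> consumer_id
        self.grants = {}     # (consumer_id, service) -> set of device_ids

    def associate(self, device_id: str, consumer_id: str) -> None:
        """Securely binding the device to a personal identity (step one above)."""
        self.owner_of[device_id] = consumer_id

    def grant(self, consumer_id: str, service: str, device_id: str) -> None:
        """Allow a chosen application-layer service to use one of your devices."""
        if self.owner_of.get(device_id) != consumer_id:
            raise PermissionError("device not associated with this identity")
        self.grants.setdefault((consumer_id, service), set()).add(device_id)

    def discover(self, consumer_id: str, service: str) -> set:
        """The service sees only devices this identity provisioned for it."""
        return self.grants.get((consumer_id, service), set())

r = Registry()
r.associate("thermostat-01", "alice")
r.grant("alice", "energy-app", "thermostat-01")
print(r.discover("alice", "energy-app"))  # {'thermostat-01'}
```

The key property the article argues for falls out of the structure: the application layer never owns the device identity, it is only granted access to it.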



Quote for the day:


"When people talk, listen completely. Most people never listen." -- Ernest Hemingway


Daily Tech Digest - March 02, 2018

GitHub hit with the largest DDoS attack ever seen

GitHub explained how such an attack could generate vast amounts of traffic: "Spoofing of IP addresses allows memcached's responses to be targeted against another address, like ones used to serve GitHub.com, and send more data toward the target than needs to be sent by the unspoofed source. The vulnerability via misconfiguration described in the post is somewhat unique amongst that class of attacks because the amplification factor is up to 51,000, meaning that for each byte sent by the attacker, up to 51KB is sent toward the target," it said. GitHub said that, because of the scale of the attack, it decided to move traffic to Akamai, which could help provide additional edge network capacity. It said it is now investigating the use of its monitoring infrastructure to automate enabling DDoS mitigation providers and will continue to measure its response times to incidents like this -- with a goal of reducing mean time to recovery.
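The scale here is simple arithmetic: with an amplification factor of up to 51,000, even a modest stream of spoofed requests becomes an enormous flood at the victim. A small sketch of that math (our illustration, not GitHub's code):

```python
# Illustrative arithmetic only: how a 51,000x amplification factor turns a
# small spoofed request stream into a massive flood at the spoofed target.
def reflected_traffic(attacker_bytes_per_sec: float,
                      amplification: float = 51_000) -> float:
    """Bandwidth (bytes/sec) arriving at the victim for a given attack rate."""
    return attacker_bytes_per_sec * amplification

# A mere 1 Mbit/s of spoofed UDP requests (125,000 bytes/sec)...
attacker_rate = 125_000
victim_rate = reflected_traffic(attacker_rate)
print(f"{victim_rate * 8 / 1e9:.1f} Gbit/s at the target")  # 51.0 Gbit/s
```

This is why a handful of machines behind exposed memcached servers can generate the record-breaking volumes GitHub absorbed.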



Load Testing Tool Must-Haves

One of the most dangerous moves software developers and testers can make is being lulled into a false sense of security. For example, when application features and performance levels meet expectations during pre-production, only to crash and burn when presented to real users in production. In that same vein, if your organization has any kind of performance testing strategy, chances are you're conducting load testing. However, you may not be truly emulating the real world behavior of your end users in your load tests. Realism in load tests, when overlooked, can cause a myriad of performance problems in production, and end users won't wait around. If you're not performing accurate and realistic load testing, you risk revenue loss, brand damage and diminished employee productivity. The solution: cloud-based load testing. Right off the bat, the cloud provides two major advantages to load and performance procedures that help testing teams better model realistic behavior: instant infrastructure and geographic location.


Building AI systems that work is still hard


Domain expertise, feature modeling and hundreds of thousands of lines of code can now be beaten with a few hundred lines of scripting (plus a decent amount of data). As mentioned above: that means that proprietary code is no longer a defensible asset when it’s in the path of the mainstream AI train. Significant contributions are very rare. Real breakthroughs or new developments, even a new combination of the basic components, are only possible for a very limited number of researchers. This inner circle is much smaller than you might think. Why is that? Maybe it’s rooted in its core algorithm: backpropagation. Nearly every neural network is trained by this method. The simplest form of backpropagation can be formulated in first-semester calculus — nothing sophisticated at all. In spite of this simplicity — or maybe for that very reason — in more than 50 years of an interesting and colorful history, only a few people have looked behind the curtain and questioned its main architecture.
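To see just how little machinery backpropagation needs, here is a minimal sketch (our own illustration) of a one-hidden-unit network trained with nothing beyond the chain rule:

```python
import math
import random

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def train(data, steps=5000, lr=0.5):
    """Fit y_hat = w2 * sigmoid(w1 * x) by plain gradient descent."""
    random.seed(0)
    w1, w2 = random.random(), random.random()
    for _ in range(steps):
        for x, y in data:
            h = sigmoid(w1 * x)                 # forward pass
            y_hat = w2 * h
            err = y_hat - y                     # dLoss/dy_hat for loss = err^2/2
            # backward pass: each line is one application of the chain rule
            dw2 = err * h
            dw1 = err * w2 * h * (1 - h) * x    # using sigmoid'(z) = h*(1-h)
            w1 -= lr * dw1
            w2 -= lr * dw2
    return w1, w2

# Fit two points of a step-like target
data = [(-2.0, 0.2), (2.0, 0.9)]
w1, w2 = train(data)
```

Everything here is first-semester calculus, exactly as the author claims; deep learning frameworks automate the same derivative bookkeeping at scale.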


Another massive DDoS internet blackout could be coming your way

While older, more established companies are still more likely to host their own DNS, the emergence of cloud as infrastructure means that newer companies are outsourcing everything to the cloud, including DNS. "The concentration of DNS services into a small number of hands...exposes single points of failure that weren't present under the more distributed DNS paradigm of yesteryear (one in which enterprises most often hosted their own DNS servers onsite)," John Bowers, one of the report's co-authors, tells CSO. "The Dyn attack offers a perfect illustration of this concentration of risk--a single DDoS attack brought down a significant fraction of the internet by targeting a provider used by dozens of high profile websites and CDNs [content delivery networks]." The shocking part of this report is that despite the clear danger this concentration poses, too few enterprises have bothered to implement any secondary DNS.


Zero-Day Attacks Major Concern in Hybrid Cloud

Despite their growing reliance on containers, many businesses will continue to at least partially rely on legacy systems for years to come, he continues. Security becomes a challenge when multiple users are accessing multiple environments from multiple different locations. The biggest hybrid cloud security challenge is maintaining strong, consistent security across the enterprise data center and multiple cloud environments, says Cahill. Businesses want consistency; they want to be able to centralize policy and security controls across both. Security teams also struggle to maintain the pace of cloud, an increasingly difficult challenge as cloud continues to accelerate. It used to be that cloud adoption was slowed by security, Cahill points out. Now, containers are driven by the app development team. Security has to keep up. "One of the things we know about cloud computing in general, and about DevOps, is it's all about moving fast," he points out.


Can APIs Bridge the Gap between Banks and Fintechs?

Fintech companies are forcing banks to go beyond their comfort zone, innovate and accept change as a way of staying in business. With APIs handling the translation between legacy systems and the new technologies, fintech companies can focus on providing more value to the clients instead of learning about obsolete systems. Adopting a client-centric vision helps both banks and fintech companies fulfill their goals. For example, a bank doesn’t offer its corporate clients the ability to compare their yearly financial results with the industry average, but a fintech company can make that its value proposition and, by cooperating with the bank through an API, help them learn more about their results. For the bank, it doesn’t make sense to create such a niche service, while the fintech’s algorithm is useless without the proper big data input. International organizations and forums support this collaboration between banks and fintech companies since it brings added value to the client.
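As a toy sketch of that niche service (every name and number below is invented for illustration, not a real bank API): the fintech's own logic stays small, and the bank's API supplies the data it could never gather alone.

```python
# Hypothetical benchmark comparison a fintech might layer on top of a bank API.
def compare_to_industry(client_revenue: float, industry_avg: float) -> dict:
    """Compare a client's yearly result against an industry average."""
    delta = client_revenue - industry_avg
    return {
        "vs_industry_pct": round(100 * delta / industry_avg, 1),
        "above_average": delta > 0,
    }

# In practice industry_avg would come from the bank through an API call,
# e.g. GET /corporate/benchmarks?sector=retail (an invented endpoint).
result = compare_to_industry(1_200_000, 1_000_000)
print(result)  # {'vs_industry_pct': 20.0, 'above_average': True}
```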


Cloud firms need $1bn datacentre investment a quarter to compete with AWS and co


“If companies can’t find at least a billion dollars per quarter for datacentre investments and back that up with an aggressive long-term management focus, then the best they can achieve is a tier-two status or a niche market position,” he said. Dinsdale’s comments coincide with the publication of Synergy’s research into how much capital expenditure (capex) the hyperscale cloud firms pumped into their operations in 2017. Its findings are based on an analysis of the capex and datacentre footprint of the world’s 24 biggest cloud and internet service firms. This reveals that the hyperscale community collectively spent $75bn in capex during 2017, up 19% on the previous year. Of that $75bn, $22bn was paid out in the fourth quarter alone. Most of the capex is channelled towards helping the hyperscale cloud firms expand and upgrade their datacentres, with Amazon, Apple, Facebook, Google and Microsoft name-checked by Synergy as the five biggest spenders, accounting, in aggregate, for more than 70% of capex spend in the fourth quarter.


The Banking Industry Sorely Underestimates The Impact of Digital Disruption

Many organizations associate being a ‘Digital Bank’ with the development and deployment of their mobile banking application. Others look at the digital transformation from a sales or marketing perspective. The reality is that digital transformation goes beyond the way a financial services organization deploys its services across digital devices. Even though more than 20 billion devices will be connected by 2025, the real power of these connections comes from the insight they produce. Use of this data, combined with advanced analytics, can change the level of back office automation, connectivity, decision making and existing business models. “Lacking a clear definition of digital, companies will struggle to connect digital strategy to their business, leaving them adrift in the fast-churning waters of digital adoption and change,” states McKinsey. “What’s happened with the smartphone over the past ten years should haunt everyone - since no industry will be immune.”


AI will create new jobs but skills must shift, say tech giants


“For sure there is some shift in the jobs. There’s lots of jobs which will. Think about flight attendant jobs before there was planes and commercial flights. ... So there are jobs which will be appearing of that type that are related to the AI,” he said. “I think the topic is a super important topic. How jobs and AI is related — I don’t think it’s one company or one country which can solve it alone. It’s all together we could think about this topic,” he added. “But it’s really an opportunity, it’s not a threat.” “From IBM’s perspective we firmly believe that every profession will be impacted by AI. There’s no question. We also believe that there will be more jobs created,” chimed in Bob Lord, IBM’s chief digital officer. “I firmly believe that augmenting someone’s intelligence is going to get rid of… the mundane jobs. And allow us to rise up a level. That we haven’t been able to do before and solve some really, really hard problems.”


How to build skills that stay relevant instead of chasing the latest tech trends

Knowledge about core functions of the software would eventually be available from a broad pool of people, driving down wages unless you were willing to participate in the "arms race" of always learning the latest and greatest. What became quickly apparent was that the people who succeeded in this area were those who were the most adaptable and able to sense where the market was going, so they could retool their skillset based on what was hot at any given time. The individual who was a supply chain specialist a couple of years ago might now be an accounts payable expert, based on the demand for a particular skillset. These individuals had developed a core talent—the ability to sense where the market for this software package was going—and combined it with an ability to rapidly learn and apply the new technical elements of that software. While those focused on deepening their skills were seeing the market pass them by, the talent-focused individuals happily abandoned and changed skills in order to stay relevant.



Quote for the day:


"Leaders are more powerful role models when they learn than when they teach." -- Rosabeth Moss Kantor


Daily Tech Digest - March 01, 2018

We're thinking right now about how we can create a platform or partner with folks to create a platform that offers a truly open access environment to technologists and startups and existing companies who have smart cities projects to make this platform accessible to all of them. And in that platform create the opportunity to exchange data between them to potentially have inter-operation between them. So, what I mean is, can your payment at a parking meter tell the street light that you're there and accomplish some action? Can we have trash cans interact with other pieces of street furniture that is responsive to what is happening around it? I know those are fairly conceptual, but the idea is, can we take our position and facilitate the interaction between the agencies who are focused on, as they should be, accomplishing their independent missions? ... Some other cities are now doing some things similar and there's some conversation about a city operating system that is similar to what I'm thinking about.



Journey to the Cloud: Overcoming Security Risks

As for detective and monitoring security tools, most large IaaS vendors provide virtual networking capability, which the consultancy tapped for packet capture and analysis. PaaS vendors are used differently, but most provided detailed audit logs on user logins and actions which they needed for audit purposes. Some large IaaS vendors also provided additional monitoring alarms to help with pesky things like developers accidentally dropping authentication credentials into public code repositories. One major challenge for the consultancy was dealing with different cloud environments. Some cloud vendors who have multiple offerings can have different knobs and gauges for their varying services. The consultancy’s security operations team would learn how to lock down and monitor something in one service area, only to find that things worked much differently in another.
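As an illustration of what such an alarm looks for, here is a simplified sketch (our own, not the consultancy's tooling) that scans text for strings shaped like AWS access key IDs; real secret scanners cover many more credential formats:

```python
import re

# The "AKIA" + 16 uppercase alphanumerics shape of AWS access key IDs is
# publicly documented; production scanners match many more credential types.
ACCESS_KEY_RE = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

def find_leaked_keys(text: str) -> list:
    """Return any access-key-shaped strings found in a blob of source text."""
    return ACCESS_KEY_RE.findall(text)

snippet = 'aws_access_key_id = "AKIAIOSFODNN7EXAMPLE"  # oops, committed'
print(find_leaked_keys(snippet))  # ['AKIAIOSFODNN7EXAMPLE']
```

A monitoring alarm would run a check like this over every commit pushed to a public repository and page the security team on a match.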


Pizza Hut customers can now pay for meals with Mastercard Qkr mobile app


“Over the past six years we have invested over £60m in transforming our restaurants and menu, and this allows us to continue to improve the service and experience we offer our guests, as well as embracing technology, which has become so central to modern culture.” Betty DeVita, chief commercial officer at Mastercard Digital Payments & Labs, said Qkr would allow Pizza Hut to accommodate more customers without having to rush them. “By removing the headache of managing bills, it will allow their staff to focus more on service,” she added. Merchants can also add delivery and takeaway options for customers through the app, as well as targeted promotions and rewards schemes. Mastercard said organisations have been using its application programming interfaces (APIs) to create specific brand experiences for customers at the table as well.


Staff awareness is the financial industry’s biggest cybersecurity concern

The report urges CISOs to prioritize employee training regardless of their reporting structure, as employees are organizations’ first line of defense and their biggest vulnerability. “Employee training should include awareness about downloading and executing unknown applications on company assets, and in accordance with corporate policies and relevant regulations, and training employees on how to report suspicious emails and attachments,” the report says. Knowing where to begin with employee training can be tough, which is why IT Governance provides an Information Security Staff Awareness E-learning Course. This course can be deployed across your organization to help anyone involved in information security understand how to stay secure. It aims to reduce the likelihood of human error by familiarizing employees with security policies and procedures, covering topics such as password security, creating backups, information security incidents, and business continuity.


How to protect Macs from malware threats

As malware threats increase in number and frequency, the next big attack could be looming just beyond the horizon. Which OS is the safest? I will give you a hint: If you believe it is Apple, that type of thinking might be what leads your Mac to be one of the next victims. Malware attacks against Apple computers have been growing exponentially, in some cases faster than attacks against other platforms. While the threshold for these types of malware attacks has been rather low compared to its competitors, Apple's massive popularity and growing market share have shifted the focus over to its popular line of computing devices in an effort by threat actors to cash in (literally) on this growing target. Even the biggest malware attacks may have small beginnings, and threats targeting Apple devices will continue to proliferate unless users protect their devices by adhering to the following tips in conjunction with best practices for data and network security.


SaaS support challenges IT ops admins to shift gears


SaaS support doesn't introduce new problems for IT -- we've all dealt with browser plug-in support changes, internet connection issues and application upgrades. What SaaS changes is when these issues occur. Modern IT organizations get things done by adapting quickly, but behind the scenes, they have some notification of upgrades and changes. Testing, staff training and communications are planned out ahead of time, which dramatically lessens the disruption the changes cause to users and management. SaaS-based apps shorten the support lead time. There's also a risk that the SaaS update isn't compatible with an enterprise's setup, and there aren't viable alternatives. Prepare contingencies, and be ready to make adjustments after updates. SaaS support requires skill from IT operations. Things that once were minor systems quirks are now critical. IT staff are in a weaker position to control changes, and the safety nets of testing and preproduction don't work as they did for software hosted and managed in-house.


Is your vendor being honest about AI?

“True AI is about the future. AI says, ‘I don’t know what this is, but we’ve seen something similar so we will flag it.’ Or, ‘We’ve never seen this before, it’s an anomaly, so we will flag it.’ The key difference between rules engines and AI is where they are focused. Rules are IF-THEN decisions based on past data. AI is all about recognizing anomalies simply because they are new. We are interested when the machine says, ‘I don’t know. I haven’t seen this before.’ This is when AI is the most powerful and useful.” Laurent offered, “A key way to tell the difference between AI and rules-based engines, is that a rules-based engine will never improve on its own until someone updates the rules. AI improves its accuracy the more it is used. The more you use it the better it becomes. The adaptability of the model is what makes AI work.” Yuri strongly agreed, "Rules are basically in the past. The machine [AI] can predict the future."
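The distinction Laurent draws can be made concrete with a toy contrast (our illustration, not any vendor's product): a rules engine only fires on thresholds someone wrote down, while even a crude statistical model flags values unlike anything in its history:

```python
from statistics import mean, stdev

def rule_engine(amount: float) -> bool:
    """IF-THEN decision based on past data: a hard-coded threshold."""
    return amount > 10_000

def anomaly_detector(history: list, amount: float, z_cut: float = 3.0) -> bool:
    """Flag anything far outside what has been observed before."""
    mu, sigma = mean(history), stdev(history)
    return abs(amount - mu) > z_cut * sigma

history = [95, 102, 99, 101, 98, 103, 97, 100]
print(rule_engine(500))                # False: under the written-down threshold
print(anomaly_detector(history, 500))  # True: unlike anything seen before
```

The z-score here is a stand-in for far richer models, but it captures the point: the second function improves simply by accumulating more history, while the first never changes until someone rewrites the rule.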


TiDB: Performance-tuning a distributed NewSQL database

TiDB is an open source, hybrid transactional/analytical processing (HTAP) database, designed to support both OLTP and OLAP scenarios. One TiDB cluster has several TiDB servers, several TiKV servers, and a group of Placement Drivers (PDs), usually three or five nodes. The TiDB server is a stateless SQL layer, the TiKV server is the key-value storage layer, and each PD is a manager component with a “god view” that is responsible for storing metadata and doing load balancing. Below is the architecture of a TiDB cluster. You can find more details on each component in the official TiDB documentation. We gather a lot of metrics inside each TiDB component. These are periodically sent to Prometheus, an open source system monitoring solution. You can easily observe the behaviors of these metrics in Grafana, an open source platform for time series analytics. If you deploy the TiDB cluster using Ansible, Prometheus and Grafana will be installed by default.
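As a small illustration of pulling those metrics programmatically, the sketch below builds an instant-query URL against Prometheus's standard HTTP API (`/api/v1/query`) and parses a sample response; the metric name and host are assumptions for illustration, not taken from the TiDB docs:

```python
import json
from urllib.parse import urlencode

# Host and metric name below are illustrative assumptions.
PROM = "http://prometheus:9090"

def query_url(promql: str) -> str:
    """Build a Prometheus instant-query URL for a PromQL expression."""
    return f"{PROM}/api/v1/query?" + urlencode({"query": promql})

def extract_value(response_body: str) -> float:
    """Pull the scalar out of a Prometheus instant-query JSON response."""
    doc = json.loads(response_body)
    # result[0]["value"] is [unix_timestamp, "value-as-string"]
    return float(doc["data"]["result"][0]["value"][1])

url = query_url('rate(tidb_server_query_total[5m])')
sample = '{"status":"success","data":{"result":[{"value":[1519862400,"42.5"]}]}}'
print(extract_value(sample))  # 42.5
```

Grafana issues essentially the same queries under the hood; scripting them directly is handy for alerting or ad hoc tuning sessions.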


The future of work: How to thrive through IT’s latest revolution

Kim Smith, venture strategist and chief innovation officer at IBM, likens such employees to the early NASA employees portrayed in the movie Hidden Figures. Back then, “computer” was a job title, not a piece of office equipment, and it was the job held by the movie’s central characters. Then NASA acquired a mainframe capable of replacing a building full of human computers “so they taught themselves Fortran,” Smith says. To be successful in the future, your company must support, encourage and enable lifelong retooling. At IBM, it means giving people access to training and allowing them to rotate in and out of jobs and departments, she explains. “They can be in one role for a period of time, then go to something completely different.” “I think expectations are going to morph,” Burns adds. “Tech professionals need to be more forward thinking. A lot of the ones I’ve seen were order takers, and we have to get away from that world. We have to help disrupt industries rather than letting our organizations be disrupted.”


Top 10 Lessons in Building a Distributed Engineering Team


One poignant question that came up early on was: how do we communicate our core values to people who are not in the office? As it turns out, instilling the company's and teams' principles in remote employees actually is no more difficult than with local ones. We decided to bring people into the office for their first week. Additionally, we get together every quarter with the whole team for working sessions and team building activities. Culture is what you do when nobody's looking; for remote employees, that means a lot of opportunities to exercise the company culture. In our experience, we've found that shared values prevail regardless of physical location. By now you might be wondering whether a distributed workforce is actually practical, and that's a valid question. How can you guarantee a culture that fosters innovation even though employees aren’t in the same room? In the past, companies often claimed that having everyone under the same roof was the only way to innovate. Nowadays, the story has changed.



Quote for the day:


"Technology makes it possible for people to gain control over everything, except over technology" -- John Tudor


Daily Tech Digest - February 28, 2018

The questions are sometimes simple, but by no means always. Many questions can be summarized as “What is this?” However, only 2 percent call for a yes-or-no answer, and fewer than 2 percent can be answered with a number. And there are other unexpected features. It turns out that while most questions begin with the word “what,” almost a quarter begin with a much more unusual word. This is almost certainly the result of the recording process clipping the beginning of the question. But answers are often still possible. Take questions like “Sell by or use by date of this carton of milk” or “Oven set to thanks?” Both are straightforward to answer if the image provides the right information. The team also analyzed the images. More than a quarter are unsuitable for eliciting an answer, because they are not clear or do not contain the relevant info. Being able to spot these quickly and accurately would be a good start for a machine vision algorithm.



Memcached Servers Being Exploited in Huge DDoS Attacks

Security researchers have previously warned about Internet-facing Memcached servers being open to data theft and other security risks. Desler theorizes that one reason attackers have not used Memcached as an amplification vector in DDoS attacks before now is simply that they had not considered it, not because of any technical limitation. Exploiting Memcached servers is new as far as real-world DDoS attacks are concerned, says Chad Seaman, a senior engineer with Akamai's Security Intelligence Response Team. "A researcher had theorized this could be done previously," Seaman says. "But as Memcached isn't meant to run on the Internet and is a LAN-scoped technology that is wide open, he thought it could really only be impactful in a LAN environment." But the use of default settings and reckless administration overall among many enterprises has resulted in a situation where literally tens of thousands of boxes running Memcached are on the public-facing Internet, Seaman says.
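Administrators can check whether their own instances are reachable this way. Below is a minimal, illustrative Python sketch (the function names are mine) that sends Memcached's UDP "stats" probe, the same kind of small request that attackers reflect and amplify into much larger responses; use it only against servers you operate.

```python
import socket

def build_udp_probe(command: bytes = b"stats") -> bytes:
    """Build a memcached UDP frame: request id, sequence number,
    datagram count, and reserved field (2 bytes each), then the
    ASCII command terminated by CRLF."""
    header = b"\x00\x01" + b"\x00\x00" + b"\x00\x01" + b"\x00\x00"
    return header + command + b"\r\n"

def is_udp_exposed(host: str, port: int = 11211, timeout: float = 2.0) -> bool:
    """Return True if the host answers a 'stats' probe over UDP,
    meaning it could be abused as an amplification reflector."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(build_udp_probe(), (host, port))
        data, _ = sock.recvfrom(65507)  # max UDP payload
        return len(data) > 0
    except OSError:  # timeout or unreachable: no evidence of exposure
        return False
    finally:
        sock.close()
```

A server that answers at all should be firewalled or have UDP disabled, since the tiny 15-byte probe can elicit responses thousands of times larger.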


Firms failing to learn from cyber attacks

The survey findings suggest security inertia has infiltrated many organisations, leaving them unable to repel or contain cyber threats or to limit the resulting impact on the business. This inertia is reflected in the fact that 46% of respondents said their organisation cannot prevent attackers from breaking into internal networks every time it is attempted, 36% said that administrative credentials are stored in Word or Excel documents on company PCs, and half admitted their customers’ privacy or PII (personally identifiable information) could be at risk because their data is not secured beyond the legally required basics. The report notes that the automated processes inherent in cloud and DevOps mean that privileged accounts, credentials and secrets are being created at a prolific rate. If compromised, the report said, these can give attackers a crucial jumping-off point for lateral access to sensitive data across networks and applications, or for using cloud infrastructure for illicit crypto mining.

While the “shift to Teal” is a more big-picture view, there is an interesting perspective on self-organization in teams and organizations which holds that organizations with self-organizing teams still have leaders and leadership. This perspective brings the big-picture view above into focus in individual organizations and companies. It is discussed in Lex Sisney's book “Organizational Physics - The Science of Growing a Business”. Sisney proposes that, rather than being organized strictly top-down or bottom-up, some of the newest and most adaptable organizations are actually “Design-Centric”. ... So the leadership shift is not a choice between top-down and bottom-up, but one where the leader designs a system within the organization that allows teams to self-organize and be empowered to deliver the organization’s objectives. If this is done well, there is little need for the leader to intervene in the organization or system, because the people and teams are able to effectively lead and guide the organization themselves.


14 top tools to assess, implement, and maintain GDPR compliance

The European Union’s General Data Protection Regulation (GDPR) goes into effect in May 2018, which means that any organization doing business in or with the EU has six months from this writing to comply with the strict new privacy law. The GDPR applies to any organization holding or processing personal data of EU citizens, and the penalties for noncompliance can be stiff: up to €20 million (about $24 million) or 4 percent of annual global turnover, whichever is greater. Organizations must be able to identify, protect, and manage all personally identifiable information (PII) of EU residents even if those organizations are not based in the EU. Some vendors are offering tools to help you prepare for and comply with the GDPR. What follows is a representative sample of tools to assess what you need to do for compliance, implement measures to meet requirements, and maintain compliance once you reach it.

Chris Webber, a security strategist with SafeBreach, says configuration errors are one of the most frequently occurring issues with NGFWs. “Many users get tripped up if they only rely on vendor-supplied defaults,” Webber said. “A next-generation firewall can be like having a Swiss army knife on your network, but many times its features aren’t turned on, which lets attackers gain access.” Webber also noted that most vendors provide auto-migration tools to help new customers migrate from their legacy firewalls to NGFWs but that errors may occur during this process, as vendor features and architecture can vary. SafeBreach said it has discovered breach scenarios due to these policy gaps and errors resulting from assumptions about new NGFW vendor default policies and auto-migration challenges. Another issue is that many users don’t decrypt encrypted traffic like SSL, TLS, and SSH, which can become a major blind spot for customers, Webber said.

The future of every type of ambitious commercial business, whether it’s a factory making products, a bank loaning money, an IT support shop helping users, a grocery store selling goods, a law firm prepping available information for its client cases, an analyst firm producing insight… is to perform its business operations with the optimum balance of talent, so it can maximise its immediate profits, with an eye on the future to stay ahead of the competition. As soon as someone’s output is predictable, taking inputs from various sources to produce outputs, you can start to figure out how to program software and machines to perform said tasks – and computers will always be cheaper than humans, once they are functional and can do the job. So our goal has to be about furthering our abilities, not only to get the basics of our jobs done, but to immerse ourselves into helping our colleagues and bosses figure out the what next. Because if we only focus on the now, we are eventually going to render ourselves predictable and replaceable.


Virtual Private Networks: Why Their Days Are Numbered

VPNs require an array of equipment, protocols, service providers and topologies to be successfully implemented across an enterprise network – and the complexity is only perpetuated as networks grow. Purchasing the excess capacity and new Multiprotocol Label Switching (MPLS) connections needed to support effective VPNs can weigh heavily on IT budgets, while managing these networks will require greater reliance on personnel. Rather than limit the number of devices on their networks, organizations need to seek out solutions that simplify network management as companies continue embracing mobile and remote workforces. Even businesses that continue to rely on VPN or backhaul networks to protect their data need to employ a defense-in-depth approach to security, since VPNs, on their own, only offer the baseline protections of a standard web proxy.  As more solutions move to the cloud and enterprises rely less and less on physical servers and network connections, the need for VPNs will eventually evolve, if not disappear altogether.

From a security standpoint, what you really want is to be alerted when employees do something suspicious. User behavior analytics (UBA) are a smarter way to sniff out anomalies in users' actions and flag them for further investigation. Companies like IBM and Varonis have developed advanced UBA tools that can detect unusual activity. Is an employee trying to access a file they shouldn’t? Maybe they’re downloading something at 3:00am from a location that isn’t their home. Perhaps they’re trying to move laterally between systems. The beauty of UBA is that it highlights malicious insiders and outsiders using stolen credentials equally well, though it may require further investigation to determine which is which. If you’re going to go to the trouble of monitoring your employees, then maybe you should extract more value from the data you collect. There’s a new breed of software that offers the same potential security protections to ensure compliance but focuses on the end user experience and how it might be improved to remediate issues as they happen.
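As a toy illustration of the idea (real UBA products such as IBM's or Varonis's model far richer signals than this), the sketch below flags logins at hours of the day when a user has never previously been active; the names and the login-hour heuristic are illustrative, not drawn from any vendor's implementation.

```python
from collections import defaultdict
from datetime import datetime

def build_profiles(events):
    """events: iterable of (user, iso_timestamp) pairs from a login log.
    Record which hours of the day each user has historically been active."""
    profiles = defaultdict(set)
    for user, ts in events:
        profiles[user].add(datetime.fromisoformat(ts).hour)
    return profiles

def is_anomalous(profiles, user, ts):
    """Flag a login at an hour the user has never been seen at
    (unknown users are always flagged)."""
    hour = datetime.fromisoformat(ts).hour
    return hour not in profiles.get(user, set())

# Example: a user active only during working hours...
history = [("alice", "2018-02-01T09:15:00"),
           ("alice", "2018-02-02T10:30:00")]
profiles = build_profiles(history)
# ...triggers a flag for a 3:00am login.
print(is_anomalous(profiles, "alice", "2018-02-03T03:00:00"))  # True
```

A flagged event is only a starting point for investigation; as the article notes, the same anomaly can indicate a malicious insider or an outsider using stolen credentials.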

Monitoring the state of an application is important during development and in production. With a monolithic application, this is rather straightforward: one can attach a native debugger to the process and get a complete picture of the state of the application and its evolution. Monitoring a microservice-based application poses a greater challenge, particularly when the application is composed of tens or hundreds of microservices. Because any request may be processed by many microservices running multiple times -- potentially on different servers -- it is exceptionally difficult to follow the “story” of the application and identify the causes of problems when they arise. Currently, the main methodology relies on obtaining a trace of all transactions and dependencies using tools that, for example, implement the OpenTracing standard. These tools capture timing, events, and tags, and collect this data out-of-band (asynchronously).
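As a rough illustration of how such tracing works (a hand-rolled sketch, not the actual OpenTracing API), the snippet below shows spans sharing a trace ID across a parent and child operation and being collected into an out-of-band buffer, from which the request's "story" can later be reassembled.

```python
import time
import uuid

SPANS = []  # out-of-band collector; a real tracer ships these asynchronously

class Span:
    """Minimal OpenTracing-style span: a trace id shared by every
    operation in one request, its own span id, a parent id, timing,
    and free-form tags."""
    def __init__(self, operation, trace_id=None, parent_id=None):
        self.operation = operation
        self.trace_id = trace_id or uuid.uuid4().hex
        self.span_id = uuid.uuid4().hex
        self.parent_id = parent_id
        self.tags = {}
        self.start = time.monotonic()

    def child(self, operation):
        """Propagate the trace context to a downstream service call."""
        return Span(operation, trace_id=self.trace_id, parent_id=self.span_id)

    def finish(self):
        self.duration = time.monotonic() - self.start
        SPANS.append(self)

# One request crossing two "services": both spans carry the same
# trace id, so a backend can stitch the full request path together.
root = Span("checkout")
child = root.child("payment-service")
child.tags["http.status_code"] = 200
child.finish()
root.finish()
```

In a real deployment the trace and parent IDs would travel between services in request headers, and the collected spans would be queried through a tracing backend rather than a local list.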



Quote for the day:

"The mark of a great man is one who knows when to set aside the important things in order to accomplish the vital ones." -- Brandon Sanderson

Daily Tech Digest - February 27, 2018

Visual Studio Code joins the Anaconda Python data science toolkit

Microsoft’s relationship with Anaconda is intended to go further than Anaconda using R Open and Visual Studio Code. It’s also working with Anaconda to embed its data science tools inside SQL Server. Bringing interactive analytics tooling into the heart of a database is a sensible approach; and Microsoft has already started to put its own analytic tools there. But making that service dependent on an open source project that it doesn’t control is a big step forward for Microsoft. SQL Server is one of its flagship enterprise products, so bringing in a set of tools that update on a very different schedule could be an issue for many of Microsoft’s corporate customers. But with Anaconda a popular tool on data scientists’ desktops, it shouldn’t be too much of a stretch for users. If you don’t need it in a production database, you can always not install it, leaving the SQL Server/Anaconda combination for your data science team’s development environment.



7 transportation IoT predictions from Cisco

While many observers note that IoT technology evolves much faster than the vehicles and infrastructure they power, Connor had an opposite viewpoint. “In fact," he said, “the IoT data collected and analyzed from connected cars and infrastructures can help extend the life of these vehicles and the transportation system through predictive analytics and preventative maintenance. For example, by aggregating and analyzing traffic data from IoT sensors on streetlights, transportation agencies can determine which roads are most frequently traveled and service them first. "Additionally, connected cars can alert drivers when maintenance is needed to keep the vehicles running smoothly. And with vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) connections optimizing routes, alleviating congestion and helping drivers avoid road hazards, there will be fewer accidents.”


Cryptojacking is the new malware

Serving as the gateway to the Internet, browsers have grown sophisticated over the years – and so have the hackers. Utilizing easily accessible JavaScript libraries, hackers can inconspicuously inject code into even the most secure websites. When a user visits these infiltrated websites, they unknowingly run extra bits of code that enable hackers to utilize their device as part of a larger cryptomining operation. In several notable examples, scripts from companies like mining-software maker Coinhive, which bills itself as an alternative to advertising revenue, have been illicitly embedded on websites ranging from the Showtime television network to the Ecuadorian Papa John’s Pizza. Covert or overt, drive-by mining schemes are often invisible to users, yet the implications for the enterprise can be severe. Slower-performing computers can hamper productivity, while the scripts running in the background can provide an open doorway for future malware or ransomware attacks.
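As a crude illustration of one detection approach (the pattern list and function names here are invented for the example; production defenses rely on maintained filter lists and behavioral signals such as sustained CPU use), a page's HTML can be checked against known miner-script signatures:

```python
import re

# Illustrative blocklist only -- real deployments pull from
# continuously updated filter lists.
MINER_PATTERNS = [
    r"coinhive(\.min)?\.js",
    r"coin-hive\.com",
    r"cryptonight",
]

def find_miner_scripts(html: str):
    """Return the blocklist patterns that match anywhere in a page's
    HTML -- a crude way to spot injected mining code."""
    hits = []
    for pattern in MINER_PATTERNS:
        if re.search(pattern, html, re.IGNORECASE):
            hits.append(pattern)
    return hits
```

Signature matching like this catches only known miners; obfuscated or self-hosted scripts evade it, which is why it is at best one layer of a defense.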


Think Like An Attacker And Mitigate Cyber Threat


Crucially, the way that businesses often measure or prioritise their activity in terms of security is whether they will pass an audit. While it may be comforting to members of the business to meet these requirements, they often fall short of industry best practice, and significantly so. And let’s be honest – a hacker has no interest in whether an organisation has passed an audit, and neither will any customers impacted by a breach. On the one hand, meeting regulatory guidelines is often a good starting point for putting in place a sensible approach to security and data. However, simply ticking the box of compliance could well open organisations up to a range of threats. Instead, by ensuring that basic procedures are in place, organisations can build a more comprehensive strategy. This encompasses all the elements needed to support a more complex IT infrastructure and the flexibility to adapt to future changes in the IT landscape.


Making AI software smarter by adding human feedback


While more natural, human-based training does have incredible potential, it’s difficult to imagine this form of AI being used in business-centric processes such as the collection or analysis of current intelligence. You could not hope to trust a novice or “growing” system with such highly sensitive systems — or could you? This raises the question: what can IT professionals do to better incorporate AI into business intelligence processes so that it delivers safe, guaranteed results? Avanade, a joint venture between Microsoft and Accenture, is powered by the Cortana Intelligence Suite and meant to provide predictive analytics and data-based insights. Because it utilizes Cortana — Microsoft’s version of modern AI and voice assistant technology — it already benefits greatly from the existing platform. It hasn’t been done yet, but if Microsoft and Cortana’s developers were to introduce a form of human-based training for the platform, that information could be fed back into other areas of the technology, such as Avanade’s.


Security leaders investing in automation and AI, study shows


Applying machine learning can help to enhance network security defences and, over time, “learn” how to automatically detect unusual patterns in encrypted web traffic, cloud and internet of things (IoT) environments, the report said, adding that although they are still in their infancy, machine learning and AI technologies will mature. “Last year’s evolution of malware shows that adversaries are becoming wiser at exploiting undefended gaps in security,” said John Stewart, senior vice-president, and chief security and trust officer at Cisco. “Like never before, defenders need to make strategic security improvements, technology investments, and incorporate best practices to reduce exposure to emerging risks.” However, the Cisco report coincided with a report by UK and US experts that warned that AI is also likely to be used by attackers, who are expected to not only use the technology to increase the effectiveness of attacks, but also to exploit weaknesses in AI technologies by poisoning data, for example.



How to get more women in IT jobs? Mandate an inclusive culture

It was a good and timely question -- especially the last part. Revelations about sexual harassment and cultural breakdown were trickling out of one of Silicon Valley's standouts -- ride-sharing pioneer Uber -- leading, eventually, to the resignation of its chief executive, Travis Kalanick. But does the answer to how to get more women in IT jobs and then ensure the workplace is a safe and welcoming one for them always depend on the CEO? I took up the topic with Kristi Riordan, COO at the Flatiron School, a coding boot camp in New York that offers scholarships to women who want to be part of the high-paying tech economy. To cultivate a good environment for women in technology, organizations need to sign onto that policy at the top, Riordan said. Senior leaders must be expected to establish an inclusive culture of respect and transparency.


What is a data scientist? A key data analytics role and a lucrative career

A data scientist’s main objective is to organize and analyze large amounts of data, often using software specifically designed for the task. The final results of a data scientist’s analysis need to be easy enough for all invested stakeholders to understand — especially those working outside of IT. A data scientist’s approach to data analysis depends on their industry and the specific needs of the business or department they are working for. Before a data scientist can find meaning in structured or unstructured data, business leaders and department managers must communicate what they’re looking for. As such, a data scientist must have enough business domain expertise to translate company or departmental goals into data-based deliverables such as prediction engines, pattern detection analysis, optimization algorithms, and the like.


India ranks 47th when it comes to inclusive Internet

Across the indexed countries, on average, men are 33.5 per cent more likely to have Internet access than women. "The gap is even larger in low-income countries, which have an average gender access gap of 80.2 per cent compared with 3.7 per cent among high-income countries," said Molly Jackman, Public Policy Research Manager at Facebook. The index assessed a country's Internet inclusion across four categories: availability, affordability, relevance and readiness. "Bringing people online can offer life-changing opportunities, but there are still approximately 3.8 billion people without Internet access. At Facebook, we're working to change that," added Robert Pepper, Head of Global Connectivity Policy at Facebook, in a blog post. "Global connectivity has increased 8.3 per cent and more people are connected than ever before. While this progress is encouraging, we are still far from achieving full Internet inclusivity," Pepper added.


Lenovo introduces new water-cooled server technology

Not only is it a cheaper method of cooling, but it’s more effective. Air cooling is only effective up to about 10 kilowatts of power in a server chassis, while water cooling can handle 70 kW or more. And the ThinkSystem SD650 is one seriously dense server tray. Each tray has two sockets, and up to 12 trays can be squeezed into one 6U NeXtScale n1200 enclosure. That translates to 24 Xeons, 9.2TB of memory, 24 SFF SSDs or 12 SFF NVMe drives, and 24 M.2 boot drives. Lenovo developed the cooling system with the Leibniz Supercomputing Center (LRZ) in Germany. Later this year, the center will deploy a 100-rack supercomputer consisting of 6,500 ThinkSystem SD650s with 26.7 petaflops of peak performance. That would make it the number three supercomputer on the Top500 supercomputer list as of November 2017, but there will undoubtedly be other contenders. The direct-water-cooled design allows for up to 90 percent heat recovery, meaning only 10 percent of the heat generated by the CPU has to be addressed with an air conditioner or fan.



Quote for the day:


"He who rejects change is the architect of decay." -- Harold Wilson


Daily Tech Digest - February 26, 2018

The organisations have also developed a series of joint initiatives, which are still in their early stages and in the process of being launched. One of these is a cyber security working group, which will bring together industry representatives with NHS Digital. The working group has three initiatives that are now in the planning phase. These are: TechUK promoting NHS Digital’s hunt for a partner organisation to expand its security operations centre; setting an innovation challenge for suppliers to create a mechanism to trace data back to the original source; and “to assist NHS Digital to baseline the level of cyber security of medical devices”.  The partnership will also undertake a review of NHS Digital’s domains within the Personal Health and Care 2020 framework to find a “common view of the best way to engage with the market at an early stage” and establish governance groups for each domain.


BYOG (Bring Your Own Glasses) Will Bring Headaches For IT

We're facing the prospect of many or most employees carrying semi-concealed sensor bundles that connect either via Bluetooth, Wi-Fi or cellular networks and that track location. It will be difficult or impossible to know which sensors and components are built into which glasses. And in any event, the banishment of these sensor bundles will be extremely difficult, since they're also required for vision and therefore the basic performance of employees' jobs. You can ask meeting attendees or R&D visitors to leave their phones in a box outside, but you can't do that with glasses. In addition to threats to trade secrets and heightened exposure to hacking, there will be new issues with illicit recordings and captured data between and among employees and by partners, customers and others. Nobody has the answers to these challenges. But companies that want to stay ahead of the game need to start figuring out solutions sooner rather than later.


6 Cybersecurity Trends to Watch

Most breaches we see target traditional apps and on-premises environments, not the cloud infrastructure itself. Think Target, Yahoo, and JP Morgan Chase. To date, no cloud application or cloud vulnerability has been the direct source of a cataclysmic breach, and we don't envision this changing anytime soon. In analyzing more than 2.2 million verified security incidents captured in the Alert Logic network intrusion detection system over an 18-month period, the public cloud accounted for, on average, 405 incidents per customer. This was significantly lower than incidents occurring in on-premises environments (612 per customer), hosted private clouds (684), and hybrid cloud environments (977). While the Spectre and Meltdown vulnerabilities didn't bypass cloud deployments, the impact is likely to be disruption from necessary patching and subsequent performance issues. We're unlikely to see a major breach attributed to Spectre and Meltdown because they are unlikely to be used as initial attack vectors.


10 tips for crafting highly effective job descriptions

Hiring great talent starts with attracting the right talent. Here, an effective, engaging and inclusive job description is key. With a little upfront effort, you can craft just the right job description to bring a wide range of highly talented candidates into your pipeline — and ensure you’re not turning off talent before they even apply. "The best job descriptions combine a little bit of marketing, the reality of the role, the necessary skills and competencies and the organization's culture. All those things put together are key to how to present an open role to the market," says Justin Cerilli, managing director of financial services and technology at Russell Reynolds and Associates, an executive search and leadership transition firm. In addition to the standard role description and skills and experience required, recruiters and hiring managers must place an emphasis on culture, mission and values to avoid making a bad hire.


Google’s self-training AI turns coders into machine-learning masters

“We need to scale AI out to more people,” Fei-Fei Li, chief scientist at Google Cloud, said ahead of the launch today. Li estimates there are at most a few thousand people worldwide with the expertise needed to build the very best deep-learning models. “But there are an estimated 21 million developers worldwide today,” she says. “We want to reach out to them all, and make AI accessible to these developers.” Cloud computing is one of the keys to making AI more accessible. Google, Amazon, Microsoft, and other companies are rushing to add machine-learning capabilities to their cloud platforms. Google Cloud already offers many such tools, but they use pretrained models. That limits what they can do—for example, programmers will only be able to use the tools to recognize a limited range of objects or scenes that they have already been trained to recognize. A new generation of cloud-based machine-learning tools that can train themselves would make the technology far more versatile and easier to use.


How companies can predict new tech disruption and fight back against it

While many people hear the term "disruption" and immediately think Amazon and Uber, industry-changing companies that tap tech advancements are now a reality across all business sectors, according to a Monday report from Accenture. Of 3,600 companies surveyed across 82 countries, with annual revenues of at least $100 million, 63% said they currently face high levels of disruption, the report found. ... Instead, it has a pattern that businesses can identify and prepare to combat. "Disruption is continual and inevitable — but it's also predictable," Omar Abbosh, Accenture's chief strategy officer, said in a press release. "Business leaders need to determine where their company is positioned in this disruption landscape and the likely speed of change. The more clearly they see what's changing around them, the better they can predict and identify opportunities to create value from innovation for their business and rotate to the 'new.'"


Surveillance watchdog investigates security risks of GCHQ IT contractors


For those determined enough, there are always ways to smuggle data out, from photographing a computer screen using an iPod with a built-in camera, to inserting a device known as a Teensy, which can bypass USB-blocking technology by masquerading as a computer keyboard. ... Such controls may be irrelevant, however, if contractors are able to access GCHQ's operational IT system remotely from the offices of an IT supplier, or even from home. Depending on the security of the computer systems they are using, it could be much easier to download and remove sensitive data. On this matter, GCHQ so far appears to have had little to say in public. Why GCHQ is focusing almost exclusively on the security of its command line interfaces in its evidence is difficult to understand. One explanation may be that the organisation does not feel sufficiently confident about the systems it has in place to monitor the activities of its systems administrators.


Global megatrends that are problematic for the state of cybersecurity

“Our hope is that CISOs and senior leaders can use this report as a tool to start a deep dialogue about the critical need for cybersecurity within their organizations,” said Raytheon Chairman and CEO Thomas A. Kennedy. “Every day the cyber threat is growing more sophisticated and aggressive, posing a real threat to global businesses across all sectors. To reduce risks, leaders must urgently work with their IT teams to identify potential vulnerabilities, develop an action plan and make the investments needed to protect the value of their organization.” The study looks at how cyber trends have evolved since 2015. It also asks security professionals in the U.S., Europe, Middle East and North Africa to identify future trends over the next three years. ... Senior leadership also appears disengaged from the oversight of their organization’s cybersecurity strategy, with 68% of CISO/IT executives surveyed saying their boards are not briefed on measures taken to prevent or mitigate the consequences of a cyberattack.


Enabling Better Risk Mitigation with Threat Intelligence

A well-implemented threat intelligence capability can help improve your organization's situational awareness, threat responsiveness and ability to detect threats. Market research firm Markets & Markets estimates the market for threat intelligence services will top $8.9 billion by 2022 from around $3.8 billion in 2017. Threat intelligence is available from a variety of sources and includes IOCs, malware hashes, listings of bad URLs and files, threat actor TTPs, incident reports, exploits and targets. You can get threat intelligence via free open source feeds, paid commercial services, from peer organizations, from sector-specific information sharing groups, even newsletters, emails and spreadsheets. In order to benefit from threat intelligence, you need to be able to operationalize it. That means you need to have systems and processes in place for consuming external threat intelligence and correlating it with data from your internal systems.
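As a minimal illustration of that correlation step (the names and one-hash-per-line feed format are assumptions for the example; real feeds use formats such as STIX or CSV and need real parsers), the sketch below intersects an external indicator feed with file hashes observed on internal systems:

```python
def load_iocs(feed_lines):
    """Parse a simple one-hash-per-line indicator feed into a
    lookup set, skipping blanks and '#' comment lines."""
    iocs = set()
    for line in feed_lines:
        line = line.strip().lower()
        if line and not line.startswith("#"):
            iocs.add(line)
    return iocs

def correlate(iocs, observed_hashes):
    """Intersect external indicators with hashes seen on internal
    systems -- the basic 'operationalize' step the article describes.
    Any match is a candidate incident to investigate."""
    return sorted(iocs & {h.strip().lower() for h in observed_hashes})
```

In practice this matching runs continuously inside a SIEM or threat-intelligence platform and covers URLs, IP addresses, and TTPs as well as file hashes, but the core operation is the same join between external and internal data.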


What Those Developers Really Mean

Developers love to tout their new favorite toy by saying it’s the “new standard” or “it’s quickly becoming the new standard.” Again, “standard” becomes a touchstone that’s meant to make everyone feel good about the choice. The word “new,” however, should raise the hair on the back of your neck. Standards don’t become standards without time. If something is “new” then it’s too early to know whether the crowds will gather behind the bandwagon or your company will be one of the few left out to dry. Developers of the “new standard” may be blowing all the right horns and lighting lots of fireworks, but we won’t know whether the parade will fall into line without time. That doesn’t mean developers don’t have good intentions when they tell you it’s a “new standard” that they’re hot to adopt. After all, this often means they are interested in abandoning or deprecating some old approach.



Quote for the day:


"Sprints must be long enough to complete Stories, but short enough so that the reqmts churn is slower than the Sprint length can accommodate." -- @JamesSaliba