Daily Tech Digest - March 04, 2018

To Find IT Talent, Think Differently – and Move Quickly

With so much rapid change in technology, workers at all levels will need to expect dynamic shifts in their career development, including changing jobs every five or six years, Hatfield said. "Workers need to be geared to lifelong learning where they maneuver and re-invent," he said. That need implies that workers (and even HR and training departments) "develop a capacity for long-term learning and a passion for it, which is more important than any [one] skill set." Companies also need to focus more carefully on the actual technology they will need — everything from databases to IoT microcontrollers, Finke said. A CTO must help figure out who the lieutenants will be to oversee such technology, then determine where the company will invest and which back-end infrastructure and other technology are needed. More broadly, companies need leaders at the highest levels – not only CIOs – who understand business trends and the context and implications of potential technology disruption.


Microsoft Directly Challenges MongoDB and Cassandra with Cosmos DB

Given that all of the third-party databases listed above are free/open source, Microsoft has to offer something more than just hosting. Otherwise customers will switch back as soon as someone else offers a compatible cloud solution with better performance and/or lower prices. This is where Microsoft's other Azure products come into play. Cosmos DB can be integrated with open source products such as Apache Spark or Apache Kafka as well as proprietary products such as Azure Search, Azure Data Factory, and HDInsight. Rather than extending the file format, Microsoft is attempting to extend what you can do with the database. While switching from MongoDB's cloud hosting to Cosmos DB is mostly a QA and operations question, the use of other Azure products can put significant limitations on your future architectural options. 


What is natural language processing? The business benefits of NLP explained

In addition to helping companies process data, sentiment analysis also helps us understand society. Periscopic, for example, has paired NLP with visual recognition to create the Trump-Emoticoaster, a data engine that processes language and facial expressions in order to monitor President Donald Trump’s emotional state. Similar tech could also prevent school shootings: At Columbia University, researchers have processed 2 million Tweets posted by 9,000 at-risk youth, looking for the answer to one question: How does language change as a teen comes closer and closer to getting violent? “Problematic content can evolve over time,” says program director Dr. Desmond Patton. As at-risk youth grow closer to the brink, they reach out for help, using language. Natural language processing then flags problematic emotional states so that social workers can intervene. Like Periscopic, Columbia pairs sentiment analysis with image recognition to improve accuracy.
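The lexicon-style scoring that underlies much sentiment analysis can be sketched in a few lines. This is an illustrative toy, not Columbia's or Periscopic's actual method; the word lists and scoring rule are invented for the example.

```python
# Minimal lexicon-based sentiment scorer. Real systems use large
# curated lexicons and learned models; these word sets are illustrative.
POSITIVE = {"help", "hope", "safe", "calm", "support"}
NEGATIVE = {"hurt", "angry", "fight", "threat", "violent"}

def sentiment_score(text: str) -> int:
    """Return (#positive hits - #negative hits); the sign gives polarity."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("i feel angry and want to fight"))  # negative score
```

A production pipeline would add tokenization, negation handling and a trained classifier, but the core idea of mapping language to an emotional-state signal is the same.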


Anthem CIO: How agile helped us drive value

One of the ways we are getting past this issue is by bringing in a set of executives and hosting a Shark Tank day. Various innovation project leaders pitch their ideas to the executives – who are the large budget holders in the company – so that they can roll their product out into the business. It’s been amazingly successful because it has required both our corporate functions and our field business units to partner together in innovation. It’s been a great way to prioritize the most important ideas and get everybody excited about participating in them, and it gets people involved and engaged. We also publish everything we’re working on to an innovation microsite that everyone has access to. The entire workforce sees what we’re working on in the Studio via video demos. Additionally, we invite everyone in the company to participate in ideating around a certain topic using collaboration tools. It’s another way we are doing more to engage the entire workforce, which helps ensure innovation isn’t getting stuck in the studio.


How to decide if open source or proprietary software solutions are best for your business

Initial skepticism regarding free software and questions about the business model ("Why would programmers work for free?") have given way to steadfast enterprise adoption of open source software, with an array of options such as "completely free," "free to a certain number of users/functions" and "free but with paid support licenses." As someone who has administered hundreds of Linux servers (which run Red Hat via paid support subscriptions, although it's worth pointing out that CentOS is a totally free alternative with largely the same code base), I can attest to the benefits that open source has provided both to organizations and the technology realm in general. Without it, the internet would be a far different place: much more limited, more expensive, less robust, less feature-driven and less scalable. Big-name companies would be much less powerful and successful as well in the absence of open source software. There's something to be said for proprietary software as well, however; it also has a rich history of providing proven benefits to organizations.


The Smart City Ecosystem Framework – A Model for Planning Smart Cities

The smart city is a complex ecosystem of people, processes, policies, technology and other enablers working together to deliver a set of outcomes. The smart city is not “owned” exclusively by the city. Other value creators are also involved, sometimes working in collaboration and sometimes by themselves. Successful and sustainable smart cities take a programmatic approach to engaging their stakeholders across the ecosystem. Our research has found that many cities are not taking an ecosystem approach to smart city projects. This is due in part to smart city projects being managed by the Information Technology (IT) organization, whose charter is systems development and deployment. In contrast, more experienced smart cities manage their smart city programs through internal cross-functional “Transformation” or “Innovation” organizations. Regardless of where cities are in their smart city journey, they must get ahead of the “curve” with smart city projects.


Google’s Cross-Platform Mobile UI Framework Flutter Now in Beta

Flutter supports a reactive-style approach to UI definition, similar to React Native. What sets it apart from other cross-platform Web view-based frameworks is its reliance on Dart to avoid the need for a JavaScript bridge between the UI and the native services provided by the OS platform. This includes, for example, location services, sensor access, camera, etc. By using Dart, which is compiled ahead-of-time into native code, Flutter does not pay the cost of context switching due to the JavaScript bridge. Cross-platform frameworks that aim to provide a native UI look and feel also use natively-implemented widgets to represent buttons, tables, etc. This also usually requires different parts of an app to communicate using the JavaScript bridge, which tends to be slow. To circumvent this, Flutter provides its own collection of widgets and draws them directly on the canvas provided by the OS platform.


Powerful New DDoS Method Adds Extortion


Because memcached doesn’t support authentication, an attacker can “spoof,” or fake, the Internet address of the machine making the request so that the memcached servers responding to the request all reply to the spoofed address — the intended target of the DDoS attack. Worse yet, memcached has a unique ability to take a small amount of attack traffic and amplify it into a much bigger threat. Most popular DDoS tactics that abuse UDP connections can amplify the attack traffic 10 or 20 times — allowing, for example, a 1 MB request to generate between 10 MB and 20 MB of response traffic. But with memcached, an attacker can force the response to be thousands of times the size of the request. All of the responses get sent to the target specified in the spoofed request, and it takes only a small number of open memcached servers to create huge attacks using very few resources. Akamai believes there are currently more than 50,000 known memcached systems exposed to the Internet that can be leveraged at a moment’s notice to aid in massive DDoS attacks.
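The amplification arithmetic in this paragraph is easy to check. A back-of-the-envelope sketch, with illustrative numbers only:

```python
# Reflected-traffic arithmetic for a spoofed-source amplification attack:
# each open server reflects the request, amplified, toward the victim.
def reflected_traffic(request_bytes: int, amplification: int, servers: int) -> int:
    """Total bytes sent toward the spoofed (victim) address."""
    return request_bytes * amplification * servers

# Typical UDP reflection: 10-20x, so a 1 MB request yields ~20 MB of response.
udp_typical = reflected_traffic(1_000_000, 20, 1)
# memcached: amplification in the tens of thousands, so even tiny requests
# across a modest number of open servers add up to enormous floods.
memcached = reflected_traffic(1_000, 50_000, 100)
print(udp_typical, memcached)
```

The numbers are hypothetical, but they show why memcached reflection dwarfs older UDP amplification vectors: the multiplier, not the attacker's bandwidth, does the work.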


Manufacturing & Innovation

The Advanced Technologies Initiative provides important insights on US and global innovation trends, and highlights the challenges faced by businesses in maintaining or improving their technology competitiveness. In addition, Deloitte and the Council have consolidated the interviewees’ thoughts and perspectives to develop a set of high-priority recommendations detailing immediate and long-term critical needs to improve the national innovation ecosystem vital to sustaining US competitiveness. The study aims to increase attention and discussion on the current US science and technology system and pinpoint deficits to address its vitality. An ancillary aim is to spur an ongoing national dialogue among stakeholders on advanced technologies, industries, and foci of research from a systematic, versus siloed, perspective. The report captures the voices and opinions of both government and industry leaders on US and global R&D, as well as innovation, trends.


This year banking changes for good. Are you ready for the revolution?

It’s certainly a timely move. On the consumer side, privacy and personal data security are becoming an increasing concern. On the business side, the free flow of data has the potential to enable institutions to innovate, creating competition and more choice for consumers and small to medium enterprises (SMEs). Historically, SMEs have often been seen as costly for banks to service relative to the size of the business. This has left them with limited services to choose from, with access to credit being an ongoing concern. As competition increases from FinTech startups and challenger banks, there is potential for institutions that overcome these issues to take market share and better serve the market – generating benefits for themselves and their customers. As you may expect with a GDPR-linked initiative, privacy and trust are at the heart of open banking — particularly important when you consider heightened sensitivity around what happens to our personal data when we hand it over.



Quote for the day:


"A leader does not deserve the name unless he is willing occasionally to stand alone." -- Henry A. Kissinge


Daily Tech Digest - March 03, 2018

New Cyber Security Style Guide helps bridge the communication gap

Security without communication is worthless. You can scream yourself blue in the face, but if no one groks what you're saying, then you're wasting your time. Information security is an unintuitive discipline, in many ways backwards from how we think about security and power and threats in meatspace. Worse, the security community has developed its own slang over the years that deliberately excludes outsiders. All fields do this, of course, and if infosec were metalworking or plumbing or air traffic control, that would be fine and dandy. Ordinary people don't have a pressing need to understand the inner workings of those fields. But the human race has moved online, and information security affects everyone now. It used to be we lived in the "real world" and "went online." Now we live online and visit the "real world." Soon even that will fade, until the only "real world" left will be quaint amusement parks that offer the unplugged experience.



Companies ready to spend on IT hardware again

While undoubtedly enterprises are moving software applications from “on-premises data centers to the cloud,” that’s not the whole story, Huberty says. Currently, 21 percent of computing is accomplished in the cloud. That number will indeed rise, as we expect, and should be 44 percent by 2021. However, because enterprise cloud plans are beginning to solidify, or become less vague, firms are now ready to upgrade the IT gear they are retaining or think they’ll need. “They aren't abandoning on-premises computing. Instead, many are adopting a hybrid IT model in which applications move between a public cloud and their own internal data centers,” she explains. Other factors coming into play and contributing to the optimism, according to Morgan Stanley, include more cash being available because of tax law changes in the U.S. and advantages to depreciating equipment costs in the first year due to economic growth. A weak dollar and lower memory costs are also helping the shift.


How digital service providers should prepare for the NIS Directive

Last year, the European Commission published a draft implementation regulation for DSPs, which Elizabeth Denham, the UK’s information commissioner, commented on. She criticised “the overly rigid parameters” of the regulation, which “may be undesirable and may lead to a failure to report incidents which nevertheless have a substantial impact on the users of the service and which should, by the nature of the impact, be considered for regulatory action”. The European Commission has since approved the final draft, and the UK government has released the findings of a public consultation on how it should implement and regulate the NIS Directive. IT Governance has also published a compliance guide. Each of these documents will help you understand where the NIS Directive fits into the cyber security landscape. DSPs will have to be particularly organised, as they are expected to define their own information security measures proportionate and appropriate to the potential risks they face.


CIOs ill-prepared for IT changes to enable digital business transformation


The Hackett Group reported that 64% of respondents lack confidence in their IT organisation’s capability to support transformation execution. This is all the more worrying given Hackett’s analysis, which predicted that in 2018, IT’s workload will increase by more than the number of full-time employees in IT. The Hackett Group suggested this would mean that IT needs a 2% productivity boost, on average, just to keep pace. However, it said the largest percentage increases in workload (5%) and IT staff (4.2%) are happening outside corporate IT. Instead, business groups appear to be investing in their own IT capabilities. Hackett’s benchmark study said: “Digital transformation goals are at least partial motivators for this, in that IT needs to help business units transform and differentiate customer experiences, locating IT resources closer to the end-customer’s facilities.”


The Irrational Exuberance That is Blockchain

In 2017, we saw some evolution on that front as blockchain platforms such as Hyperledger Fabric announced new versions closer to enterprise use and Ethereum progressed towards making these solutions perform and scale to suit enterprise needs. However, the exuberance has also led to new levels of hucksterism. For example, we have seen companies with dubious blockchain abilities add blockchain to their name or business to try to increase their stock price. In response, the U.S. Securities and Exchange Commission (SEC) said it will crack down on such companies. It is critical at this stage in blockchain’s evolution that hype is recognized, and the emergent nature of the technology and its capabilities are clearly understood. ... Gartner does not expect large returns on blockchain until 2025, which means that today companies will have to try different blockchain projects to determine if there is value for them in blockchain — that is, whether there will be new revenue possibilities, cost savings or improvements in their customers’ user experience.


AI Is Now Analyzing Candidates' Facial Expressions During Video Job Interviews


Have you ever lied during a job interview? Most of us have, at least a little. But next time around, artificial intelligence may be watching your face's every move, assessing the honesty of your answers, as well as your emotions in general. It may also try to determine whether your personality is a good fit for the job. ... Applicants, who often find the company's job opportunities through Facebook or LinkedIn, can skip uploading their resumés and simply use their LinkedIn profiles if they wish. They then spend about 20 minutes playing a dozen neuroscience-based games intended to evaluate their personalities for such things as embracing or avoiding risk, to see if their personalities are a good fit for the particular job.  Then they perform a video interview, with preset questions, which they can do on a smartphone or tablet as well as a computer. That's where AI comes in, measuring their facial expressions to capture their moods and further assess their personality traits.


6 Experts Discuss How AI Will Change The Future Of Wall Street (Part 1)

The technology behind AI has been around for more than 40 years, but for AI to work one needs two other ingredients: massive computing power at a reasonable price and massive amounts of data to train the AI. ... The biggest issue is the aversion of asset owners to “black box” strategies. Many consider AI as another version of algorithmic trading (to some extent this is true), and algorithmic strategies have not performed well in the past. While investors are comfortable with having AI playing an important role in many parts of their lives, they seem to prefer human judgment to AI when it comes to the investment process. Another potential obstacle is that an AI approach to trading requires a whole new organization structure for trading operations. While it is desirable to put discretionary traders in silos to reduce group thinking and correlations among traders, this approach will backfire when applied to AI trading, which requires a team effort to test thousands of strategies in order to pick the best. 


HSBC ready to do live trade finance transactions on blockchain

It’s worth noting, however, that the technology is still a long way from commercial use, for HSBC at least. As well as developing the platform and the solution, a network must be in place so that the full transaction can be completed on the blockchain, which means on-boarding other banks, regulators, customs and all parts of the trade cycle. “We see that developing throughout the year so that in 2019, around the same time, we should be in a position to have both the network of banks, corporates and others, and the app ready to use on a wider scale,” Kroeker said. Meanwhile, the bank is hoping that its adventures in blockchain will leave it well-placed to cater for the “digital natives” in Asean, which is projected to be one of the world’s growth hubs for digital services over the coming years. The press conference was called to discuss the bank’s digital agenda in the region, which is shaping up to be an online battleground in the years to come.


10 Common Mistakes To Avoid In Fintech Software Development

Financial Technology, or FinTech, is a relatively new aspect of the financial industry that focuses on applying technology to improve financial activities. It has the potential to open the doors to new kinds of applications and services for customers, as well as more competitive financial technology. However, as with all new technologies, there are mistakes lurking. In contrast to software domains like end-user web apps or mobile application development, a software bug in FinTech may not just lead to annoyed users. In the wrong piece of software, bugs can result in hundreds of millions of dollars lost. The list below covers some of the most common mistakes we see in software projects in general—and FinTech software development in particular—that you should watch out for when launching into the FinTech sector.


The future of IoT device management

One potential vision for the future of consumer IoT – one which might be a lot more appealing to consumers – involves IoT devices whose identity and firmware are managed using a standardized process and entirely independently from the application layer service. When you buy a connected consumer IoT device, you should be able to securely associate that device’s identity with your personal identity and securely manage its software and firmware using a familiar, standardized workflow supported by all device vendors. This means that any consumer IoT device should be easily associated with any consumer IoT gateway that supports its protocols and be able to get to the device vendor’s management service. You then need a way to associate that device with any provider of application layer services that you choose. When you sign up for an application layer service, you should be able to easily allow the application to discover relevant IoT devices associated with this identity and provision them for use.
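The association-and-discovery workflow described above could be modeled roughly as follows. The registry design, method names and identifiers are all hypothetical, sketched only to make the flow concrete:

```python
# Sketch of a device-identity registry: devices are associated with an
# owner, the owner grants an application access, and the application can
# then discover only the devices it has been granted.
from dataclasses import dataclass, field

@dataclass
class DeviceRegistry:
    owners: dict = field(default_factory=dict)   # device_id -> owner_id
    grants: dict = field(default_factory=dict)   # (owner_id, app) -> device_ids

    def associate(self, device_id: str, owner_id: str) -> None:
        """Bind a device's identity to a personal identity."""
        self.owners[device_id] = owner_id

    def grant(self, owner_id: str, app: str, device_id: str) -> None:
        """Allow an application-layer service to use an owned device."""
        if self.owners.get(device_id) != owner_id:
            raise PermissionError("device not associated with this owner")
        self.grants.setdefault((owner_id, app), set()).add(device_id)

    def discover(self, owner_id: str, app: str) -> list:
        """What the application sees when it looks up the owner's devices."""
        return sorted(self.grants.get((owner_id, app), set()))

reg = DeviceRegistry()
reg.associate("thermostat-01", "alice")
reg.grant("alice", "energy-app", "thermostat-01")
print(reg.discover("alice", "energy-app"))
```

The key design point the passage argues for is visible here: device-to-owner association is managed independently of any application, and applications only ever see what the owner explicitly provisions.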



Quote for the day:


"When people talk, listen completely. Most people never listen." -- Ernest Hemingway


Daily Tech Digest - March 02, 2018

GitHub hit with the largest DDoS attack ever seen

GitHub explained how such an attack could generate vast amounts of traffic: "Spoofing of IP addresses allows memcached's responses to be targeted against another address, like ones used to serve GitHub.com, and send more data toward the target than needs to be sent by the unspoofed source. The vulnerability via misconfiguration described in the post is somewhat unique amongst that class of attacks because the amplification factor is up to 51,000, meaning that for each byte sent by the attacker, up to 51KB is sent toward the target," it said. GitHub said that, because of the scale of the attack, it decided to move traffic to Akamai, which could help provide additional edge network capacity. It said it is now investigating the use of its monitoring infrastructure to automate enabling DDoS mitigation providers and will continue to measure its response times to incidents like this -- with a goal of reducing mean time to recovery.



Load Testing Tool Must-Haves

One of the most dangerous moves software developers and testers can make is being lulled into a false sense of security. For example, when application features and performance levels meet expectations during pre-production, only to crash and burn when presented to real users in production. In that same vein, if your organization has any kind of performance testing strategy, chances are you're conducting load testing. However, you may not be truly emulating the real world behavior of your end users in your load tests. Realism in load tests, when overlooked, can cause a myriad of performance problems in production, and end users won't wait around. If you're not performing accurate and realistic load testing, you risk revenue loss, brand damage and diminished employee productivity. The solution: cloud-based load testing. Right off the bat, the cloud provides two major advantages to load and performance procedures that help testing teams better model realistic behavior: instant infrastructure and geographic location.
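As a rough illustration of load generation (not any particular tool's API), a minimal concurrent harness using only Python's standard library might look like this; `fake_request` stands in for a real HTTP call, and real tools layer ramp-up, think time and geographic distribution on top:

```python
# Minimal concurrent load generator: N simulated users firing requests
# in parallel, reporting average and worst-case latency.
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request(i: int) -> float:
    """Stand-in for an HTTP request; returns elapsed seconds."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated server latency
    return time.perf_counter() - start

def run_load(users: int, requests_per_user: int):
    """Run users * requests_per_user requests across a thread pool."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = list(pool.map(fake_request, range(users * requests_per_user)))
    return sum(latencies) / len(latencies), max(latencies)

avg, worst = run_load(users=10, requests_per_user=5)
print(f"avg={avg:.3f}s max={worst:.3f}s")
```

The gap between this sketch and realistic load testing is exactly the article's point: real users arrive from many locations, pause between actions and follow varied paths, which is what cloud-based load testing is meant to reproduce.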


Building AI systems that work is still hard


Domain expertise, feature modeling and hundreds of thousands of lines of code can now be beaten with a few hundred lines of scripting (plus a decent amount of data). As mentioned above: that means that proprietary code is no longer a defensible asset when it’s in the path of the mainstream AI train. Significant contributions are very rare. Real breakthroughs or new developments, even a new combination of the basic components, are possible only for a very limited number of researchers. This inner circle is much smaller than you might think. Why is that? Maybe it’s rooted in its core algorithm: backpropagation. Nearly every neural network is trained by this method. The simplest form of backpropagation can be formulated in first-semester calculus — nothing sophisticated at all. In spite of this simplicity — or maybe for that very reason — in more than 50 years of an interesting and colorful history, only a few people have looked behind the curtain and questioned its main architecture.
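The claim that the simplest backpropagation is first-semester calculus can be made concrete. A minimal sketch: one input, one hidden unit, one output, with every gradient obtained by the chain rule alone:

```python
# Backpropagation in its simplest form: a two-weight network trained on
# a single example by repeated application of the chain rule.
import math

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

x, target = 1.0, 0.0     # one training example
w1, w2 = 0.5, -0.3       # the network's only parameters
lr = 0.5                 # learning rate

for _ in range(200):
    # forward pass
    h = sigmoid(w1 * x)
    y = sigmoid(w2 * h)
    loss = 0.5 * (y - target) ** 2
    # backward pass: chain rule, layer by layer
    dy = (y - target) * y * (1 - y)   # dLoss/d(pre-activation of output)
    dw2 = dy * h
    dh = dy * w2
    dw1 = dh * h * (1 - h) * x
    # gradient-descent update
    w1 -= lr * dw1
    w2 -= lr * dw2

print(f"final loss: {loss:.6f}")
```

Each `d…` line is one application of the derivative of a composed function; nothing beyond the chain rule and the sigmoid's derivative `y * (1 - y)` is needed, which is the article's point.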


Another massive DDoS internet blackout could be coming your way

While older, more established companies are still more likely to host their own DNS, the emergence of cloud as infrastructure means that newer companies are outsourcing everything to the cloud, including DNS. "The concentration of DNS services into a small number of hands...exposes single points of failure that weren't present under the more distributed DNS paradigm of yesteryear (one in which enterprises most often hosted their own DNS servers onsite)," John Bowers, one of the report's co-authors, tells CSO. "The Dyn attack offers a perfect illustration of this concentration of risk--a single DDoS attack brought down a significant fraction of the internet by targeting a provider used by dozens of high profile websites and CDNs [content delivery networks]." The shocking part of this report is that despite the clear danger this concentration poses, too few enterprises have bothered to implement any secondary DNS.


Zero-Day Attacks Major Concern in Hybrid Cloud

Despite their growing reliance on containers, many businesses will continue to at least partially rely on legacy systems for years to come, he continues. Security becomes a challenge when multiple users are accessing multiple environments from multiple different locations. The biggest hybrid cloud security challenge is maintaining strong, consistent security across the enterprise data center and multiple cloud environments, says Cahill. Businesses want consistency; they want to be able to centralize policy and security controls across both. Security teams also struggle to maintain the pace of cloud, an increasingly difficult challenge as cloud continues to accelerate. It used to be that cloud adoption was slowed by security, Cahill points out. Now, containers are driven by the app development team. Security has to keep up. "One of the things we know about cloud computing in general, and about DevOps, is it's all about moving fast," he points out.


Can APIs Bridge the Gap between Banks and Fintechs?

Fintech companies are forcing banks to go beyond their comfort zone, innovate and accept change as a way of staying in business. With APIs handling the translation between legacy systems and new technologies, fintech companies can focus on providing more value to clients instead of learning about obsolete systems. Adopting a client-centric vision helps both banks and fintech companies fulfill their goals. For example, a bank doesn’t offer its corporate clients the ability to compare their yearly financial results with the industry average, but a fintech company can make that its value proposition and, by cooperating with the bank through an API, help them learn more about their results. For the bank, it doesn’t make sense to create such a niche service, while the fintech’s algorithm is useless without the proper big data input. International organizations and forums support this collaboration between banks and fintech companies since it brings added value to the client.
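The niche benchmarking service described could be sketched as a small function. The field names and figures are invented for illustration; in practice the client and industry data would arrive via the bank's API:

```python
# Toy industry-benchmark calculation: compare a client's yearly revenue
# against the average of peer figures supplied by the bank.
def benchmark(client_revenue: float, industry_revenues: list) -> dict:
    """Return the industry average and the client's deviation from it (%)."""
    avg = sum(industry_revenues) / len(industry_revenues)
    return {
        "industry_avg": avg,
        "client_vs_avg_pct": round((client_revenue - avg) / avg * 100, 1),
    }

print(benchmark(1_200_000, [900_000, 1_000_000, 1_100_000]))
```

The arithmetic is trivial; the value lies in the data access, which is exactly why the bank-fintech API partnership makes sense for both sides.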


Cloud firms need $1bn datacentre investment a quarter to compete with AWS and co


“If companies can’t find at least a billion dollars per quarter for datacentre investments and back that up with an aggressive long-term management focus, then the best they can achieve is a tier-two status or a niche market position,” he said. Dinsdale’s comments coincide with the publication of Synergy’s research into how much capital expenditure (capex) the hyperscale cloud firms pumped into their operations in 2017. Its findings are based on an analysis of the capex and datacentre footprint of the world’s 24 biggest cloud and internet service firms. This reveals that the hyperscale community collectively spent $75bn in capex during 2017, which is 19% up on the previous year. Of that $75bn, $22bn was paid out in the fourth quarter alone. Most of the capex is channeled towards helping the hyperscale cloud firms expand and upgrade their datacentres, with Amazon, Apple, Facebook, Google and Microsoft name-checked by Synergy as being top five biggest spenders, accounting, in aggregate, for more than 70% of capex spend in the fourth quarter.


The Banking Industry Sorely Underestimates The Impact of Digital Disruption

Many organizations associate being a ‘Digital Bank’ with the development and deployment of their mobile banking application. Others look at the digital transformation from a sales or marketing perspective. The reality is that digital transformation goes beyond the way a financial services organization deploys its services across digital devices. Even though by 2025, more than 20 billion devices will be connected, the real power of these connections comes from the insight these connections produce. Use of this data, combined with advanced analytics, can change the level of back office automation, connectivity, decision making and existing business models. “Lacking a clear definition of digital, companies will struggle to connect digital strategy to their business, leaving them adrift in the fast-churning waters of digital adoption and change,” states McKinsey. “What’s happened with the smartphone over the past ten years should haunt everyone - since no industry will be immune.”


AI will create new jobs but skills must shift, say tech giants


“For sure there is some shift in the jobs. There’s lots of jobs which will. Think about flight attendant jobs before there was planes and commercial flights. ... So there are jobs which will be appearing of that type that are related to the AI,” he said. “I think the topic is a super important topic. How jobs and AI is related — I don’t think it’s one company or one country which can solve it alone. It’s all together we could think about this topic,” he added. “But it’s really an opportunity, it’s not a threat.” “From IBM’s perspective we firmly believe that every profession will be impacted by AI. There’s no question. We also believe that there will be more jobs created,” chimed in Bob Lord, IBM’s chief digital officer. “I firmly believe that augmenting someone’s intelligence is going to get rid of… the mundane jobs. And allow us to rise up a level. That we haven’t been able to do before and solve some really, really hard problems.”


How to build skills that stay relevant instead of chasing the latest tech trends

Knowledge about core functions of the software would eventually be available from a broad pool of people, driving down wages unless you were willing to participate in the "arms race" of always learning the latest and greatest. What became quickly apparent was that the people who succeeded in this area were those who were the most adaptable and able to sense where the market was going, so they could retool their skillset based on what was hot at any given time. The individual who was a supply chain specialist a couple of years ago might now be an accounts payable expert, based on the demand for a particular skillset. These individuals had developed a core talent—the ability to sense where the market for this software package was going—and combined it with an ability to rapidly learn and apply the new technical elements of that software. While those focused on deepening their skills were seeing the market pass them by, the talent-focused individuals happily abandoned and changed skills in order to stay relevant.



Quote for the day:


"Leaders are more powerful role models when they learn than when they teach." -- Rosabeth Moss Kantor


Daily Tech Digest - March 01, 2018

We're thinking right now about how we can create a platform or partner with folks to create a platform that offers a truly open access environment to technologists and startups and existing companies who have smart cities projects to make this platform accessible to all of them. And in that platform create the opportunity to exchange data between them to potentially have interoperation between them. So, what I mean is, can your payment at a parking meter tell the street light that you're there and accomplish some action? Can we have trash cans interact with other pieces of street furniture that are responsive to what is happening around them? I know those are fairly conceptual, but the idea is, can we take our position and facilitate the interaction between the agencies who are focused on, as they should be, accomplishing their independent missions? ... Some other cities are now doing similar things, and there's some conversation about a city operating system that is similar to what I'm thinking about.
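The parking-meter-to-streetlight example boils down to publish/subscribe messaging between devices that were never designed to talk to each other. A toy in-memory event bus makes the idea concrete (the topic names and payloads here are invented for illustration; a real deployment would use a broker such as MQTT):

```python
from collections import defaultdict

class CityEventBus:
    """Toy publish/subscribe bus illustrating cross-device messaging."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Deliver the event to every handler registered for this topic.
        for handler in self.subscribers[topic]:
            handler(event)

bus = CityEventBus()
actions = []

# A streetlight reacts when a nearby parking meter reports a payment.
bus.subscribe("parking/payment", lambda e: actions.append(f"light {e['meter']} on"))

bus.publish("parking/payment", {"meter": "5th-ave-12"})
```

The meter never needs to know a streetlight exists; it only publishes to a shared topic, which is what makes the "open access" platform idea plausible across independent agencies.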



Journey to the Cloud: Overcoming Security Risks

As for detective and monitoring security tools, most large IaaS vendors provide virtual networking capability, which the consultancy tapped for packet capture and analysis. PaaS vendors are used differently, but most provided detailed audit logs on user logins and actions, which the consultancy needed for audit purposes. Some large IaaS vendors also provided additional monitoring alarms to help with pesky things like developers accidentally dropping authentication credentials into public code repositories. One major challenge for the consultancy was dealing with different cloud environments. Some cloud vendors with multiple offerings can have different knobs and gauges for their varying services. The consultancy’s security operations team would learn how to lock down and monitor something in one service area, only to find that things worked much differently in another.


Pizza Hut customers can now pay for meals with Mastercard Qkr mobile app


“Over the past six years we have invested over £60m in transforming our restaurants and menu, and this allows us to continue to improve the service and experience we offer our guests, as well as embracing technology, which has become so central to modern culture.” Betty DeVita, chief commercial officer at Mastercard Digital Payments & Labs, said Qkr would allow Pizza Hut to accommodate more customers without having to rush them. “By removing the headache of managing bills, it will allow their staff to focus more on service,” she added. Merchants can also add delivery and takeaway options for customers through the app, as well as targeted promotions and rewards schemes. Mastercard said organisations have been using its application programming interfaces (APIs) to create specific brand experiences for customers at the table as well.


Staff awareness is the financial industry’s biggest cybersecurity concern

The report urges CISOs to prioritize employee training regardless of their reporting structure, as employees are organizations’ first line of defense and their biggest vulnerability. “Employee training should include awareness about downloading and executing unknown applications on company assets, and in accordance with corporate policies and relevant regulations, and training employees on how to report suspicious emails and attachments,” the report says. Knowing where to begin with employee training can be tough, which is why IT Governance provides an Information Security Staff Awareness E-learning Course. This course can be deployed across your organization to help anyone involved in information security understand how to stay secure. It aims to reduce the likelihood of human error by familiarizing employees with security policies and procedures, covering topics such as password security, creating backups, information security incidents, and business continuity.


How to protect Macs from malware threats

As malware threats increase in number and frequency, the next big attack could be looming just beyond the horizon. Which OS is the safest? I will give you a hint: if you believe it is Apple, that type of thinking might be what leads your Mac to become one of the next victims. Malware attacks against Apple computers have been growing exponentially, in some cases faster than attacks on other platforms. While the volume of these attacks has been rather low compared to those targeting its competitors, Apple's massive popularity and growing market share have shifted threat actors' focus to its popular line of computing devices in an effort to cash in (literally) on this growing target. Even the biggest malware attacks may have small beginnings, and threats targeting Apple devices will continue to proliferate unless users protect their devices by adhering to the following tips in conjunction with best practices for data and network security.


SaaS support challenges IT ops admins to shift gears


SaaS support doesn't introduce new problems for IT -- we've all dealt with browser plug-in support changes, internet connection issues and application upgrades. What SaaS changes is when these issues occur. Modern IT organizations get things done by adapting quickly, but behind the scenes, they have some notification of upgrades and changes. Testing, staff training and communications are planned out ahead of time, which dramatically lessens the disruption the changes cause to users and management. SaaS-based apps shorten the support lead time. There's also a risk that a SaaS update isn't compatible with an enterprise's setup and that there are no viable alternatives. Prepare contingencies, and be ready to make adjustments after updates. SaaS support requires skill from IT operations. Things that once were minor systems quirks are now critical. IT staff are in a weaker position to control changes, and the safety nets of testing and preproduction don't work as they did for software hosted and managed in-house.


Is your vendor being honest about AI?

“True AI is about the future. AI says, ‘I don’t know what this is, but we’ve seen something similar so we will flag it.’ Or, ‘We’ve never seen this before, it’s an anomaly, so we will flag it.’ The key difference between rules engines and AI is where they are focused. Rules are IF-THEN decisions based on past data. AI is all about recognizing anomalies simply because they are new. We are interested when the machine says, ‘I don’t know. I haven’t seen this before.’ This is when AI is the most powerful and useful.” Laurent offered, “A key way to tell the difference between AI and rules-based engines is that a rules-based engine will never improve on its own until someone updates the rules. AI improves its accuracy the more it is used. The more you use it the better it becomes. The adaptability of the model is what makes AI work.” Yuri strongly agreed, "Rules are basically in the past. The machine [AI] can predict the future."
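The distinction the interviewees draw can be caricatured in a few lines of code: a rules engine only fires on conditions someone wrote down in advance, while an anomaly detector reacts to values unlike anything in its history. The z-score test below is a deliberately crude stand-in for real AI models, and the rule and sample data are invented:

```python
import statistics

def rules_engine(event, rules):
    """IF-THEN decisions written in advance from past data."""
    return [action for condition, action in rules if condition(event)]

def anomaly_flag(value, history, threshold=3.0):
    """Flag values far from anything seen before: a crude stand-in for
    'we haven't seen this, so we will flag it'."""
    mean = statistics.mean(history)
    spread = statistics.pstdev(history) or 1.0
    return abs(value - mean) / spread > threshold

# The rules engine stays silent unless a hand-written condition matches.
rules = [(lambda e: e["amount"] > 10_000, "review")]
print(rules_engine({"amount": 5_000}, rules))   # [] -- no rule fires

# The anomaly check needs no rule for 5,000; it is simply nothing like history.
history = [100, 102, 98, 101, 99]
print(anomaly_flag(5_000, history))             # True -- novel, so flagged
```

The rules engine never improves until someone edits `rules`; the anomaly check automatically shifts its notion of "normal" as `history` grows, which is the adaptability Laurent describes.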


TiDB: Performance-tuning a distributed NewSQL database

TiDB is an open source, hybrid transactional/analytical processing (HTAP) database, designed to support both OLTP and OLAP scenarios. One TiDB cluster has several TiDB servers, several TiKV servers, and a group of Placement Drivers (PDs), usually three or five nodes. The TiDB server is a stateless SQL layer, the TiKV server is the key-value storage layer, and each PD is a manager component with a “god view” that is responsible for storing metadata and doing load balancing. Below is the architecture of a TiDB cluster. You can find more details on each component in the official TiDB documentation. We gather a lot of metrics inside each TiDB component. These are periodically sent to Prometheus, an open source system monitoring solution. You can easily observe the behaviors of these metrics in Grafana, an open source platform for time series analytics. If you deploy the TiDB cluster using Ansible, Prometheus and Grafana will be installed by default.
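For a sense of how such a monitoring pipeline is wired up, here is a sketch of a Prometheus scrape configuration covering the three component types. The job names, hostnames, and ports below are illustrative placeholders, not authoritative values; an Ansible-based TiDB deployment generates the real monitoring configuration for you.

```yaml
# Illustrative Prometheus scrape config for a TiDB cluster.
# Hostnames and ports are placeholders for this sketch.
scrape_configs:
  - job_name: "tidb"          # stateless SQL layer
    static_configs:
      - targets: ["tidb-server:10080"]
  - job_name: "tikv"          # key-value storage layer
    static_configs:
      - targets: ["tikv-server:20180"]
  - job_name: "pd"            # placement drivers (metadata, load balancing)
    static_configs:
      - targets: ["pd-server:2379"]
```

Grafana then reads from Prometheus, so per-component dashboards fall out of this setup with no extra instrumentation work.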


The future of work: How to thrive through IT’s latest revolution

Kim Smith, venture strategist and chief innovation officer at IBM, likens such employees to the early NASA employees portrayed in the movie Hidden Figures. Back then, “computer” was a job title, not a piece of office equipment, and it was the job held by the movie’s central characters. Then NASA acquired a mainframe capable of replacing a building full of human computers “so they taught themselves Fortran,” Smith says. To be successful in the future, your company must support, encourage and enable lifelong retooling. At IBM, it means giving people access to training and allowing them to rotate in and out of jobs and departments, she explains. “They can be in one role for a period of time, then go to something completely different.” “I think expectations are going to morph,” Burns adds. “Tech professionals need to be more forward thinking. A lot of the ones I’ve seen were order takers, and we have to get away from that world. We have to help disrupt industries rather than letting our organizations be disrupted.”


Top 10 Lessons in Building a Distributed Engineering Team


One pointed question that came up early on was: how do we communicate our core values to people who are not in the office? As it turns out, instilling the company's and teams' principles in remote employees is actually no more difficult than with local ones. We decided to bring people into the office for their first week. Additionally, we get together every quarter with the whole team for working sessions and team building activities. Culture is what you do when nobody's looking; for remote employees, that means a lot of opportunities to exercise the company culture. In our experience, we've found that shared values prevail regardless of physical location. By now you might be wondering whether a distributed workforce is actually practical, and that's a valid question. How can you guarantee a culture that fosters innovation even though employees aren’t in the same room? In the past, companies often claimed that having everyone under the same roof was the only way to innovate. Nowadays, the story has changed.



Quote for the day:


"Technology makes it possible for people to gain control over everything, except over technology" -- John Tudor


Daily Tech Digest - February 28, 2018

The questions are sometimes simple, but by no means always. Many questions can be summarized as “What is this?” However, only 2 percent call for a yes-or-no answer, and fewer than 2 percent can be answered with a number. And there are other unexpected features. It turns out that while most questions begin with the word “what,” almost a quarter begin with a much more unusual word. This is almost certainly the result of the recording process clipping the beginning of the question. But answers are often still possible. Take questions like “Sell by or use by date of this carton of milk” or “Oven set to thanks?” Both are straightforward to answer if the image provides the right information. The team also analyzed the images. More than a quarter are unsuitable for eliciting an answer, because they are not clear or do not contain the relevant info. Being able to spot these quickly and accurately would be a good start for a machine vision algorithm.
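A rough sketch of this kind of dataset analysis, tallying first words and crudely detecting yes/no phrasing, fits in a few lines. The sample questions and the keyword list below are invented for illustration, not taken from the dataset:

```python
def question_stats(questions):
    """Tally question first words and crude yes/no phrasing."""
    first_words = {}
    yes_no = 0
    for q in questions:
        words = q.lower().split()
        if not words:
            continue
        first_words[words[0]] = first_words.get(words[0], 0) + 1
        # Questions opening with an auxiliary verb usually expect yes/no.
        if words[0] in {"is", "are", "does", "do", "can"}:
            yes_no += 1
    return first_words, yes_no

qs = ["What is this?", "What color is this shirt?", "Is this milk expired?"]
counts, yn = question_stats(qs)
```

Even a heuristic this simple surfaces the pattern described above: "what" dominates the openings, while explicit yes/no questions are a small minority.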



Memcached Servers Being Exploited in Huge DDoS Attacks

Security researchers have previously warned about Internet-facing Memcached servers being open to data theft and other security risks. Desler theorizes that one reason attackers have not used Memcached as an amplification vector in DDoS attacks previously is simply because they have not considered it, not because of any technical limitations. Exploiting Memcached servers is new as far as real-world DDoS attacks are concerned, says Chad Seaman, senior engineer with Akamai's Security Intelligence Response Team. "A researcher had theorized this could be done previously," Seaman says. "But as Memcached isn't meant to run on the Internet and is a LAN-scoped technology that is wide open, he thought it could really only be impactful in a LAN environment." But the use of default settings and reckless administration overall among many enterprises has resulted in a situation where literally tens of thousands of boxes running Memcached are on the public-facing Internet, Seaman says.
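What makes Memcached so dangerous as a reflector is the bandwidth amplification factor: a tiny spoofed UDP request can elicit a response many thousands of times larger, all aimed at the victim whose address was forged. A back-of-the-envelope sketch (the byte counts below are illustrative, not measured values):

```python
def amplification_factor(request_bytes, response_bytes):
    """Bandwidth amplification: bytes reflected at the victim per byte sent."""
    return response_bytes / request_bytes

# Illustrative numbers: a ~15-byte spoofed UDP 'get' request eliciting a
# large cached value. Published estimates for Memcached amplification
# ran into the tens of thousands.
factor = amplification_factor(15, 750_000)
```

At ratios like this, even a modest botnet can multiply its outbound bandwidth into a terabit-scale flood, which is why an exposed, UDP-enabled default configuration is so hazardous.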


Firms failing to learn from cyber attacks

The survey findings suggest security inertia has infiltrated many organisations, with an inability to repel or contain cyber threats and the resultant impact on the business. This inertia is reflected in the fact that 46% of respondents said their organisation cannot prevent attackers from breaking into internal networks every time it is attempted, 36% said that administrative credentials are stored in Word or Excel documents on company PCs, and half admitted their customers’ privacy or PII (personally identifiable information) could be at risk because their data is not secured beyond the legally-required basics. The report notes that the automated processes inherent in cloud and DevOps mean that privileged accounts, credentials and secrets are being created at a prolific rate. If compromised, the report said these can give attackers a crucial jumping-off point to achieve lateral access to sensitive data across networks, data and applications or to use cloud infrastructure for illicit crypto mining activities.

While the “shift to Teal” is a more big-picture view, there is an interesting perspective on self-organization in teams and organizations which holds that organizations with self-organizing teams actually still have leaders and leadership. This perspective brings the big-picture view above into focus in individual organizations and companies. It is discussed in a book by Lex Sisney titled “Organizational Physics - The Science of Growing a Business”. Sisney proposes that in reality, instead of having top-down or bottom-up organization, some of the newest and most adaptable organizations are actually “Design-Centric” organizations. ... So the leadership shift is not a choice of top-down or bottom-up, but rather one where the leader designs a system within the organization that allows teams to self-organize and to be empowered to deliver the organization’s objectives. If this is done well, there is little need for the leader to intervene in the organization or system, because the people and teams are able to effectively lead and guide the organization themselves.


14 top tools to assess, implement, and maintain GDPR compliance

The European Union’s General Data Protection Regulation (GDPR) goes into effect in May 2018, which means that any organization doing business in or with the EU has six months from this writing to comply with the strict new privacy law. The GDPR applies to any organization holding or processing personal data of E.U. citizens, and the penalties for noncompliance can be stiff: up to €20 million (about $24 million) or 4 percent of annual global turnover, whichever is greater. Organizations must be able to identify, protect, and manage all personally identifiable information (PII) of EU residents even if those organizations are not based in the EU. Some vendors are offering tools to help you prepare for and comply with the GDPR. What follows is a representative sample of tools to assess what you need to do for compliance, implement measures to meet requirements, and maintain compliance once you reach it.
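The penalty rule quoted above, €20 million or 4 percent of annual global turnover, whichever is greater, is simple enough to express directly:

```python
def max_gdpr_fine(annual_global_turnover_eur):
    """Upper tier of GDPR penalties: EUR 20M or 4% of annual global
    turnover, whichever is greater."""
    return max(20_000_000, 0.04 * annual_global_turnover_eur)

# For smaller firms the flat EUR 20M floor dominates; above EUR 500M in
# turnover, the 4% term takes over.
max_gdpr_fine(100_000_000)    # EUR 20M floor applies
max_gdpr_fine(5_000_000_000)  # 4% = EUR 200M applies
```

The "whichever is greater" clause is the point: the exposure scales with the size of the business, so large organizations cannot treat the fine as a fixed cost of doing business.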

Chris Webber, a security strategist with SafeBreach, says configuration errors are one of the most frequently occurring issues with NGFWs. “Many users get tripped up if they only rely on vendor-supplied defaults,” Webber said. “A next-generation firewall can be like having a Swiss army knife on your network, but many times its features aren’t turned on, which lets attackers gain access.” Webber also noted that most vendors provide auto-migration tools to help new customers migrate from their legacy firewalls to NGFWs but that errors may occur during this process, as vendor features and architecture can vary. SafeBreach said it has discovered breach scenarios due to these policy gaps and errors resulting from assumptions about new NGFW vendor default policies and auto-migration challenges. Another issue is that many users don’t decrypt encrypted traffic like SSL, TLS, and SSH, which can become a major blind spot for customers, Webber said.

The future of every type of ambitious commercial business, whether it’s a factory making products, a bank loaning money, an IT support shop helping users, a grocery store selling goods, a law firm prepping available information for its client cases, an analyst firm producing insight… is to perform its business operations with the optimum balance of talent, so it can maximise its immediate profits, with an eye on the future to stay ahead of the competition. As soon as someone’s output is predictable, taking inputs from various sources to produce outputs, you can start to figure out how to program software and machines to perform said tasks – and computers will always be cheaper than humans, once they are functional and can do the job. So our goal has to be about furthering our abilities, not only to get the basics of our jobs done, but to immerse ourselves into helping our colleagues and bosses figure out the what next. Because if we only focus on the now, we are eventually going to render ourselves predictable and replaceable.


Virtual Private Networks: Why Their Days Are Numbered

VPNs require an array of equipment, protocols, service providers and topologies to be successfully implemented across an enterprise network – and the complexity is only perpetuated as networks grow. Purchasing the excess capacity and new Multiprotocol Label Switching (MPLS) connections needed to support effective VPNs can weigh heavily on IT budgets, while managing these networks will require greater reliance on personnel. Rather than limit the number of devices on their networks, organizations need to seek out solutions that simplify network management as companies continue embracing mobile and remote workforces. Even businesses that continue to rely on VPN or backhaul networks to protect their data need to employ a defense-in-depth approach to security, since VPNs, on their own, only offer the baseline protections of a standard web proxy.  As more solutions move to the cloud and enterprises rely less and less on physical servers and network connections, the need for VPNs will eventually evolve, if not disappear altogether.

From a security standpoint, what you really want is to be alerted when employees do something suspicious. User behavior analytics (UBA) are a smarter way to sniff out anomalies in users' actions and flag them for further investigation. Companies like IBM and Varonis have developed advanced UBA tools that can detect unusual activity. Is an employee trying to access a file they shouldn’t? Maybe they’re downloading something at 3:00am from a location that isn’t their home. Perhaps they’re trying to move laterally between systems. The beauty of UBA is that it highlights malicious insiders and outsiders using stolen credentials equally well, though it may require further investigation to determine which is which. If you’re going to go to the trouble of monitoring your employees, then maybe you should extract more value from the data you collect. There’s a new breed of software that offers the same potential security protections to ensure compliance but focuses on the end user experience and how it might be improved to remediate issues as they happen.
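A minimal sketch of the idea behind UBA, flagging actions that fall outside a user's established pattern, might look like the following. The hour window and location set are invented placeholders; real products such as those from IBM and Varonis model behavior statistically per user rather than with fixed rules:

```python
def flag_suspicious(events,
                    usual_hours=range(8, 19),
                    known_locations=frozenset({"home", "office"})):
    """Toy UBA check: flag events outside a user's normal hours or places."""
    flagged = []
    for event in events:
        if (event["hour"] not in usual_hours
                or event["location"] not in known_locations):
            flagged.append(event)
    return flagged

events = [
    {"user": "alice", "hour": 10, "location": "office"},
    {"user": "alice", "hour": 3, "location": "cafe"},  # 3:00am, unknown place
]
flag_suspicious(events)  # only the 3:00am event is flagged
```

As the article notes, a flag like this says only that something is anomalous; further investigation is needed to tell a malicious insider from an outsider using stolen credentials.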

Monitoring the state of an application is important during development and in production. With a monolithic application, this is rather straightforward, since one can attach a native debugger to the process and have the ability to get a complete picture of the state of the application and its evolution. Monitoring a microservice-based application poses a greater challenge, particularly when the application is composed of tens or hundreds of microservices. Due to the fact that any request may involve being processed by many microservices running multiple times -- potentially on different servers -- it is exceptionally difficult to follow the “story” of the application and identify the causes of problems when they arise. Currently, the main methodology relies on obtaining a trace of all transactions and dependencies using tools that, for example, implement the OpenTracing standard. These tools capture timing, events, and tags, and collect this data out-of-band (asynchronously). 
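To make the tracing idea concrete without depending on any particular vendor, here is a toy span object capturing the essentials an OpenTracing-style tracer records: timing, a trace ID shared across services, and parent/child links. This is a conceptual sketch, not the OpenTracing API:

```python
import time
import uuid

class Span:
    """Minimal stand-in for a tracing span: records timing, a shared
    trace ID, and its parent operation."""
    def __init__(self, operation, trace_id=None, parent=None):
        self.operation = operation
        # Every span in one request shares the root's trace ID.
        self.trace_id = trace_id or uuid.uuid4().hex
        self.parent = parent.operation if parent else None
        self.start = time.monotonic()
        self.duration = None

    def finish(self):
        self.duration = time.monotonic() - self.start

# A request fans out across services but keeps a single trace ID,
# which is what lets tooling reassemble the "story" afterwards.
root = Span("http-request")
child = Span("inventory-lookup", trace_id=root.trace_id, parent=root)
child.finish()
root.finish()
```

Reassembling all spans that share one trace ID, ordered by start time and linked by parent, is exactly how tracing backends reconstruct a request's path across tens of microservices.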



Quote for the day:

"The mark of a great man is one who knows when to set aside the important things in order to accomplish the vital ones." -- Brandon Sanderson

Daily Tech Digest - February 27, 2018

Visual Studio Code joins the Anaconda Python data science toolkit

Microsoft’s relationship with Anaconda is intended to go further than Anaconda using R Open and Visual Studio Code. It’s also working with Anaconda to embed its data science tools inside SQL Server. Bringing interactive analytics tooling into the heart of a database is a sensible approach; and Microsoft has already started to put its own analytic tools there. But making that service dependent on an open source project that it doesn’t control is a big step forward for Microsoft. SQL Server is one of its flagship enterprise products, so bringing in a set of tools that update on a very different schedule could be an issue for many of Microsoft’s corporate customers. But with Anaconda a popular tool on data scientists’ desktops, it shouldn’t be too much of a stretch for users. If you don’t need it in a production database, you can always not install it, leaving the SQL Server/Anaconda combination for your data science team’s development environment.



7 transportation IoT predictions from Cisco

While many observers note that IoT technology evolves much faster than the vehicles and infrastructure it powers, Connor took the opposite viewpoint. “In fact,” he said, “the IoT data collected and analyzed from connected cars and infrastructures can help extend the life of these vehicles and the transportation system through predictive analytics and preventative maintenance. For example, by aggregating and analyzing traffic data from IoT sensors on streetlights, transportation agencies can determine which roads are most frequently traveled and service them first. Additionally, connected cars can alert drivers when maintenance is needed to keep the vehicles running smoothly. And with vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) connections optimizing routes, alleviating congestion and helping drivers avoid road hazards, there will be fewer accidents.”


Cryptojacking is the new malware

Serving as the gateway to the Internet, browsers have grown sophisticated over the years – and so have the hackers. Using easily accessible JavaScript libraries, hackers can inconspicuously inject code into even the most secure websites. When a user visits one of these infiltrated sites, they unknowingly run extra bits of code that let hackers use their device as part of a larger cryptomining operation. In several notable cases, scripts from the mining-software provider Coinhive, which bills itself as an alternative to advertising-based revenue, have been illicitly embedded on websites ranging from the Showtime television network to the Ecuadorian Papa John’s Pizza. Covert or overt, drive-by mining schemes are often invisible to users, yet the implications for the enterprise can be severe. Slower-performing computers hamper productivity, while the scripts running in the background can provide an open doorway for future malware or ransomware attacks.
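A simplistic static check for such injected miner scripts might scan page markup for known signatures. The patterns below are illustrative examples only; production scanners rely on large, curated signature feeds plus behavioral checks such as sustained CPU usage:

```python
import re

# Hypothetical signature list for this sketch; real tools use curated feeds.
MINER_PATTERNS = [
    re.compile(r"coinhive\.min\.js", re.IGNORECASE),
    re.compile(r"new\s+CoinHive\.Anonymous", re.IGNORECASE),
]

def looks_cryptojacked(html):
    """Crude static check for known in-browser mining script signatures."""
    return any(pattern.search(html) for pattern in MINER_PATTERNS)

page = '<script src="https://coinhive.com/lib/coinhive.min.js"></script>'
looks_cryptojacked(page)  # flags the embedded miner script
```

Signature scanning of this sort only catches known miners; since the scripts are "often invisible to users," defenders typically pair it with monitoring for the telltale background CPU load.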


Think Like An Attacker And Mitigate Cyber Threat


Crucially, the way that businesses often measure or prioritise their activity in terms of security is whether they will pass an audit. While it may be comforting to members of the business to meet these requirements, they often fall short of industry best practice, and significantly so. And let’s be honest – a hacker has no interest in whether an organisation has passed an audit, and neither will any customers impacted by a breach. On the one hand, meeting regulatory guidelines is often a good starting point for putting in place a sensible approach to security and data. However, simply ticking the box of compliance could well open organisations up to a range of threats. Instead, by ensuring that basic procedures are in place, organisations can build a more comprehensive strategy. This encompasses all the elements needed to support a more complex IT infrastructure and the flexibility to adapt to future changes in the IT landscape.


Making AI software smarter by adding human feedback


While more natural, human-based training does have incredible potential, it’s difficult to imagine this form of AI being used in business-centric processes such as the collection or analysis of current intelligence. You could not hope to trust a novice or “growing” system with such highly sensitive processes — or could you? It raises the question: what can IT professionals do to better incorporate AI into business intelligence processes so that it delivers safe, reliable results? Avanade, a joint venture between Microsoft and Accenture, offers a platform powered by the Cortana Intelligence Suite, meant to provide predictive analytics and data-based insights. Because it utilizes Cortana — Microsoft’s version of modern AI and voice assistant technology — it already benefits greatly from the existing platform. It hasn’t been done yet, but if Microsoft and Cortana’s developers were to introduce a form of human-based training for the platform, that information could be fed back into other areas of the technology, such as Avanade’s.


Security leaders investing in automation and AI, study shows


Applying machine learning can help to enhance network security defences and, over time, “learn” how to automatically detect unusual patterns in encrypted web traffic, cloud and internet of things (IoT) environments, the report said, adding that although they are still in their infancy, machine learning and AI technologies will mature. “Last year’s evolution of malware shows that adversaries are becoming wiser at exploiting undefended gaps in security,” said John Stewart, senior vice-president and chief security and trust officer at Cisco. “Like never before, defenders need to make strategic security improvements, technology investments, and incorporate best practices to reduce exposure to emerging risks.” However, the Cisco report coincided with a report by UK and US experts that warned that AI is also likely to be used by attackers, who are expected to not only use the technology to increase the effectiveness of attacks, but also to exploit weaknesses in AI technologies by poisoning data, for example.



How to get more women in IT jobs? Mandate an inclusive culture

It was a good and timely question -- especially the last part. Revelations about sexual harassment and cultural breakdown were trickling out of one of Silicon Valley's standouts -- ride-sharing pioneer Uber -- leading, eventually, to the resignation of its chief executive, Travis Kalanick. But does the answer to how to get more women in IT jobs and then ensure the workplace is a safe and welcoming one for them always depend on the CEO? I took up the topic with Kristi Riordan, COO at the Flatiron School, a coding boot camp in New York that offers scholarships to women who want to be part of the high-paying tech economy. To cultivate a good environment for women in technology, organizations need to sign onto that policy at the top, Riordan said. Senior leaders must be expected to establish an inclusive culture of respect and transparency.


What is a data scientist? A key data analytics role and a lucrative career

A data scientist’s main objective is to organize and analyze large amounts of data, often using software specifically designed for the task. The final results of a data scientist’s analysis need to be easy enough for all invested stakeholders to understand — especially those working outside of IT. A data scientist’s approach to data analysis depends on their industry and the specific needs of the business or department they are working for. Before a data scientist can find meaning in structured or unstructured data, business leaders and department managers must communicate what they’re looking for. As such, a data scientist must have enough business domain expertise to translate company or departmental goals into data-based deliverables such as prediction engines, pattern detection analysis, optimization algorithms, and the like.


India ranks 47th when it comes to inclusive Internet

Across the indexed countries, on average, men are 33.5 per cent more likely to have Internet access than women. "The gap is even larger in low-income countries, which have an average gender access gap of 80.2 per cent compared with 3.7 per cent among high-income countries," said Molly Jackman, Public Policy Research Manager at Facebook. The index assessed a country's Internet inclusion across four categories: availability, affordability, relevance and readiness. "Bringing people online can offer life-changing opportunities, but there are still approximately 3.8 billion people without Internet access. At Facebook, we're working to change that," added Robert Pepper, Head of Global Connectivity Policy at Facebook, in a blog post. "Global connectivity has increased 8.3 per cent and more people are connected than ever before. While this progress is encouraging, we are still far from achieving full Internet inclusivity," Pepper added.


Lenovo introduces new water-cooled server technology

Not only is it a cheaper method of cooling, but it’s more effective. Air cooling is only effective up to about 10 kilowatts of power in a server chassis, while water cooling can handle 70 kW or more. And the ThinkSystem SD650 is one seriously dense server tray. Each tray has two sockets, and up to 12 trays can be squeezed into one 6U NeXtScale n1200 enclosure. That translates to 24 Xeons, 9.2TB of memory, 24 SFF SSDs or 12 SFF NVMe drives, and 24 M.2 boot drives. Lenovo developed the cooling system with the Leibniz Supercomputing Center (LRZ) in Germany. Later this year, the center will deploy a 100-rack supercomputer consisting of 6,500 ThinkSystem SD650s with 26.7 petaflops of peak performance. That would make it the number three supercomputer on the Top500 supercomputer list as of November 2017, but there will undoubtedly be other contenders. The direct-water-cooled design allows for up to 90 percent heat recovery, meaning only 10 percent of the heat generated by the CPU has to be addressed with an air conditioner or fan.



Quote for the day:


"He who rejects change is the architect of decay." -- Harold Wilson