Daily Tech Digest - September 28, 2017

Professor Harish Bhaskaran of Oxford, who led the team, said: “The development of computers that work more like the human brain has been a holy grail of scientists for decades. Via a network of neurons and synapses the brain can process and store vast amounts of information simultaneously, using only a few tens of watts of power. Conventional computers can’t come close to this sort of performance.” Daniel C. Wright, a co-author from the Exeter team, added: “Electronic computers are relatively slow, and the faster we make them the more power they consume. Conventional computers are also pretty ‘dumb,’ with none of the in-built learning and parallel processing capabilities of the human brain. We tackle both of these issues here — not only by developing new brain-like computer architectures, but also by working in the optical domain to leverage the huge speed and power advantages of the upcoming silicon photonics revolution.”


Before you deploy OpenStack, address cost, hybrid cloud issues

Training can become an indirect OpenStack cost. IT and developer staff may not have the requisite skill sets to tackle an OpenStack deployment. You may need to find more OpenStack-savvy staff to handle the job, spend the money to train existing staff as Certified OpenStack Administrators, hire consultants to jump-start the work, or some combination of these tactics. Also consider the implications of OpenStack support. Organizations can certainly adopt a canned OpenStack distribution and associated support from vendors like Red Hat or Rackspace. If you acquire the open source software directly, however, there is no official support. If you choose to deploy OpenStack, assemble a suite of support resources to address inevitable questions or to resolve problems. Some resources are free, while others will incur added costs.


To combat phishing, you must change your approach

The threat surface is growing, and cybercriminals are becoming more sophisticated. They’re utilizing threat tactics that have made it increasingly difficult for organizations to protect themselves at scale. Cybercriminals are putting pressure on businesses by increasing the volume of these targeted attacks, dramatically outpacing even the world’s largest security teams’ ability to keep up. Visibility is sadly lacking within most of today’s organizations, and it’s unrealistic to expect security teams to secure something they can’t see. There’s no tool or widget that can totally fix this and make everything safe. But we can get to a point where we can construct a security program that reduces risk in a demonstrable way, and establish metrics for where an organization’s risk profile stands today.


Fintech’s future is in the back end

Fear that their money would ultimately be spent on on-premises, and therefore nonscalable, technology has been another reason investors have shied away from the opportunity. This fear arises from the tendency of institutions to want to keep a new technology “in the institution” because of security concerns. However, technology has matured enough to meet the reasonably strict security requirements banks impose on partners and vendors. Just six years ago, only 64% of global financial firms had adopted a cloud application, according to research from Temenos. But security has since dramatically improved in cloud applications, and banks are willing to adopt the technology at scale. This is evidenced both in cloud solution adoption and in the industry’s growing willingness to embrace an open banking framework.


WannaCry an example of pseudo-ransomware, says McAfee

WannaCry may have been a proof of concept, but the true purpose, he said, was to cause disruption, which is consistent with what researchers are learning when going undercover as ransomware victims on ransomware support forums. “When one of our researchers asked why a particular ransom was so low, the ransomware support representative told her that those operating the ransomware had already been paid by someone to create and run the ransomware campaign to disrupt a competitor’s business,” said Samani. “The game has changed. The reality is that any organisation can hire someone to disrupt a competitor’s business operations for less than the price of a cup of coffee.” In the face of this reality, Samani said the security industry and society as a whole have to “draw a line in the sand”.


The Digital Intelligence Of The World's Leading Asset Managers 2017

Where once the asset management sector was a digital desert, websites and social media channels abound. Whilst this represents genuine progress, the content and functionality within them leaves a lot to be desired in most cases. Quality search functionality is hard to find, websites resemble glorified CVs and blogs read like technical manuals. As for thought leadership, well, there’s little thought and no leadership. Social media channels, especially Twitter and LinkedIn, are swamped with relentless HR tweets and duplicate updates. It’s clear that asset managers are missing an opportunity to create content that resonates with FAIs and can build lasting two-way relationships. Over the following pages we present our findings in detail and take a closer look at the digital successes and failures within the world’s leading asset managers.


Heads in the cloud: banks inch closer to cloud take-up

On the one hand, cloud providers – such as the leader of the pack, Amazon Web Services – are likely to have security processes and technology that are at least as advanced as those of their banking clients, thanks to their technical expertise and economies of scale. On the other hand, providers can pass on a bank’s data or system management to yet another contractor, increasing the security risks present in traditional outsourcing. The EU’s General Data Protection Regulation, coming into force next year, will up the ante on data security. The new rules require, among other things, that bank customers be able to request that the personal data held on them is deleted. One practical outcome, say lawyers, is that banks will have to clarify to cloud providers exactly how they should handle


Inside the fight for the soul of Infosys


Murthy criticized Sikka's pay and his use of private jets, and claimed that corporate governance standards had eroded during his tenure. Saying he could no longer run the company amid such criticism from a company founder, Sikka resigned as chief executive on Aug. 18 and left the board six days later. Three other directors followed him out the door, including the former chairman, R. Seshasayee. Murthy's criticisms haven't let up since Sikka's resignation. Speaking to shareholders on Aug. 29, he detailed his "concerns as a shareholder" over how the company's board members approved a severance package worth roughly 170 million rupees ($2.65 million) for former Chief Financial Officer Rajiv Bansal, who left the company in October 2015.


Should CISOs join CEOs in the C-suite?

A working partnership between the CIO and the CISO is clearly a successful formula, regardless of who reports to whom. “CISOs should report to the CEO with further exposure and responsibility to the board of directors,” says Alp Hug, founder and COO at Zenedge, a DDoS and malware protection vendor. “The time has come for boardrooms to consider cybersecurity a key requirement of every organization's core infrastructure along with a financial system, HRMS, CRM, etc., necessary to ensure the livelihood and continuity of the business.” If a board of directors says defending their organization against cyber crime and cyber warfare is a top priority, then they’ll demonstrate it by inviting their CISO into the boardroom. “Of course CISOs and equivalents will say they should report to the CEO,” says John Daniels


The ins and outs of NoSQL data modelling

Data modelling is critical to understanding data, its interrelationships, and its rules. A data model is not just documentation, because it can be forward-engineered into a physical database. In short, data modelling solves one of the biggest challenges of adopting NoSQL technology: harnessing the power and flexibility of dynamic schemas without falling into the traps that a lack of design structure can create for teams. It eases the on-boarding of NoSQL databases and legitimises their adoption in the enterprise roadmap, corporate IT architecture, and organisational data governance requirements. More specifically, it allows us to define and marry all the various contexts, ontologies, taxonomies, relationships, graphs, and models into one overarching data model.
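As a rough sketch of that forward-engineering step, the snippet below turns a tiny logical model into a MongoDB `$jsonSchema` validator. The entity and field names are hypothetical, and actually applying the validator (shown in a comment) would require a live MongoDB connection.

```python
# Sketch: forward-engineering a simple logical model into a MongoDB
# $jsonSchema validator. All field names here are illustrative only.

def to_json_schema(model: dict) -> dict:
    """Translate a {field: python_type} logical model into a validator."""
    type_map = {str: "string", int: "int", float: "double", bool: "bool"}
    return {
        "$jsonSchema": {
            "bsonType": "object",
            "required": list(model),
            "properties": {
                field: {"bsonType": type_map[py_type]}
                for field, py_type in model.items()
            },
        }
    }

# A tiny logical model for a hypothetical 'customer' entity
customer_model = {"name": str, "email": str, "lifetime_value": float}
validator = to_json_schema(customer_model)

# With a live connection one would apply it via something like:
# db.create_collection("customers", validator=validator)
print(sorted(validator["$jsonSchema"]["required"]))
```

The point is that the governance rules live in the model, and the physical schema is generated from it rather than hand-written per database.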



Quote for the day:


"If you realize you aren't so wise today as you thought you were yesterday, you're wiser today." -- Olin Miller


Daily Tech Digest - September 27, 2017

Google's Pixel phone has basically been a critical success from the start. Even a year after its debut, the phone has remained a standard of comparison to which all other Android devices are held – and generally not to their benefit. That's mostly due to the Pixel's prowess in areas where other Android device-makers can't (or won't) compete. Significant as that is, though, ask an average non-tech-obsessed smartphone user what they think about the Pixel – and all you're likely to get in response is a glassy-eyed stare. Google may be positioning the Pixel as a mainstream device and even marketing it as such, to a degree, but it hasn't yet managed to break through that Samsung-scented wall and make its phone impactful in any broad and practical sense.
The scan would be a type of authentication known as biometric security. Smartphone fingerprint scanners have long led the way in this market as one of the most popular methods. However, the inclusion of facial scanning in the iPhone X and some Android phones has furthered the conversation on what physical characteristics can be used to secure a computing device. In addition to being used for smartphones, the heart scan technology could be used in airport security screenings, the release said. And, for those who may be worried about potential health effects of the scans, Xu mentioned in the release that the strength of the signal "is much less than Wi-Fi," and doesn't pose a health concern. "We are living in a Wi-Fi surrounding environment every day, and the new system is as safe as those Wi-Fi devices," Xu said in the release.


Rethinking security when everyone's Social Security number is public

"The assumption that we previously held, which was that Social Security numbers and driver's license numbers are relatively private ... that's now gone," he said. "Beyond how Equifax changes credit scoring, there's a big question about how Equifax changes identity validation." This is a distinctly separate issue from fraud detection, Perret said. Bank accounts and card numbers can be shut down and reissued, but banks can't do the same for Social Security numbers and other identity factors. "On the fraud side, there's a ton of work we can do, including multifactor authentication," he said, but "the KYC requirements are pretty explicit ... so that needs to be updated." Indeed, a lot of the security practices being used today are done more out of tradition than out of effectiveness.


Another experiment with currency? RBI is looking at its own Bitcoin

The success of Bitcoin, a popular cryptocurrency, may have encouraged the central bank to consider its own cryptocurrency since it is not comfortable with this non-fiat cryptocurrency, as stated by RBI executive director Sudarshan Sen a few days ago. Despite RBI's call for caution to people against the use of virtual currencies, a domestic Bitcoin exchange said it was adding over 2,500 users a day and had reached five lakh downloads. The company, launched in 2015, said the increasing downloads highlighted the "growing acceptance of Bitcoins as one of the most popular emerging asset class."  A group of experts at RBI is examining the possibility of a fiat cryptocurrency which would become an alternative to the Indian rupee for digital transactions.  According to a media report, RBI's own Bitcoin can be named Lakshmi, after the Hindu goddess of wealth.


IT and Future Employment - Data Scientist

A tongue-in-cheek definition is: a computer scientist who knows more statistics than his or her colleagues, or a statistician who knows more computer science than his or her colleagues. Time will tell whether data science will become a new discipline, or whether it will remain a cross-disciplinary field between these two (and perhaps other) fields. The statistician David Donoho published a paper in 2015 with the provocative title “Fifty Years of Data Science”. He was referencing the statistician John Tukey’s call, more than 50 years ago, for statistics to expand into what we now call data science. Donoho’s paper is well worth reading. I’ll list the six subfields of data science that he identifies and make several comments about each one ... A growing demand for people trained in data science has caused the shortage of these people to balloon. Moreover, only limited opportunities to obtain such training exist today.


AI is changing the skills needed of tomorrow's data scientists

AI is changing the nuts and bolts of data management, relieving data teams of a lot of tedious, manual dirty work so that they can focus their time on creating business outcomes, and allowing data scientists to work at a speed and scale that is impossible today. The data scientist of tomorrow must be prepared to work with the AI revolution, optimizing processes without losing the human ability to think creatively and apply data-driven insights to real-world problems. The next generation of data scientists will be even more necessary for helping to apply models and algorithms to problems and processes across the enterprise. For data science students, it’s not only crucial to understand the data and the technology; it’s equally valuable to learn how to function in teams, collaborate and teach.


4 Job Skills That Can Boost Networking Salaries

The report dissects the salaries of more than 75 tech positions, including eight networking and telecommunications roles and two network-specific security roles. Among the 10 network and telecommunications roles, network architects will be paid the most in the coming year, Robert Half Technology says. ... By comparison, network architects in the 75th percentile can expect to see starting salaries of $160,750, the 50th percentile can expect $134,000, and the 25th percentile will earn $112,750, according to the guide. This is the first year that Robert Half Technology is breaking down compensation ranges by percentile in its annual salary guide. The categories are designed to help hiring managers weigh a candidate's skills, experience level, and the complexity of the role when making an offer.
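The percentile breakdown follows standard order statistics. As a small sketch (with made-up sample figures, not the guide's data), the quartile points of a set of salary offers can be computed like this:

```python
# Sketch: deriving percentile-based salary ranges from a sample of
# offers. The sample figures below are invented for illustration.

offers = [98_000, 112_750, 120_000, 134_000, 141_000, 155_000, 160_750, 172_000]

def percentile(data, p):
    """Percentile via linear interpolation between order statistics."""
    data = sorted(data)
    k = (len(data) - 1) * p / 100          # fractional rank
    lo, hi = int(k), min(int(k) + 1, len(data) - 1)
    return data[lo] + (data[hi] - data[lo]) * (k - lo)

for p in (25, 50, 75):
    print(f"{p}th percentile: ${percentile(offers, p):,.0f}")
```

A hiring manager could then match a candidate's skills and the role's complexity against the 25th, midpoint (50th), or 75th band, as the guide suggests.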


Critical Network of Things: Why you must rethink your IoT security strategy

Having made, lost and then re-made my fortune in and around the industry over the past 20-plus years, I cannot help but smile over the level of hype — and, as a certain US President would call it, “fake news” — surrounding the current world of IoT. In spite of what the media and investors would like to think, IoT is not new. I can recall building all sorts of systems including AMR (Automatic Meter Reading/Smart Metering) networks covering whole cities, pharmaceutical storage monitoring, online pest/rodent trap systems, truck and trailer tracking, foodstuff refrigeration monitoring and land subsidence monitoring, just to name a few examples. They all followed the same basic architecture as we see with today’s IoT offerings, but under the label M2M (Machine to Machine).


5 fundamental differences between Windows 10 and Linux

Although in the early years, hardware support was a serious issue and the command line was a requirement, the last five years have seen very rare occasions when I've run into a problem that couldn't be overcome. I cannot say the same thing about Windows. No matter the iteration, I've always managed to find troubling issues with the Microsoft platform. Generally speaking, those issues can be managed. The latest iteration of Windows is no exception. Coming from Windows 7, I skipped 8 and headed directly to 10. I've found going from Windows 7 to 10 akin to making the leap from GNOME 2.x to 3.x. The metaphor was quite different and took a bit of acclimation. Even though they go about it very differently, in the end, both platforms have the same goal—helping users get their work done.


The Five Steps to Building a Successful Private Cloud

Like every engineering project, setting wrong expectations and unrealistic goals will lead to a poor outcome. It doesn’t have to be that way. Once you have a clear understanding of what problems you need to solve for stakeholders, you must define clear goals and requirements. For example, look at the existing pain points for your developers and how your private cloud solution will solve or mitigate those problems. Improving the developer experience ensures faster adoption and long-term success. Making the move to a private cloud requires focus, perseverance, motivation, accountability, and strong communication. You must have a good understanding of your existing service costs, which means doing a thorough Total Cost of Ownership analysis. What do day-to-day operations look like to support private infrastructure?



Quote for the day:


"If you only read the books that everyone else is reading, you can only think what everyone else is thinking." - Haruki Murakami


Daily Tech Digest - September 26, 2017


Time to embrace a security management plane in the cloud

There’s an old saying: Change is the enemy of security. To avoid disruptive changes, many cybersecurity professionals strive for tight control of their environment, and this control extends to the management of security technologies. Experienced cybersecurity professionals often opt to install management servers and software on their networks so that management and staff “own” their technologies and keep as much as possible under direct control. This type of control has long been thought of as a security best practice, so many CISOs continue to eschew an alternative model: a cloud-based security management control plane. Given the history of cybersecurity, this behavior is certainly understandable — I control what happens on my own network, but I have almost no oversight of what takes place on Amazon Web Services (AWS), Microsoft Azure or Google Cloud Platform.


Canada's Tough New Breach Reporting Regulations

Previously in Canada, entities experiencing a breach were required to identify what kind of breach occurred and to notify regulators. "Contacting affected individuals [about the breach] would be something you would delegate to the regulators to get advice and guidance on," he says. But that all changes under the Digital Privacy Act of 2015, which amended certain Canadian privacy regulations in three key ways and will likely go into effect by the end of 2017, Ahmad says. Those changes include mandatory breach notification to affected individuals; keeping a record log for two years of any types of data breaches that occur; and imposing sanctions of up to $100,000 for each violation of the new law, he says. Those amendments provide "a bit more teeth" to Canadian data breach legal requirements, he notes.




From buzzword to boardroom – what’s next for machine learning?

You only need to think of the allocation of payments to invoices, the selection of applicants in the HR area, the evaluation of marketing ROI, or forecasts of customer behaviour in e-commerce transactions. Machine learning offers great potential for companies from the big-data environment, provided they muster the necessary developer capacities to integrate machine learning into their applications. As AI moves from the future into the present, organisations not only want to gain insight into their own processes via classical process mining, they are also looking for practical support for the decision-making process, such as guidance on how to further optimise single process steps or efficiently eliminate any hurdles that still exist. By doing so, they can understand which influencing factors would be worthwhile tackling first.


Firms look to security analytics to keep pace with cyber threats

Implementing security analytics can take time and money, especially if a business is using outdated hardware and software. Gene Stevens, co-founder and CTO of enterprise security supplier ProtectWise, says many CISOs are finding it difficult to retain forensics for an extended period of time in a way that is cost-effective and easy to manage. However, his company has come up with an intelligent, analytics-oriented platform to tackle this problem. “With a memory of network activity, security teams can go back and identify whether they were compromised by an attack once it is discovered – and assess the extent of its impact,” says Stevens. However, traditional approaches are costly to scale and laborious to deploy.


New managed private cloud services hoist VMware workloads

CenturyLink's new VMware managed private cloud, CenturyLink Dedicated Cloud Compute Foundation, rearchitects its flagship private cloud onto Hewlett Packard Enterprise (HPE) hardware. It is cheaper and 50% faster to provision than its predecessor, which required multiple integration points across network, compute and virtualization from five vendors. That's typical with many private clouds that require users to coordinate technologies, either within OpenStack or earlier versions of VMware, said David Shacochis, vice president of product management at CenturyLink. VMware Cloud Foundation serves up an integrated stack with vSphere, NSX and vSAN, which means fewer moving pieces, improved self-service features and security control, Shacochis said.


Data storage in Azure: Everything you need to know

On Azure, things are different. Instead of having to manage data at an operating-system level, Azure’s object file system leaves everything up to your code. After all, you’re storing and managing the data that’s needed by only one app, so the management task is much simpler. That’s where Azure’s blob storage comes in. Blobs are binary large objects, any unstructured data you want to store. With a RESTful interface, Azure’s blob store hides much of the underlying complexity of handling files, and the Azure platform ensures that the same object is available across multiple storage replicas, using strong consistency to ensure that all versions of a write are correct before objects can be read. 
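To make the RESTful interface concrete, here is a minimal sketch of how a Put Blob request is composed. The storage account and container names are hypothetical, and the authentication header (a SharedKey signature or SAS token) is omitted for brevity, so the request is built but never sent:

```python
# Sketch: composing a Put Blob request against Azure Blob Storage's
# REST interface. Account, container, and blob names are hypothetical;
# the required Authorization header is omitted, so nothing is sent.

def put_blob_request(account: str, container: str, blob: str, data: bytes):
    url = f"https://{account}.blob.core.windows.net/{container}/{blob}"
    headers = {
        "x-ms-blob-type": "BlockBlob",   # any unstructured binary object
        "x-ms-version": "2017-04-17",    # REST API version of the era
        "Content-Length": str(len(data)),
    }
    return "PUT", url, headers, data

method, url, headers, body = put_blob_request(
    "examplestore", "app-data", "report.json", b'{"rows": 42}')
print(method, url)
```

In practice the Azure SDKs wrap this for you; the sketch just shows that a blob is ultimately an HTTP PUT of bytes to an addressable object, which is why the platform can replicate and strongly verify each write behind the scenes.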




What’s your problem? Survey uncovers top sources of IT pain

Cost-cutting has always been a focus for enterprise IT, but as the diversity and volume of data increase, complexity and storage sprawl are straining budgets. Indeed, budget pressure was a top challenge for 35% of IT pros surveyed at VMworld 2017. A metadata engine can cut costs in several ways. First, it can automatically and transparently move warm and cold data off high-performance resources to dramatically optimize the use of an organization’s most expensive infrastructure. Second, it enables organizations to dramatically increase the utilization of their existing resources. With global visibility and control, organizations can view data location and alignment against SLAs, as well as available performance and capacity resources. This allows IT to instantly observe resources that might be nearing thresholds, and subscribe to alerts and notifications for those thresholds.
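The warm/cold tiering described above can be sketched as a simple age-based policy. The thresholds below are illustrative assumptions, not figures from the survey:

```python
# Sketch of an age-based tiering policy: classify data as hot, warm,
# or cold by time since last access, so cold data can be moved off
# premium storage. The 7-day and 90-day thresholds are assumptions.

DAY = 86_400  # seconds

def tier_for(last_access_epoch: float, now: float) -> str:
    age_days = (now - last_access_epoch) / DAY
    if age_days < 7:
        return "hot"      # keep on high-performance storage
    if age_days < 90:
        return "warm"     # candidate for a cheaper tier
    return "cold"         # archive tier

NOW = 1_000 * DAY
print(tier_for(NOW - 2 * DAY, NOW))    # accessed two days ago
print(tier_for(NOW - 200 * DAY, NOW))  # untouched for months
```

A real metadata engine would apply such a policy continuously and transparently, using access metadata it already tracks, rather than scanning files on demand.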


FBI's Freese Shares Risk Management Tips

Confusion over the definitions of "threat" and "risk" exists when IT security teams talk to members of the executive suite. One strategy security professionals may consider is approaching the discussion from a business perspective, instead of leading with fear, says Don Freese, deputy assistant director with the FBI's information technology branch. Freese, who served as a keynote speaker Monday at the (ISC)² Security Congress convention in Austin, Texas, noted that risks are measurable, provided that companies practice good security hygiene, such as logging network activity and taking inventory of the data that the enterprise possesses. In addition to those best practices, Freese also advises IT security leaders to consider the industry that they operate in and the type of data that would be desired by cybercriminals or nation states.


8 Features of Fintech Apps that Appeal to Millennials

A new wave of fintech apps is set to hit smart devices over the coming years. A number of new startups, such as the digital-only bank Atom and the international money transfer service TransferWise, are going to revolutionize the financial industry. Fintech investments have grown exponentially over the last three years, and there are many opportunities for developers, investors and executives of “legacy companies” to ride on this wave. At the same time, of all the consumer demographics, Millennials are expected to be pivotal in driving changes. As a consequence, fintech developers are aggressively targeting them, tailoring apps to solve key pain points. Let’s look at some of the common features that software developers are currently including in their apps, along with those that are expected to become widespread, in order to appeal to Millennials.




What’s Different about Google’s New Cloud Connectivity Service

Dedicated Interconnect is only one of a set of cloud connectivity options from Google; it’s designed for handling workloads at scale, with significantly high-bandwidth traffic of more than 2Gbps. You can also use it to link your corporate network directly to GCP’s Virtual Private Cloud private IP addresses. Taking your cloud traffic off the public internet and onto your own network range gives you more options for taking advantage of cloud services. “Networking technologies are enabling applications and data to be located in their best execution venue for that workload,” Traver noted. Like its competitors, Google Cloud Platform requires you to connect to one of several global peering locations, so as well as Google’s charges, you’re also going to need to pay your network service provider to reach Google’s peering points.



Quote for the day:

"The two most powerful warriors are patience and time." -- Tolstoy

Daily Tech Digest - September 25, 2017

Deloitte hit by cyber-attack revealing clients’ secret emails

The Guardian understands Deloitte clients across all of these sectors had material in the company email system that was breached. The companies include household names as well as US government departments. So far, six of Deloitte’s clients have been told their information was “impacted” by the hack. Deloitte’s internal review into the incident is ongoing. The Guardian understands Deloitte discovered the hack in March this year, but it is believed the attackers may have had access to its systems since October or November 2016. The hacker compromised the firm’s global email server through an “administrator’s account” that, in theory, gave them privileged, unrestricted “access to all areas”. The account required only a single password and did not have “two-step” verification, sources said.


Let’s Not Get Physical: Get Logical

In the ideal future, there would be no programmers responsible for data movement. Instead, the data infrastructure would provide the illusion that all data is almost instantly available at the physical point of its need. Data consumers, including data analysts, would log on to a data catalog, shop for, and request the data they needed. That data would be described at a high level, with its business meaning clearly spelled out for both the human and the machine. (We call that computable meaning.) When a user requested data to be delivered to a certain point (perhaps a virtual point in the cloud), the data infrastructure would start copying the data from its origin, using replication techniques—meaning no potentially deforming transformations would be built into the data movement.



How to Survive Wall Street’s Robot Revolution

Consider the junior investment banker, who spends much of his or her time collecting and analyzing data and then creating reports. Consulting firm Kognetics found that investment-banking analysts spend upwards of 16 hours in the office a day, and almost half of that is spent on tasks like modeling and updating charts for pitch books. Machine learning, and natural language processing techniques, are already very good at this. Workers in compliance and regulation have a different worry: Over the last five years, their ranks have doubled, while overall headcount at banks declined 10 percent, according to research by Citigroup. Automating those activities — so-called regtech — could be good news for financial institutions looking to control the rising cost of compliance, and bad news for people looking to keep their jobs.


Data Governance: Just Because You Can, Doesn't Mean You Should

The impact of data use by businesses and government organizations on individuals, communities, and the environment is under constant scrutiny around the world. We are starting to see this formalized with security and privacy regulations such as the EU’s General Data Protection Regulation (GDPR) and the Privacy by Design approach for data systems. But even adhering to legal requirements and compliance regulations will not be enough to protect the business when it comes to ethical data use. Why? Ethical concerns precede legal and compliance requirements. And the stakes are large. Brand reputation is at risk. One wrong move could cause a significant loss, if not the whole business to fail.


Transforming processes with big data: Refining company turns to SAP Process Mining

A key component of the effort to improve process management is SAP Process Mining by Celonis 4.2.0, a process mining software that uses "digital traces of IT-supported processes" to reconstruct what happens in a company. The application shows all of the process variants, and it provides visualization of all currently running processes. The technology is expected to play a critical role in the effort to enhance processes, providing full transparency and analysis so the company can observe business processes directly from the vast data present in IT infrastructure systems such as its SAP enterprise resource planning (ERP) platform. Based on the analytical findings and process key performance indicators (KPIs), the company will be able to identify process improvement opportunities, Rajatora said.


From accounting to code: one woman’s journey to a career in tech

The pressure to find that first role can feel overwhelming, and often people take the first semi-suitable job they find, at the expense of their actual passions. Getting that first experience may well open the doors to something better, but it could also colour your experience of this new industry, for better or worse. As far as I was concerned, I’d had a lot of experience working with traditional banks in my previous role, and spent at least four or five hours each day attempting to complete straightforward tasks across seven banks in five different countries. This meant that fintech and its potential to transform the banking landscape felt like a very attractive prospect to me, and that Starling Bank’s mission was something I felt strongly about.


The Battle for the Cloud Has Not Even Started Yet

The real war will break out when solutions, offered via the cloud, can support business innovation and business differentiation: when cloud solutions drive business benefit directly, not benefits to IT. For that to happen we need to talk about what a business does (its business processes and decisions) and how a business operates, not what IT does and how IT operates. This might seem like a small point but in the overall scheme of things, in the overall war, I think this is a massive point. If I am lucky I might even be around long enough to be proven right (or wrong). So this is where my little framework starts to be useful. Yes, IaaS is a well-known battlefield and the armies are out there fighting it out. PaaS and SaaS will form up as the next battle fronts. In fact, they are forming up already, though they are not yet seen as important by many.


Digital is a Strategic Vehicle for Business Disruption

According to the research findings, the top three success factors for customer experience transformation are: 1. a customer-centric culture, 2. management/leadership buy-in, and 3. visibility into and understanding of the end customer experience. The research also revealed that customer experience (CX) leaders are more likely to be using emerging technologies and creating personalized and omni-channel experiences. CX leaders are also more likely to use data to predict and anticipate consumer needs, understand lifetime value, and track customer advocacy. CX leaders also have a much higher sense of urgency - they believe there is no time to waste in transforming to deliver a superior customer experience. Data is at the heart of meeting the elevated expectations of today’s connected customers.


CISOs' Salaries Expected to Edge Above $240,000 in 2018

A candidate's skills, experience, and the complexity of the role will all need to be taken into consideration when assessing which salary percentile is appropriate. "The midpoint salary is a good indicator of someone who meets the requirements of an open role," Reed says. The midpoint range for CISOs and information systems security managers has improved over the past couple of years. For example, the Dark Reading 2016 Security Salary Survey found the median annual salary of IT security management was $127,000. But fast forward to 2018: the Robert Half Technology survey expects information systems security managers to earn as much as $194,250 if in the 95th percentile salary range, followed by $164,250 for the 75th percentile, $137,000 at the midpoint, and $115,250 at the 25th percentile, according to the report.


Facebook Relents to Developer Pressure, Relicenses React

"We won't be changing our default license or React's license at this time," said Wolff, who apologized "for the amount of thrash, confusion, and uncertainty this has caused the React and open source communities." Furthermore, he said, "We know this is painful, especially for teams that feel like they're going to need to rewrite large parts of their project to remove React or other dependencies." One developer in that camp is Matt Mullenweg -- the main guy behind the popular WordPress platform -- who threatened to redo project Gutenberg, a "block editor" from the WordPress community designed "to make adding rich content to WordPress simple and enjoyable." "The Gutenberg team is going to take a step back and rewrite Gutenberg using a different library," Mullenweg said in a Sept. 14 post.



Quote for the day:


"No plan survives contact with the future. No security is future proof. That's the joy and terror of cyber security." -- J Wolfgang Goerlich‏


Daily Tech Digest - September 24, 2017

How to Get One Trillion Devices Online

I think it’s easy to paint the optimistic picture of what, if we get all of this right, it could mean for our future. One trillion devices isn’t an absurd number. But these types of new technology can be very fragile. It’s interesting comparing CRISPR [the gene-editing technology] to genetically modified crops: GM crops had some bad publicity early on, and that essentially killed the area for a while, whereas CRISPR has had lots of positive publicity: it’s cured cancer in children. IoT will be similar. If there are missteps early on, people will lose faith, so we have to crack those problems, at least to a point where the good vastly outweighs the bad.


The developers vs enterprise architects showdown

Planning out and managing microservices seems like another area where EAs have a strong role for both initial leadership and ongoing governance. Sure, you want to try your best to adopt this hype-y practice of modularising all those little services your organisation uses, but sooner or later you’ll end up with a ball of services that might be duplicative to the point of being confusing. It’s all well and good for developer teams to have more freedoms on defining the services they use and choosing which ones to adopt, but you probably don’t want, for example, to have five different ways to do single sign-on. Each individual team likely shouldn’t be relied upon to do this cross-portfolio hygiene work and would benefit from an EA-like role instead, someone minding the big ball of microservices.




Human Brain Gets Connected to the Internet for the First Time

“Brainternet is a new frontier in brain-computer interface systems,” said Adam Pantanowitz, ... According to him, we’re presently lacking in easily-comprehensible data about the mechanics of the human brain and how it processes information. The Brainternet project aims “to simplify a person’s understanding of their own brain and the brains of others.” “Ultimately, we’re aiming to enable interactivity between the user and their brain so that the user can provide a stimulus and see the response,” added Pantanowitz, noting that “Brainternet can be further improved to classify recordings through a smart phone app that will provide data for a machine-learning algorithm. In future, there could be information transferred in both directions – inputs and outputs to the brain.”


Impact of Cyber Security Trends on Test Equipment

The trend of applying cyber security practices to test systems makes sense for several reasons, most notably the increase in cyber-security incidents that exploit unmonitored network devices. The second reason this trend makes sense is that security practices and technology for general-purpose IT systems are more mature. However, this trend does not make sense categorically for at least two reasons. Primarily, IT-enabled test systems are less tolerant of even small configuration changes. Users of IT systems can tolerate downtime and may not even perceive application performance differences, but special-purpose test systems (especially those used in production) often cannot tolerate them. Second, test systems often have security needs that are unique. They typically run specialized test software not used on other organization computers.


This Is What Happens When a Robot Assassin Goes to Therapy

In an email to Singularity Hub, series creator EJ Kavounas said, “With everyone from Elon Musk to Stephen Hawking making dire predictions about the possible dangers of machine intelligence, we felt the character could inject black comedy while discussing real issues of consciousness and humanity’s relationship with the unknown.” Nina starts with Alastair Reynolds, a psychiatrist. During their meeting she explains her past to him, and after watching a recording in which she detonated a missile to kill someone, she breaks into tears. So we know she has feelings—or at the very least, she’s good at faking them. “The biggest thing I try to keep in mind when playing Nina is that everything she does and says was specifically programmed to mimic human behavior and language,” according to actor Lana McKissack, who plays Nina.


What is cellular IoT and what are its use cases

While LoRa offers the benefit of addressing ultra-low-power requirements for a range of low-bit-rate IoT connectivity, it is faced with a range limitation and must piggyback on an intermediary gateway before data can be aggregated and sent to a central server. The cost of deploying multiple gateways for a range of different IoT scenarios would defeat the very economic purpose of using an arguably low-cost solution like LoRa. Moreover, solutions like LoRa are not suited for a wide range of those IoT applications where HD and ultra-HD streaming is a prerequisite. 5G would potentially address a range of both low-bit-rate and ultra-HD IoT connectivity requirements, while also obviating the need to have an intermediary gateway, thus leading to additional cost savings. Moreover, 5G would have the potential to cover as many as one million IoT devices per square kilometer.


Gel-soaked conductive ‘fabric’ has potential for energy storage

As electric power becomes more important for everything from ubiquitous computing to transport, researchers are increasingly looking for ways to avoid some of the drawbacks of current electricity storage devices. Whether they are batteries, which release a steady stream of electric current, or supercapacitors, which release a sharper burst of charge, storage devices depend on conductive electrolyte fluids to carry charge between their electrodes. Susceptible to leakage and often flammable, these fluids have been behind many of the reported problems with batteries in recent years, including fires on board aircraft and exploding tablet computers (the latter caused by short-circuiting inside miniaturised batteries).


Lambda vs. EC2

Unlike its predecessors, the underlying Lambda infrastructure is entirely unavailable to sysadmins or developers. Scale is not configurable; instead, Lambda reacts to usage and scales up automatically. Rather than EC2, Lambdas run on ECS, and the containers are not available for modification. In place of a load balancer, or an endpoint provided by Amazon, if you want to make Lambdas accessible to the web it must be done through an API Gateway, which acts as a URL router to Lambda functions. ... One of the major advantages touted by Amazon for using Lambda is reduced cost. The cost model of Lambda is time-based: you’re charged for requests and request duration. You’re allotted a certain number of seconds of use that varies with the amount of memory you require. Likewise, the price per millisecond varies with the amount of memory you require.
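The time-based cost model can be made concrete with a back-of-the-envelope estimate: a flat per-request charge plus a duration charge that scales with allocated memory (GB-seconds). The rates below are illustrative assumptions, not current AWS pricing, which varies by region and excludes any free-tier allotment.

```python
def lambda_monthly_cost(requests, avg_ms, memory_mb,
                        price_per_request=0.0000002,   # assumed: $0.20 per 1M requests
                        price_per_gb_s=0.0000166667):  # assumed per-GB-second rate
    """Estimate Lambda cost as per-request charge plus GB-second
    duration charge (duration x allocated memory)."""
    gb_seconds = requests * (avg_ms / 1000.0) * (memory_mb / 1024.0)
    return requests * price_per_request + gb_seconds * price_per_gb_s

# 10M requests/month at 120 ms average with 512 MB allocated
print(round(lambda_monthly_cost(10_000_000, 120, 512), 2))
```

Doubling the memory allocation doubles the duration charge, which is why right-sizing memory is the main cost lever in this model.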


Artificial Intelligence: The Gap between Promise and Practice

The majority of companies underestimate the importance of rich and diverse data sets to train algorithms, and especially the value of “negative data” associated with failure to successfully execute a task. Talent shortages and unequal access to data engineers and AI experts compound matters. Privacy and other regulations as well as consumer mistrust also temper progress. Whereas such barriers may be expected to decrease over time, there are also more subtle barriers to AI’s adoption that will need to be overcome to unlock its full potential. Algorithmic prowess is often deployed locally, on discrete tasks; but improved learning and execution for one step of a process does not usually improve the effectiveness of the entire process.


Researchers Develop Solar Cells That Can Be Sewn Onto Clothing

“The ideal wearable portable solar cell would be a piece of textile. That exists in the lab but is not a sellable product.” This new research from the RIKEN and Tokyo teams has taken that textile a big step forward from lab curiosity to actual product. What they have done is create a cell so small and flexible that it could, in time, be seamlessly woven into our clothing, rather than awkwardly placed on the outside of a jacket. These solar cells are phenomenally thin, measuring just three millionths of a meter in thickness. Given a special coating that can let light in while keeping water and air out, the cell was able to keep efficiently gathering solar energy even after being soaked in water or bent completely out of its original shape.



Quote for the day:


"Change is the end result of all true learning." -- Leo Buscaglia


Daily Tech Digest - September 23, 2017

Domain-Driven Design Even More Relevant Now

Compromises and trade-offs in software are unavoidable, and Evans encouraged everyone to accept that "Not all of a large system is going to be well designed." Just as "good fences make good neighbors", bounded contexts shield good parts of the system from the bad. It therefore stands to reason that not all development will be within well-defined bounded contexts, nor will every project follow DDD. While developers often lament working on legacy systems, Evans places high value on legacy systems, as they are often the money makers for companies. His encouragement for developers to "hope that someday, your system is going to be a legacy system" was met with applause.


At its most basic level, the monetary system is built around the idea of storing and transferring value. Banks are not going to disappear; there are still high-level efficiencies and advantages to having banks aggregate stored value and deploy it at a targeted rate of return. For example, a bank can write thousands of mortgages and then securitize a portion of said mortgages; this is never going to be a process suitable for the crowdfunding model. Blockchain technology creates numerous benefits across industries and applications, especially in regard to value-transfer. Banks can realize extraordinary efficiencies, streamline their back-office functions and reduce risk in the process. Smart contracts introduce the added dynamic of constraints and conditional operations for transferring or storing value only when certain conditions have been met and verified.
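The "constraints and conditional operations" of a smart contract can be sketched as a toy escrow: value moves only once every named condition has been verified, and only once. This is a plain-Python illustration of the control flow, not actual blockchain code; the condition names are hypothetical.

```python
class EscrowContract:
    """Toy model of a smart contract: value is released only after
    every named condition has been verified, and at most once."""
    def __init__(self, amount, conditions):
        self.amount = amount
        self.verified = {c: False for c in conditions}
        self.released = False

    def verify(self, condition):
        self.verified[condition] = True

    def release(self):
        # Transfer happens only when all conditions are met
        if all(self.verified.values()) and not self.released:
            self.released = True
            return self.amount
        return 0

deal = EscrowContract(250_000, ["title_clear", "funds_deposited"])
deal.verify("title_clear")
print(deal.release())  # conditions incomplete: nothing moves
deal.verify("funds_deposited")
print(deal.release())  # all verified: value transfers exactly once
```

On a real chain this logic would run deterministically on every node, which is what removes the need for a trusted intermediary to check the conditions.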


The Digital Twin: Key Component of IoT

In the real world, this might be a machine going into different fault and run states, where the effect of an input on the machine's state depends on the state the machine is in at the time. If I go far enough back in time, I realize that my system did receive an input "A", and so by the rules of my system, the later "B" results in my model producing the output "X". However if I don't go back far enough, I will think that I only got a "B", and the output should be "Y". But how far back is "far enough"? The input "A" might have arrived 100 milliseconds ago, or it might have arrived yesterday, or just before the week-end. Which means that I cannot just pick up and run my model over a selected time period any time I want to get an answer -- apart from the sheer impracticality of crunching the numbers while the user waits for an answer.


How Startups Can Source Data To Build Machine Intelligence

Data is the fuel of the new AI-based economy. Companies, consumers and web-connected devices create terabytes of data that enforce AI research and innovation. Some companies, like Google and Facebook, acquire data thanks to their users who provide ratings, clicks and search queries. For other companies, data acquisition may be a complicated process, especially if they need an enterprise solution for a limited number of members instead of a one-size-fits-all solution for millions of users. Luckily, the emerging AI markets offer a broad range of options for companies to kickstart their AI strategies. As a venture studio partner, I see startups struggling with sourcing the initial data sets for their business problems. That's why I've listed the most popular ways young companies can source data for their AI businesses.


The Challenges of Developing Connected Devices

Many startups can afford to be scrappy at the start and only have a few employees while gaining momentum; when your product is a connected device it is more difficult to build a small team with the range of skills needed to launch a successful product. Luckily, there are plenty of external resources available to these companies that can help. If a founding team is strong with hardware, they can use an agency in order to get their first software suite built. There are also services that they can leverage to help with the build and distribution chain. Any place where work can be offloaded in order to focus on value increases their chances of success. They can then start hiring out a team to save money once they have traction.



New alliance advocates the blockchain to improve IoT security, trust

The alliance says that the groups' open-source tools and property will help the enterprise register IoT devices and create event logs on decentralized systems, which in turn will lead to a trusted IoT ecosystem that links cryptographic registration, "thing" identities, and metadata ... "The world is beginning to recognize the potential of blockchain technology to fundamentally reshape the way business is done globally - and we're still just scratching the surface," said Ryan Orr, CEO of Chronicled. "At this early stage we think it's vitally important to establish an inclusive framework that ensures openness, trust, and interoperability among the many parties, in both the public and private sectors, that we believe will begin to adopt blockchain technology over the next several years."


Ethereum’s inventor on how “initial coin offerings” are a new way of funding the internet

In general, you know that when you have public goods, public goods are going to be in very many cases underfunded. So the interesting thing with a lot of these blockchain protocols is that for the first time you have a way to create protocols and have protocols that actually manage to fund themselves in some way. If this kind of approach takes off, potentially, it could end up drastically increasing the quality of bottom-level protocols that we use to interact with each other in various ways. So ethereum is obviously one example of that, we had the ether sale, and we got about $8 to $9 million by, I guess, basically selling off a huge block of ether. If you look at lots of cryptocurrencies, lots of layer-two kind of projects on top of ethereum, a lot of them tend to use a similar model.


New Theory Cracks Open the Black Box of Deep Learning

It remains to be seen whether the information bottleneck governs all deep-learning regimes, or whether there are other routes to generalization besides compression. Some AI experts see Tishby’s idea as one of many important theoretical insights about deep learning to have emerged recently. Andrew Saxe, an AI researcher and theoretical neuroscientist at Harvard University, noted that certain very large deep neural networks don’t seem to need a drawn-out compression phase in order to generalize well. Instead, researchers program in something called early stopping, which cuts training short to prevent the network from encoding too many correlations in the first place.
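Early stopping, as mentioned above, cuts training short when validation loss stops improving. A generic sketch of the stopping rule (the `patience` parameter and loss values are illustrative assumptions, not from any specific framework):

```python
def early_stop_index(val_losses, patience=2):
    """Return the epoch at which training stops: when validation
    loss has failed to improve for `patience` consecutive epochs."""
    best = float("inf")
    epochs_since_best = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best, epochs_since_best = loss, 0
        else:
            epochs_since_best += 1
            if epochs_since_best >= patience:
                return epoch
    return len(val_losses) - 1

# Validation loss bottoms out at epoch 2, then the network overfits
losses = [0.9, 0.6, 0.5, 0.52, 0.55, 0.58]
print(early_stop_index(losses))
```

Halting here prevents the network from encoding spurious correlations in later epochs, which is the role the paragraph above attributes to early stopping.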


How to Measure Continuous Delivery

Continuous delivery is all about improving the stability and speed of your release process, so unsurprisingly you should measure stability and speed! Those are intangibles, but they’re not hard to measure. In How To Measure Anything, Douglas Hubbard shows how to use clarification chains to measure intangibles - you create tangible, related metrics that represent the same thing. Luckily for us, the measures have been identified for us. In the annual State Of DevOps Report Nicole Forsgren, Jez Humble, et al. have measured how stability and throughput improve when organisations adopt continuous delivery practices. They measure stability with Failure Rate and Failure Recovery Time, and they measure throughput with Lead Time and Frequency. I’ve been a big fan of Nicole and Jez’s work since 2013.
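The four measures named above are straightforward to compute once deployments are recorded. A minimal sketch, assuming a hypothetical record format (one dict per deployment with a failure flag, recovery minutes, and lead time in hours):

```python
from statistics import mean

def cd_metrics(deployments, period_days):
    """Stability: failure rate and mean recovery time.
    Throughput: mean lead time and deployment frequency."""
    failures = [d for d in deployments if d["failed"]]
    return {
        "failure_rate": len(failures) / len(deployments),
        "mean_recovery_min": (mean(d["recovery_min"] for d in failures)
                              if failures else 0.0),
        "mean_lead_time_h": mean(d["lead_time_h"] for d in deployments),
        "deploys_per_day": len(deployments) / period_days,
    }

# Hypothetical month of releases
releases = [
    {"failed": False, "lead_time_h": 4},
    {"failed": True,  "lead_time_h": 6, "recovery_min": 30},
    {"failed": False, "lead_time_h": 3},
    {"failed": False, "lead_time_h": 5},
]
print(cd_metrics(releases, period_days=30))
```

Tracking these four numbers over time is what turns "are we improving?" from a feeling into a trend line.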


The Decline of the Enterprise Architect

No matter their place in a lumbering bureaucracy or how many eye-rolls they may inspire among developers, these people are smart, competent, and valuable to their organizations. So my opinions and criticisms have nothing to do with the humans involved. That said, I think this role is on the decline, and I think that’s good. This role exists in the space among many large software groups. In the old days, they coordinated in elaborate, mutually dependent waterfall dances. These days, they “go agile” with methodologies like SAFe, which help them give their waterfall process cooler, more modern sounding names, like “hardening sprint” instead of “testing phase.” In both cases, the enterprise architect has a home, attending committee-like meetings about how to orchestrate the collaboration among these groups.



Quote for the day:


"Your excuses are nothing more than the lies your fears have sold you." -- Robin Sharma


Daily Tech Digest - September 22, 2017

6 Mistakes that will kill your Agile transformation even before it begins

Scrum, DevOps, SAFe, Kanban, Continuous Delivery. With so many different buzz words floating around the Agile sphere, it can be easy for companies to get excited and bite off more than they can chew. Every organization is different in its readiness to adopt Agile and needs to carefully consider many factors when deciding how to start the journey. Smaller organizations or teams such as start-ups or IT departments of larger companies may be able to immediately start practicing Scrum. On the other hand, larger organizations that have traditionally worked in a waterfall fashion or are in heavily regulated industries, may find it difficult to make the big changes that accompany a framework such as Scrum. As a result they may get discouraged or quit altogether if they run into problems early on.


Q&A on the Book "Humans vs Computers"

Modern software delivery is a constant struggle to abstract, simplify and model some part of the real world into a useful automated process. However, lack of domain knowledge, time pressure and imperfect information often lead us to oversimplify the real-world, so edge cases fall through the cracks. For example, complex distributed systems built around microservices often require some kind of production monitoring that tries to process transactions end-to-end with test data, and remove those test cases at the end of a successful check. It's difficult to imagine how something like that can cause serious damage, until you know that someone called Jeff Sample ended up stranded in Buenos Aires when the airline operating the connecting flight deleted his ticket without any trace.


EU’s new data privacy law creates headaches for U.S. banks

“A European data subject can make requests on what data the bank has on it, and can make changes and request deletion of the data,” said Roth, who is a former chief privacy officer at American Express. “These require business practices that banks don’t have in the U.S.” Companies with multiple legacy systems will face one of the toughest challenges, Dingle said. “The first problem you will have when you deal with GDPR is that you have to somehow be able to reconcile how the data flows between all these different databases, even though they were made in different times, they may have different formats [and] the data might be called something different,” she said. “That’s why a lot of these beautiful ideas of GDPR are very difficult in reality for people to execute on.”


Training soft skills into AI technology

While it was once thought that computers would never be able to demonstrate true emotional intelligence, examples are starting to blur those lines. In one study, computers were able to detect criminals with a high degree of accuracy just by looking at their facial features and movements. This means they’re getting good at reading people, a key social attribute that aligns with some degree of EQ. Closer examination shows that while the computers may be able to read people, that doesn’t necessarily mean they can understand people. They were able to pick out the criminals by analyzing incredible amounts of data about facial features. The decisions the computers made were based not on insight, but on algorithms. There are plenty of similar examples in which a machine can demonstrate the appearance of empathy when they’re actually just running the numbers.


Digital Disruption Demands Demystification

There are several broad themes to this year’s hype cycle, with a particular focus on disruption and disruptive opportunities. In the context of disruption, some of these are still at the innovation trigger stage–being used by some brave souls willing to take a chance and deal with challenges of new technologies (or applications of technology). Broadly, Gartner sees AI and human-centered design in this stage. Further along the curve are customer experience and intimacy. Some groupings are moving toward the trough of disillusionment, as the hype grows without being replaced by enough tangible examples and paths to success. Finally, the core areas of the Nexus of Forces (cloud, mobile, social, and information) are rapidly moving toward the plateau of productivity. Exploring the details will help you have appropriate expectations as you embark on your change initiatives.


What Is Edge Computing And How It's Changing The Network

Edge computing is a “mesh network of micro data centers that process or store critical data locally and push all received data to a central data center or cloud storage repository, in a footprint of less than 100 square feet,” according to research firm IDC. It is typically referred to in IoT use cases, where edge devices would collect data – sometimes massive amounts of it – and send it all to a data center or cloud for processing. Edge computing triages the data locally so some of it is processed locally, reducing the backhaul traffic to the central repository. Typically, this is done by the IoT devices transferring the data to a local device that includes compute, storage and network connectivity in a small form factor.
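The triage step described above can be sketched as a simple local filter: anomalous readings are forwarded immediately, while the bulk of the data is reduced to a summary before anything crosses the backhaul link. The thresholding scheme and payload shape are illustrative assumptions.

```python
def edge_triage(readings, threshold):
    """Process sensor readings locally: forward only anomalies,
    reduce the rest to a summary to cut backhaul traffic."""
    anomalies = [r for r in readings if r > threshold]
    summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
    }
    return {"forward": anomalies, "summary": summary}

# 1,000 raw readings shrink to three anomalies plus a small summary
readings = [20.0] * 997 + [95.0, 97.5, 99.0]
payload = edge_triage(readings, threshold=90.0)
print(len(payload["forward"]), payload["summary"]["count"])
```

The central repository still sees every anomaly and the aggregate picture, but the per-reading traffic never leaves the edge.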


Three ways the Internet of Things and the GDPR will impact Third Party Risk

With the IoT, the impact could potentially be even more threatening. Instead of “just” stealing data, an IoT hack could potentially take over the functionality of the device being hacked. For example, an IoT-hacked car could be driven off the road, or the systems and controls of a home could be manipulated. Another issue is the potential loopholes in firewalls – giving access to networks – that a poorly-designed IoT device could provide ... The GDPR explicitly introduces a general mandatory notification regime. When there is a personal data breach, a supervisory authority needs to be notified within 72 hours once an organization becomes aware of a breach, and impacted individuals must also be notified if a certain threshold is met.


The Top 10 Adages in Continuous Deployment

Continuous deployment involves automatically testing incremental software changes and frequently deploying them to production environments. With it, developers' changes can reach customers in days or even hours. Such ultrafast changes have fundamentally shifted much of the software engineering landscape, with a wide-ranging impact on organizations' culture, skills, and practices. To study this fundamental shift, researchers facilitated a one-day Continuous Deployment Summit on the Facebook campus in July 2015. The summit aimed to share best practices and challenges in transitioning to continuous deployment. It was attended by one representative each from Cisco, Facebook, Google, IBM, LexisNexis, Microsoft, Mozilla, Netflix, Red Hat, and SAS.


Java SE 9 and Java EE 8 Released Today

"Introducing a module system into a language and platform like Java SE, 20 years after its creation, when a large portion of the world's systems are running on it, is a very serious change," said George Saab, ... Once developers get used to it, modularity has the potential to make their lives easier by allowing them to, as Oracle puts it, "reliably assemble and maintain sophisticated applications." The module system reduces the size and complexity of both Java applications and the core Java runtime itself. It also makes the JDK more flexible, allowing developers to bundle just those parts of the JDK that are needed to run an application when deploying to the cloud. "This version of Java SE will provide millions of developers [with] the updated tools they need to continue building next-generation applications with ease, performance and agility," Saab said today in a statement.


Five changes to the way people will use banks in the future

While banks in the past have taken something of a one-size-fits-all approach, expect services to become much more tailored to your individual needs in the future. Behind this development will be data - or, rather, the more intelligent use of data - by banks. From the way we spend our money to the things we actually buy and the devices we use to log in to our account, banks can use data to build unique profiles of their customers. There are also external data points that can be used, from social media profiles for example. Of course, no bank should be using any of this data without the customer’s explicit consent, but the potential for highly personalised banking services should be a strong draw for many people. For instance, who wouldn’t appreciate discount offers on items you buy regularly sent directly to - and redeemable through - their smartphone?



Quote for the day:


"Anyone who lives within their means suffers from a lack of imagination." -- Oscar Wilde


Daily Tech Digest - September 21, 2017

Manage access control using Redis Bitfields

Access control based on action is a flexible, granular approach to securing your resources. Each user is given a list of things they can do and when the user attempts to perform any action, you check the user’s capabilities against what is required of that action. Sounds simple enough, right? This can be a tricky thing to code and it has to be as fast as possible because whatever latency, transit, or computation time this step requires is overhead that cuts into the processing you need to do with the rest of your app (likely stuff you care more about than capabilities and privileges). First, let’s look at a highly efficient way of storing capabilities and later we’ll explore some more advanced functionality. The heart of this approach is to use binary data, which might seem strange. Redis, unlike many databases, can manipulate and store binary data directly.
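The bitmask approach can be modeled in plain Python before involving Redis at all: assign each capability a fixed bit offset, then granting and checking become single bit operations. In Redis the same two operations map to SETBIT and GETBIT on a per-user key (noted in the comments); the capability catalogue below is a hypothetical example.

```python
# Hypothetical capability catalogue: each action gets a fixed bit offset.
CAPS = {"read": 0, "write": 1, "delete": 2, "admin": 3}

def grant(mask, cap):
    """Set the capability bit (Redis: SETBIT user:<id>:caps <offset> 1)."""
    return mask | (1 << CAPS[cap])

def can(mask, cap):
    """Test the capability bit (Redis: GETBIT user:<id>:caps <offset>)."""
    return bool(mask & (1 << CAPS[cap]))

user_caps = 0
user_caps = grant(user_caps, "read")
user_caps = grant(user_caps, "write")
print(can(user_caps, "write"), can(user_caps, "delete"))
```

Because each check is one O(1) bit test against a few bytes per user, the overhead this paragraph worries about stays negligible even with hundreds of distinct capabilities.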


What Is A Fileless Attack? How Hackers Invade Systems Without Installing Software

Fileless malware leverages the applications already installed on a user's computer, applications that are known to be safe. For example, exploit kits can target browser vulnerabilities to make the browser run malicious code, or take advantage of Microsoft Word macros, or use Microsoft's Powershell utility. "Software vulnerabilities in the software already installed are necessary to carry out a fileless attack, so the most important step in prevention is patch and update not only the operating system, but software applications," says Jon Heimerl, manager of the threat intelligence communications team at NTT Security. "Browser plugins are the most overlooked applications in the patch management process and the most targeted in fileless infections."


Google tightens grip on Android hardware with HTC deal

Google never entirely quit the hardware business. Since selling Moto, it has continued to release smartphones and tablets under its own brand, but these were designed and manufactured by other companies, including LG and HTC. Now Google is taking greater control of that design process, paying US$1.1 billion to HTC to acquire the team behind its Pixel devices. It will also receive a non-exclusive license to some HTC intellectual property, the companies said Thursday. The number of HTC employees affected by the deal is around 2,000, according to Reuters. The deal won't give Google any manufacturing capabilities: It will still have to outsource that work to others. And it won't knock HTC out of the smartphone market altogether: It still has a team working on the successor to its U11 flagship, launched earlier this year.


DDoS protection, mitigation and defense: 7 essential tips

“A disaster recovery plan and tested procedures should also be in place in the event a business-impacting DDoS attack does occur, including good public messaging. Diversity of infrastructure both in type and geography can also help mitigate against DDoS as well as appropriate hybridization with public and private cloud," says Day. “Any large enterprise should start with network level protection with multiple WAN entry points and agreements with the large traffic scrubbing providers (such as Akamai or F5) to mitigate and re-route attacks before they get to your edge. No physical DDoS devices can keep up with WAN speed attacks, so they must be first scrubbed in the cloud. Make sure that your operations staff has procedures in place to easily re-route traffic for scrubbing and also fail over network devices that get saturated,” says Scott Carlson, technical fellow at BeyondTrust.


The Dangers of the Hackable Car

As vehicles fill up with more digital controls and internet-connected devices, they’re becoming more vulnerable to cybercriminals, who can hack into those systems just like they can attack computers. Almost any digitally connected device in a car could become an entry point to the vehicle’s central communications network, opening a door for hackers to potentially take control by, for instance, disabling the engine or brakes. There have been only a handful of successful hacks on vehicles so far, carried out mostly to demonstrate potential weaknesses—such as shutting down a moving car and taking control of another’s steering. But security experts paint a grim picture of what might lie ahead. They see a growing threat from malicious hackers who access cars remotely and keep their doors locked until a ransom is paid.


Microsoft launches data security technology for Windows Server, Azure

Microsoft claims the service, called Azure confidential computing, makes it the first public cloud provider to offer encryption of data while in use. Encrypting data while it is being manipulated is pretty CPU-intensive, and there is no word on the performance impact of this service. “Despite advanced cybersecurity controls and mitigations, some customers are reluctant to move their most sensitive data to the cloud for fear of attacks against their data when it is in use,” Mark Russinovich, Microsoft Azure CTO, wrote in a company blog post. “With confidential computing, they can move the data to Azure knowing that it is safe not only at rest, but also in use from [various] threats.” Azure confidential computing uses a trusted execution environment (TEE) to ensure there is no way to view data from the outside, such as through a bug in the OS or an attacker who has gained admin privileges.
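The TEE idea can be illustrated with a toy sketch: data is encrypted everywhere except inside one trusted function, which models the enclave. The XOR "cipher" here is a stand-in for real cryptography and offers no security; it only shows the control flow of keeping plaintext confined to the enclave.

```python
# Conceptual sketch (NOT real cryptography) of confidential computing:
# data stays encrypted outside the enclave and plaintext exists only
# inside run_in_enclave's scope.

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' used purely for illustration."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

toy_decrypt = toy_encrypt  # XOR is its own inverse

def run_in_enclave(ciphertext: bytes, key: bytes) -> bytes:
    """Models a TEE: decrypt, compute on data in use, re-encrypt the
    result before it leaves the enclave."""
    plaintext = toy_decrypt(ciphertext, key)
    result = plaintext.upper()           # the computation on data in use
    return toy_encrypt(result, key)      # result leaves encrypted

key = b"secret"
ciphertext = toy_encrypt(b"sensitive record", key)
encrypted_result = run_in_enclave(ciphertext, key)
print(toy_decrypt(encrypted_result, key))  # b'SENSITIVE RECORD'
```

A real TEE (such as the hardware enclaves Azure uses) enforces this boundary in silicon, so even an OS-level attacker cannot read the plaintext; the sketch only mirrors the shape of that guarantee.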


CIO interview: John Mountain, Starling Bank

Starling even offers software development kits to third parties to make it easier for them to develop services for its customers. “For the most commonly used languages, we do half the work for them,” he says. “This is what companies like Apple do. They say ‘there is an API [application programming interface] but we want to go a bit richer than that’ and do some of the coding themselves.” In fact, Mountain wants anything that is not core to the business, whether it be accounting software or a customer money management service, to be supplied externally while Starling’s internal team focuses on core competencies. “We visualise our platform as a series of concentric circles, where we ask ourselves how fundamental to the business a certain piece of software is,” he says. “Everything judged to be at the core of the operation we write ourselves.”
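The "we do half the work for them" idea is the classic SDK pattern: a thin client class that hides transport details behind typed methods. The sketch below is entirely hypothetical; the endpoint path, response shape, and offline transport stub are invented for illustration and do not reflect Starling's actual API.

```python
# Hypothetical sketch of a bank-style client SDK: a wrapper that hides
# HTTP plumbing behind simple methods. Runs offline via a stub transport.

import json

def fake_transport(method, path):
    """Stands in for an HTTP call so the sketch runs without a network."""
    canned = {"/accounts/123/balance": {"currency": "GBP", "amount": 250.75}}
    return json.dumps(canned[path])

class BankClient:
    def __init__(self, token, transport=fake_transport):
        self.token = token            # a real SDK would send this as an auth header
        self.transport = transport

    def get_balance(self, account_id):
        """One typed method per endpoint: callers never see URLs or JSON."""
        payload = self.transport("GET", f"/accounts/{account_id}/balance")
        return json.loads(payload)

client = BankClient(token="demo-token")
print(client.get_balance("123"))  # {'currency': 'GBP', 'amount': 250.75}
```

Injecting the transport also makes the SDK testable, which is part of why vendors ship wrappers rather than leaving developers to call the raw API.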


Assemble tools to address IT compliance standards up the stack

Security and compliance work hand in hand. The threat landscape is more complex due to distributed applications being broken down into components, an increased variety of endpoints and dispersed data centers. "An increase in the volume and complexity of cybersecurity breaches and the potential damage that those events have on both business operations and brand reputation [are] driving greater demand for IT and security and risk management solutions," said Angela Gelnaw, security products and solutions analyst at IDC. Consequently, businesses take an expensive, multi-tiered approach to securing information. IDC expects enterprise security spending will increase from $73.7 billion in 2016 to $101.6 billion in 2020. The compound annual growth rate of 8.3% is more than twice the rate of overall IT spending that IDC predicts during the five-year forecast period.
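As a quick check on the arithmetic, the quoted dollar figures imply a compound annual growth rate of about 8.4% over the four years from 2016 to 2020, consistent with the roughly 8.3% IDC cites once rounding in the billion-dollar figures is allowed for:

```python
# Verify the CAGR implied by the spending figures quoted above:
# $73.7B (2016) growing to $101.6B (2020).

start, end, years = 73.7, 101.6, 2020 - 2016

cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # 8.4%
```

The small gap between 8.4% and the stated 8.3% most likely comes from IDC computing the rate on unrounded underlying figures.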


What's Holding Blockchain Back From Large-Scale Adoption?

For those of us who believe wholeheartedly in the future of this technology, it’s up to us to figure out how we can best explain what’s actually happening and why it’s important. For example, I recently spoke at the 100x Blockchain Online Summit, and it was enthralling to dive into such deep use cases and talk through specific problems that blockchain can solve, one of which was counterfeiting in big pharma. But to an everyday consumer, or even someone with a strong tech background, the terminology alone creates some roadblocks. The biggest reason education is the first obstacle is that you have to consider who really needs to buy into using blockchain technology in order for it to scale. It’s not just theorists and coders. It’s CEOs, heads of marketing and business development, even investors who are going to decide to foot the bill—or invest in the Ethereum platform, period.


How to choose a database for your mobile apps

To require an Internet connection for mobile applications is to live in the past. If apps rely on a connection, odds are high that the experience will be sluggish and unpredictable. To avoid reliance on the network, providers of databases and cloud services have added synchronization and offline capabilities to their mobile offerings. Solutions like Couchbase’s Couchbase Mobile, Microsoft’s Azure Mobile Services, Amazon’s Cognito, and Google’s Firebase offer the all-important sync that enables apps to work both online and offline. With so many offerings available, how does a mobile developer select the right technology for the right application? The following six key criteria are most important when evaluating mobile solutions: platform support, security, modeling flexibility, conflict resolution, sync optimization, and topology support.
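Of the criteria above, conflict resolution is the one most easily shown in code. The sketch below illustrates one common (and deliberately simple) policy, last-write-wins: when two replicas edit the same record while offline, the revision with the newer timestamp survives. The record shapes and timestamps are invented; real sync engines offer richer strategies such as merge functions or app-level callbacks.

```python
# Minimal last-write-wins conflict resolution: on sync, keep the
# revision with the later update timestamp (ties favor the local copy).

def resolve(local, remote):
    """Return whichever revision was updated more recently."""
    return local if local["updated_at"] >= remote["updated_at"] else remote

local  = {"note": "buy milk",        "updated_at": 1506556800}
remote = {"note": "buy milk + eggs", "updated_at": 1506560400}

print(resolve(local, remote)["note"])  # buy milk + eggs
```

Last-write-wins silently discards the losing edit, which is why databases that take conflict resolution seriously let the application merge both revisions instead.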



Quote for the day:


"A treasured memory is the lasting gift of time well spent." -- Tim Fargo