Daily Tech Digest - January 31, 2019

Singapore releases guidelines for deployment of autonomous vehicles

Permanent Secretary for Transport and chairman of the Committee on Autonomous Road Transport for Singapore, Loh Ngai Seng, said plans were underway to launch a pilot deployment of autonomous vehicles in Punggol, Tengah, and Jurong Innovation District in the early 2020s, and TR 68 would help guide industry players in "the safe and effective deployment" of such vehicles in the city-state.  Enterprise Singapore's director-general of quality and excellence group Choy Sauw Kook said: "In addition to safety, TR 68 provides a strong foundation that will ensure interoperability of data and cybersecurity that are necessary for the deployment of autonomous vehicles in an urban environment. The TR 68 will also help to build up the autonomous vehicle ecosystem, including startups and SMEs (small and midsize enterprises) as well as testing, inspection, and certification service providers."


Network programmability in 5G: an invisible goldmine for service providers and industry

5G network programmability value chain
5G promises many disruptive functionalities, such as ultra-low latency communication, high bandwidth/throughput, higher security, and network slicing, all of which have the potential to address new business opportunities not served by service providers today. But another functionality not always mentioned, with equal business potential, is network exposure, which can enable new levels of programmability in telecom core networks. Programmability in the 5G core allows providers to open up telecom network capabilities and services to third-party developers, letting them create use cases that don't exist today. This is possible thanks to standardized APIs in the new 5G network architecture. With these APIs, a new frontier for business innovation in telecom will emerge. Application developer partners will focus on new service applications, while telco service providers will focus on a new dimension of experience called “developer experience” and strengthen their position in the OTT value chain.
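To make the idea of network exposure concrete, here is a minimal sketch of what a third-party application might send to a 5G exposure API to request a temporary quality-of-service boost. The endpoint, field names, and units are hypothetical, chosen only to illustrate the programmability pattern the article describes; real exposure APIs (such as 3GPP's NEF interfaces) define their own schemas.

```python
import json

def build_qos_request(flow_id: str, latency_ms: int, bandwidth_mbps: int) -> str:
    """Build the JSON body for a hypothetical QoS-on-demand exposure API call.

    All field names here are illustrative, not taken from any real spec.
    """
    payload = {
        "flowId": flow_id,
        "requestedQos": {
            "maxLatencyMs": latency_ms,
            "guaranteedBitrateMbps": bandwidth_mbps,
        },
    }
    return json.dumps(payload)

# An app developer asking the network for a low-latency, high-bandwidth flow:
body = build_qos_request("video-call-42", latency_ms=20, bandwidth_mbps=50)
print(body)
```

In a real deployment, this body would be POSTed to a provider's exposure endpoint; the point is that the developer programs the network through an API rather than negotiating with the operator.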


Internet Of Things (IoT): 5 Essential Ways Every Company Should Use It

Strategic decision-making is where the senior leadership team identifies the critical questions it needs answered. Operational decision-making is where data and analytics are made available to everyone in the organization, often via a self-service tool, to inform data-driven decisions at all levels. More and more companies make IoT-enabled products, which connect them directly to their customers’ behaviours and preferences. For example, Fitbit knows how much we all exercise and what our normal sleeping patterns are. Samsung can collect usage data from their smart TVs. Elevator manufacturer Kone learns how their customers are using their elevators, and Rolls-Royce knows how airlines use the jet engines they make. Even companies that don’t make IoT devices can often gain access to data from other people’s devices; just think of app makers that can collect user data thanks to the data collection and connectivity capabilities of the smartphones or tablets that run them. Used correctly, these insights let companies make quicker and better business decisions.


No-deal Brexit could lead to data issues, MPs told


The no-deal Brexit planning notice warns that the legal framework for transferring personal data from organisations in the EU to organisations in the UK would have to change when the country leaves the EU. This means that although businesses will be able to continue to send personal data from the UK to the EU (the UK would “at the point of exit continue to allow the free flow of personal data from the UK to the EU”), the same may not be true in the other direction. “We’ve been saying for a while that we would like the adequacy discussions to start as soon as possible. But the EU, as with everything else, is saying they won’t start the discussions until we are a third country. So, I’d be surprised if a decision could be made in under a year,” Derrington told the committee. There are also issues relating to legacy data, which was transferred from the EU to the UK before Brexit.


DARPA explores new computer architectures to fix security between systems

A better solution in today's environment, then, is to accept that users need or want to share data and to figure out how to keep the important bits private, particularly as the data crosses networks and systems, each with varying levels and types of security implementations and ownership. The GAPS thrust will be in isolating the sensitive “high-risk” transactions and providing what the group calls “physically provable guarantees” or assurances. A new cross-network architecture, tracking, and data security will be developed that creates “protections that can be physically enforced at system runtime.” How they intend to do that is still to be decided. Radical forms of VPNs, an encrypted pipe through the internet, would be today’s attempted solution. Whichever method they choose will be part of a $1.5 billion, five-year investment in government and defense electronics systems. Enterprises and consumers may benefit, too. “As cloud systems proliferate, most people still have some information that they want to physically track, not just entrust to the ether,” says Walter Weiss, DARPA program manager, in the release.


There's more to WSL than Ubuntu

By integrating WSL with the updated Windows command-line environment, it's possible to integrate it directly with any application that offers a terminal. You can write code in Visual Studio Code, save it directly to a Linux filesystem, and test it from the built-in terminal, all without leaving your PC. And when it's time to deploy to a build system, you don't need to worry about line-ending formats or having to test code on separate systems. Support for SSH also ensures that you've got secure remote access to any Linux servers, in your data center or in the cloud. If you're using WSL to develop and test server applications, then you'll probably want to install SuSE Enterprise Server. It's a popular Linux server, and can be configured to handle most server tasks. With WSL now supported on Windows Server, you can use it to build test environments for cloud applications before deploying them on Azure or another public cloud. SuSE bundles a one-year developer subscription, which gives you more support resources than its standard community-based support forums.


Why we need less people, more skills for digital transformation

The fundamental argument comes down to value. Often in business, a corporate mentality exists in which executives boast about the number of people they have working for their company or on a project because they believe that provides the best value for their clients. This attitude has existed for more than two decades, yet companies are still failing to understand that it might not provide the best value for their business or clients. Companies need to do more research to understand what works for them as an individual business, and often this means they don’t need to hire lots of people; rather, they need the right people. While it may seem reassuring to have a large team working on an expensive project, the work is often easier, smoother and quicker when led by a small, highly skilled and experienced team that works on the ground together, not spread around or working remotely. This may be more expensive at first, but it is worth it in the long term.



The FTC's cyberinsurance tips: A must-read for small business owners

Dan Smith, president, co-founder, and COO of Zeguro, a cybersecurity company that has grabbed the attention of investors, admits in this PYMNTS article that the company came under a spear-phishing attack recently. It was unsuccessful, but it pointed out a very real need. Most small businesses do not think they need cyberinsurance (only 4% in the US currently have it) or do not know it's available. Smith adds that another problem is that the brokers providing the insurance are not spending enough time explaining it, or may not understand it themselves. To fix the situation, Smith announced in the PYMNTS article that Zeguro will be partnering with the QBE Insurance Group to offer tailored cyberinsurance solutions. According to Smith, the idea is to use the company's expertise and acquired cybersecurity intelligence to craft the appropriate cyberinsurance solution for each client. Insurance on any level is a complicated subject; add the complexity of trying to secure a digital infrastructure from cybercriminals, and a partnership like Zeguro and QBE Insurance Group seems like good business.


What programming languages rule the Internet of Things?

Clearly, there’s a consensus set of top-tier IoT programming languages, but all of the top contenders have their own benefits and use cases. Java, the overall most popular IoT programming language, works in a wide variety of environments — from the backend to mobile apps — and dominates in gateways and in the cloud. C is generally considered the key programming language for embedded IoT devices, while C++ is the most common choice for more complex Linux implementations. Python, meanwhile, is well suited for data-intensive applications. Given the complexities, maybe IoT for All put it best. The site noted that, “While Java is the most used language for IoT development, JavaScript and Python are close on Java's heels for different subdomains of IoT development.” Perhaps the most salient prediction, though, turns up all over the web: IoT development is multi-lingual, and it's likely to remain multi-lingual in the future.


How to accelerate digital identity in the UK


To encourage the reuse of a digital identity, the critical first step involves striking the right balance in the initial creation of a digital identity, based on the appropriate level of trust and friction for a first-time interaction. Digital services must be designed with the appropriate initial levels of trust, subsequently increasing levels of trust when required. It is a mistake to start with the maximum level of trust, which may be too high for the service. Instead, enhance trust as and when required. Digital identity standards allow services to map their increasing identity trust requirements effectively. Digital identity should be used at the point of need, with appropriate controls where absolutely necessary to complete the task. There is evidence that motivated users achieve high levels of success in verifying their identity in the right circumstances. The UK identity standards, built in response to real-world threats and risks, are world-leading, support the European Union’s eIDAS equivalence, and are closely aligned to the US NIST 800-63-A standard.



Quote for the day:


"Leading people is like cooking. Don't stir too much; it annoys the ingredients and spoils the food" -- Rick Julian


Daily Tech Digest - January 30, 2019

Cisco serves up flexible data-center options
Cisco has now extended ACI with ACI Anywhere to the cloud – specifically Amazon AWS and Microsoft Azure environments. The idea is that customers will have the flexibility to run and control applications anywhere they want across private or public clouds or at the edge and while maintaining consistent network policies across their entire domain. “There is nothing centered about data centers anymore,” said Roland Acra, senior vice president and general manager for Cisco’s Data Center Networking business. “IT teams have been forced to make a hard choice: stay with their on-premises data centers with a rich set of tools of their choice for automation or assurance or security; or move to the cloud, where a different set of capabilities can make consistent compliance a true challenge. ACI Anywhere removes that challenge and places workloads where it makes the most sense regardless of the platform or hypervisor.” ACI Anywhere would, for example, let policies configured through Cisco’s SDN APIC use native APIs offered by a public-cloud provider to orchestrate changes within both the private and public cloud environments, Cisco said.


Unconfigured IoT is a security risk, warns researcher

Many IoT devices work initially in an access point mode, so users can connect to the device using a smartphone to reconfigure it to become a client on the wireless network by entering the network security key, thereby making it much more secure. But businesses and consumers will often elect not to connect appliances to the internet, believing this is safer. ...  “This means that if the device remains unconfigured, it will remain in the default state, making it even more vulnerable than if it were connected to the internet and configured,” said Munro. “Although this opens up another set of vulnerabilities, organisations and consumers are becoming increasingly aware of these vulnerabilities and are therefore more likely to be aware of the risks and how to mitigate them.” But with an unconfigured device, attackers could use a war driving or access mapping attack, which would make it easy to compromise these devices, said Munro, because the attacker could identify a target wireless network using a geolocation site, such as wigle.net, that shows wireless access points in any given location and enables account holders to search its database for unconfigured IoT devices.


Serverless computing’s dark side: less portability for your apps

How a serverless development platform calls into your serverless code can vary, and there is no uniformity between public clouds. Most developers who build applications on serverless cloud-based systems couple their code tightly to a public cloud provider’s native APIs. That can make it hard, or unviable, to move the code to another platform. The long and short of this is that if you build an application on a cloud-native serverless system, it’s difficult to move it either to another cloud provider or back on-premises. I don’t mean to pick on serverless systems; they are very handy. However, more and more I’m seeing that enterprises which demand portability when picking cloud providers and application development and deployment platforms still often opt for what’s fastest, cheapest, and easiest. Portability be damned. Of course, containers are also growing by leaps and bounds, and one of their advantages is portability. However, they take extra work, and they need to be built with a container architecture in mind to be effective.
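One common way to limit the lock-in described above is to keep business logic in plain functions and confine the provider-specific invocation contract to a thin adapter. The sketch below uses AWS Lambda's Python handler convention (`lambda_handler(event, context)`) purely as an example; the event shape and function names are made up for illustration.

```python
def apply_discount(order_total: float, rate: float = 0.1) -> float:
    """Provider-agnostic business logic: runs anywhere Python runs."""
    return round(order_total * (1 - rate), 2)

def lambda_handler(event, context):
    """Thin AWS-specific adapter. Porting to another cloud (or back
    on-premises) means rewriting only this wrapper, not apply_discount()."""
    total = float(event["orderTotal"])
    return {"statusCode": 200, "discountedTotal": apply_discount(total)}

# Because the coupling is confined to the adapter, the core logic can be
# exercised locally with a fake event, no cloud account required:
result = lambda_handler({"orderTotal": "100.0"}, None)
print(result)
```

The design choice here is the point: the tighter the handler is woven into the business logic, the harder the migration the article warns about becomes.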


Success or Burnout? Q&A on How Personal Agility Can Help

Personal Agility is a simple coaching framework; it is based on just six powerful questions, a weekly event for asking the questions, and an “information radiator” to help you understand and act upon the answers. You can do it yourself without needing agreement or permission from anyone else! The key question “What really matters?” provides guidance for deciding how to spend your time. The next question, “What did you accomplish last week?” helps you understand where you are and to feel good about yourself and what you’ve done! The next questions help you to figure out what is (or is not) important to do this week. “What could you do?” looks at possibilities; “Of those things, which are important or urgent?” helps you to identify the essentials; finally, “Which ones do you want to get done this week?” helps you set a course with realistic objectives, so you can make steady progress to achieve bigger goals. Finally “Who can help?” is a classic coaching question that helps you get unstuck.


IT leaders must address integration to support business ecosystem


The survey found that almost half (48%) of organisations want to modernise their IT in order to compete more effectively in today’s digital business landscape. Respondents said modernisation is key to consolidating disparate technologies, automating data transaction processes and gaining visibility into their critical data flows. However, the research found that modernisation is one of the enterprise’s biggest challenges. According to Cleo, while the surveyed IT decision-makers understand the limitations and high maintenance cost of legacy technologies, they also recognise the systems’ importance to day-to-day operations. In Cleo’s experience, a major part of digital transformation is balancing old and new technologies, which means integrating legacy systems with modern applications cost-effectively and without disruption. For this reason, enterprises must simultaneously maintain legacy systems while adopting newer cloud services and software-as-a-service (SaaS) solutions to engage in and support how business is done today, it said.


How to Estimate Software Projects in A Test-Driven Development Environment

A good project manager intentionally limits the amount of information available to participants for discussion. The less information is provided, the lower the chance of an error. If we look back at the above description, what’s in it for us? First, it helps us define the user. In our case, it’s a registered user who has previously placed an order on the website. Second, the required functionality should have time and data limitations. Third, and very importantly, the action the user performs is atomic. Sequences or non-linear sequences of actions in the description of the functionality are roads straight to hell, and for all the participants involved, not just the customer! Subjectively speaking, ideal user stories imply that the user needs a minute or less to become aware of how to perform this or that action. In this case, by “aware” we mean that the user has already performed the same or a very similar action in a different application.


Japan's IoT Security Strategy: Break Into Devices

Identifying potentially vulnerable IoT devices that face the internet can be accomplished using search engines such as Shodan, which allows search queries based on certain parameters. Once a device has been found, taking it to the next level - attempting to log into the device - is generally a criminal offense in most countries. That presumably is the case in Japan as well and the reason why the law had to be modified to make it legal for the survey (see: Could a Defensive Hack Fix the Internet of Things?). With the law changed and permission to proceed, it should be easier to identify vulnerable devices. The larger problem is trying to resolve the vulnerabilities. Fixing vulnerabilities that lead to large botnets has been vexing. A decade ago, attackers commandeered large networks of desktop computers via browser and operating system vulnerabilities. Law enforcement agencies and private companies found success in shutting down the command-and-control servers for those botnets. But that left the problem of cleaning up infected devices, which usually involved the owners of those devices installing security patches.


CEOs and software

Neither software leaders nor CIOs can catapult their software organizations into the digital era without the right CEO support. CEO actions, or lack thereof, can stymie progress toward the software capability that digital business demands. Why? Software success depends on factors that only CEOs control. CEO control starts with funding for software initiatives — buy, build, and everything in between, plus modernization of outdated software. We track software leaders’ views on the top 10 barriers to improved software delivery (see Figure 1), with the barriers owned by CEOs highlighted in red. ... Software delivery speed is stuck. “Things are moving so fast in our market,” said the CEO of a professional services firm. “I live in terror of being left behind.” Speed of software delivery is a leading indicator of health and vitality in a software-delivery organization and a signal that a software team’s digital transformation is underway. During the past five years, developers have made almost no progress in their ability to deliver software quickly.


How traffic scrubbing can guard against DDoS attacks


A growing number of enterprises are investing in DDoS solutions, especially cloud-based DDoS mitigation services, with a shift away from a service-provider-centric market. A DDoS attack is one of the most complex threats that businesses can face. The goal of the individual hacker, organised criminals or state actors is to overwhelm a company’s network, website or network component, such as a router. To begin with, organisations have to determine whether a spike in traffic is legitimate or is an attack. “Without a solid understanding of baselines and historic traffic trends, organisations are unlikely to detect an attack until it is too late,” said Sherrel Roche, senior market analyst at IDC’s Asia-Pacific business and IT services research group. Landbank, the largest government-owned bank in the Philippines, has taken the step of implementing F5’s BIG-IP local traffic manager to understand its application traffic and performance better, as well as to gain full visibility into customer data as it enters and leaves an application. This enables the security team to inspect, manage and report fraudulent transactions as soon as they are spotted.
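The baseline comparison Roche describes can be sketched very simply: keep a history of normal requests-per-second and flag anything far above the mean. The threshold below (mean plus three standard deviations) and the sample numbers are illustrative only; production scrubbing systems use far richer models.

```python
from statistics import mean, stdev

def is_anomalous(history: list[float], current: float, k: float = 3.0) -> bool:
    """Flag `current` requests/sec if it exceeds mean + k * stdev of history.

    A toy baseline check; real DDoS detection also weighs request mix,
    source diversity, time of day, and more.
    """
    return current > mean(history) + k * stdev(history)

baseline = [1000, 1100, 950, 1050, 1020, 980]  # recent requests/sec samples
print(is_anomalous(baseline, 1150))   # modest spike, within normal variation
print(is_anomalous(baseline, 25000))  # flood, well outside the baseline
```

Without that recorded history, there is nothing to compare a spike against, which is exactly the blind spot the quote warns about.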


DevOps Adoption Practices

Many organizations start with an environment that is full of variables: different processes, different environments, different tools, and several permutations of configurations and data. All this makes automation hard and reduces your ability to learn as each variable could be the cause of the problem. The first step is to look at all those variables and see what you can remove. Can you align the patch levels across environments? Can you deploy the same version of the application across environments? Some variables can only be removed later on, but understanding what all the variable pieces are and doing a clean-up first will make later efforts easier. ... Someone once told me: "You cannot automate what you cannot document." After all, automation is a form of documentation of a process. What is even more important is that automating a bad process just creates more problems. I also think that writing down a solution forces you to think it through in a way that verbal communication or just starting to write code does not.
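The "look at all those variables and see what you can remove" step can itself be automated. The sketch below diffs environment configurations to surface settings that are not identical across dev, stage and prod; the environment names, keys and values are made up for illustration.

```python
def find_variables(envs: dict[str, dict]) -> dict:
    """Return settings whose values differ between any two environments."""
    all_keys = set().union(*(cfg.keys() for cfg in envs.values()))
    diffs = {}
    for key in sorted(all_keys):
        values = {name: cfg.get(key) for name, cfg in envs.items()}
        if len(set(values.values())) > 1:  # more than one distinct value
            diffs[key] = values
    return diffs

envs = {
    "dev":   {"java_version": "11", "patch_level": "2019-01", "heap_gb": 2, "region": "eu-west"},
    "stage": {"java_version": "11", "patch_level": "2018-11", "heap_gb": 2, "region": "eu-west"},
    "prod":  {"java_version": "8",  "patch_level": "2018-11", "heap_gb": 8, "region": "eu-west"},
}
print(find_variables(envs))  # java_version, patch_level and heap_gb differ; region does not
```

Each key this report surfaces is a variable that could be the cause of a problem; aligning them first makes the later automation effort easier, as the text argues.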



Quote for the day:


"A leadership disposition guides you to take the path of most resistance and turn it into the path of least resistance." -- Dov Seidman


Daily Tech Digest - January 29, 2019

Enterprise digital transformation leaves data security behind


Thales suggests that in the rush to adopt new solutions, encryption and protection are not high on the priority list, even though the majority of survey respondents said sensitive data is used in tandem with digitally transformative technologies. While sensitive corporate or customer data is linked to new digital solutions in 97 percent of cases, fewer than 30 percent of these same respondents said that encryption is being used within these environments. According to the survey, enterprise players that are aggressively overhauling their systems with new technologies are skating on the thinnest ice, with 28 percent running the highest risk of experiencing a data breach by ignoring suitable security standards in their enthusiasm. However, there are a few areas in which encryption usage is above average. In total, 42 percent of organizations using IoT, 47 percent of businesses using container technology, and 45 percent of companies that have adopted Big Data solutions use encryption in some form or another.



Digital banking can give a boost to your bank’s bottom line.

By expanding their digital footprint, banks can reduce costs and boost financial performance while meeting consumer demand for a more streamlined and personalized customer experience. The closer you get to digital native, the more substantial the cost reductions and the greater the corresponding increase in ROE. That doesn’t mean going digital native is the right answer for every bank. Whatever you choose, the solution needs to be in line with your long-term strategy. After all, each bank has a different set of core capabilities, and not all banks are prepared for a full digital transformation. The right answer for one bank might be the wrong answer for another. ... The simplest approach is to modify the front end only, focusing on the primary ways a customer interacts with a bank. Largely a cosmetic fix, the bank designs an appealing mobile app and web interface but keeps the organization’s workflows, culture, and back-end infrastructure intact. We understand the appeal of this approach. For an organization that needs a quick win, it’s certainly the fastest route. 


Pro Tips for Developer Relations


One thing I like to do from time to time is what I call a Mashup Presentation. It requires zero content creation but does require demo creation. When I do something like this, it involves simply curating existing content from other presentations, and then working on a demo that uses most, if not all, of the pieces described in the material. It is a good way to compact and connect interesting topics and present them to developers, so they don’t need to figure out the intrinsic connections between the ideas. Plus, it consumes less time. A second tip is to contact the regional sales field when traveling, to find opportunities to meet with customers and prospects. It may sound salesy, but it actually proves to be a great source of real-world ideas to cover in the future. Plus, it brings the advocate down to Earth. Finally, I like to reuse as much material as possible, whether from myself or from others. What matters is presenting something the audience has never heard of; it doesn’t matter if it was created in 2001. To me, advocacy is more about bringing information than "creating information."


What do fintechs think about open banking’s progress?

“Banking data is shared via APIs, which allows two pieces of software to talk to each other and share information. Whilst these APIs already exist, there is technically a long way to go before they can be used widely. This is the limiting factor on the uptake of open banking currently, and as the technology improves, more services and products will spring up. The initial uptake of open banking has been more of a trickle than a flood. That said – there is lots of support and encouragement in existence to help providers and users start to explore the possibilities of Open Banking. Fluidly has recently been the winner of a £200,000 prize fund organised by Nesta. The fund is specifically aimed at companies who are transforming small business banking via Open Banking. Funds like this will accelerate the uptake of Open Banking as it helps to get new products built more quickly and raises the profile of what’s now available for consumers and businesses.”


McLaren: Digital transformation on and off the track

A great example of how this transformation impacted our business is how it empowered the technology underpinning our racing team. On each Formula 1 car there can be up to 300 sensors that communicate from once per lap to 150 times a second. That data, which reaches terabytes per race, is now sent via a hyper-converged infrastructure from the pitlane garage to engineers, strategists and drivers. That trackside infrastructure has to be robust enough to be installed, operated and then moved to over 21 global locations. The same data is transferred in real time back here to the McLaren Production Centre. Within the cool, calm centre of our mission control, the data becomes the driving force behind our simulations, strategy options and data-driven decisions. This system of data-driven collaboration relies on our cloud-based apps, infrastructure and storage solutions. At McLaren, regarding IT, we look to apply these principles across our group and have found that they can be applied to many other businesses.


Three Elements Of Next-Generation Data Management For Financial Services


A data hub can help you gain a holistic view of data assets, manage data across the full IT landscape, and integrate data into a unified view. By building the platform around a data hub, you can increase transparency of and access to all data assets, which increases agility and the speed of innovation. Critical data hub functionality includes: Open architecture foundation, allowing the hub to connect data no matter where it is physically located – in the cloud, on-premise, in Hadoop, or on cloud object storage; Data sharing and discovery across the enterprise; Single view for data asset management, supporting data analysis and governance (including pipelining, orchestration, and monitoring); Elimination of the need for centralization of data and mass data movement to a single data store; Support for complex data processing operations, such as machine learning-based analysis; Governance and orchestration for data refinement and enrichment; and Metadata catalog management, improving the visibility of data assets across the landscape. As financial services leaders increasingly realize that more trusted, connected, and intelligent data contributes to digital transformation,


The DDoS that wasn’t: a key takeaway for web domain security

Typical traffic forwarded to the domain before the incident contained both GET and POST requests. However, the 'malicious' traffic was only sending a stream of POST requests. "Examining all the POST requests hitting the customer's URL showed that the User-Agent fields were not being forged or otherwise altered, boosting the confidence researchers had for their conclusion that a Windows-oriented tool was responsible for this massive flood of requests," the cloud service provider says. To give the firm time to work out what was going on, SOCC was able to mitigate most of the strange requests over the next 28 hours, leading to the discovery that the traffic smashing the URL was "the result of a warranty tool gone haywire." Buggy code, and not a botnet, was the problem. The warranty tool's errors meant that it sent constant POST requests to the domain automatically, and with enough frequency to potentially take down the website. A fix was created and deployed quickly by the vendor at fault for the tool, resolving the issue.
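The triage described above, comparing request methods and checking whether User-Agent strings are forged, is straightforward to sketch. The log lines below are fabricated samples in a simplified format, used only to show the counting logic.

```python
from collections import Counter

# Fabricated, simplified access-log lines: client IP, request line, User-Agent.
log_lines = [
    '10.0.0.1 "GET /warranty HTTP/1.1" Mozilla/5.0',
    '10.0.0.2 "POST /warranty HTTP/1.1" WarrantyTool/1.2',
    '10.0.0.2 "POST /warranty HTTP/1.1" WarrantyTool/1.2',
    '10.0.0.2 "POST /warranty HTTP/1.1" WarrantyTool/1.2',
]

# Count HTTP methods: a POST-only flood against a GET+POST baseline is the tell.
methods = Counter(line.split('"')[1].split()[0] for line in log_lines)

# Tally User-Agents on the POST traffic: one unaltered agent dominating
# suggests a single buggy tool rather than a forged-header botnet.
agents = Counter(line.rsplit(" ", 1)[-1] for line in log_lines if '"POST' in line)

print(methods)
print(agents.most_common(1))
```

On real logs the parsing would need a proper log-format parser, but the reasoning is the same one the researchers applied: method mix plus consistent, unforged User-Agents pointed to buggy software, not an attack.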


Data Loss Prevention – Human error, insider threats and the in-between

While employees in the modern workplace are getting increasingly technologically savvy, and are finding new tools to improve their productivity, they aren’t always aware of the security implications of their actions. Many of our customers are leveraging Microsoft Information Protection solutions to classify, label and protect their data. To minimize the impact on end users and their ability to be productive, these organizations often choose to empower their users to label documents themselves, by providing automatic suggestions but not auto-labeling or -protecting documents. A user can inadvertently label a document containing highly confidential information with a low sensitivity label that applies minimal access restrictions. Since the file is already encrypted, it will not be scanned by the DLP solution, but might still be accessible to unauthorized people. A bigger threat with a much higher potential for damage, is the malicious insider.


How secure is Android? Separating the myths from the facts

Google’s effort has been to continually harden the Android platform. Security is an ongoing enterprise, evidenced by new innovations like an improved security model for biometrics and industry-leading capabilities that protect sensitive information on a secure, dedicated chip. An important top-line defense is Android’s monthly security updates. Devices that are part of the Android Enterprise Recommended program are guaranteed to receive the monthly Android security patch within 90 days of release. Another key innovation is the Titan M chip found in the Pixel 3. The chip is integrated into Android’s Verified Boot process, ensuring the bootloader, the program that validates and loads Android, is running the correct version. This prevents bad actors from clandestinely rolling the device back to a more vulnerable version of Android. While Titan M debuts on the Pixel, the security community will be able to audit it through its open-source firmware. Google Play Protect, the world’s largest mobile threat-detection service, defends against Internet-borne threats and potentially harmful apps (PHAs).



Cybersecurity Staffing in Crisis: What Can You Do?

Many enterprises find it easy to think of cybersecurity professionals as stemming exclusively from the STEM fields. In many ways this makes sense; IT security obviously builds on technology and algorithms, which are second nature to those interested in the hard sciences. However, being too selective in the security hiring process contributes to the cybersecurity staffing crisis; it means enterprises turn away perfectly qualified candidates for not having the “right” degrees. STEM skills can strengthen your cybersecurity posture, of course. However, your InfoSec team also requires skills such as collaboration, communication, adaptability, and creativity to be well-rounded. What matters in a candidate may not be existing knowledge of information technology but the ability to learn about technology productively. Additionally, you need to make sure you draw upon a diverse pool of information security professionals; drawing only from a homogenous pool contributes to the cybersecurity staffing crisis.



Quote for the day:


"Leadership involves finding a parade and getting in front of it." -- John Naisbitt


Daily Tech Digest - January 28, 2019

What is Keras? The deep neural network API explained

Keras was created to be user friendly, modular, easy to extend, and to work with Python. The API was “designed for human beings, not machines,” and “follows best practices for reducing cognitive load.” Neural layers, cost functions, optimizers, initialization schemes, activation functions, and regularization schemes are all standalone modules that you can combine to create new models. New modules are simple to add, as new classes and functions. Models are defined in Python code, not separate model configuration files. The biggest reasons to use Keras stem from its guiding principles, primarily the one about being user friendly. Beyond ease of learning and ease of model building, Keras offers the advantages of broad adoption, support for a wide range of production deployment options, integration with at least five back-end engines (TensorFlow, CNTK, Theano, MXNet, and PlaidML), and strong support for multiple GPUs and distributed training. Plus, Keras is backed by Google, Microsoft, Amazon, Apple, Nvidia, Uber, and others.
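The "standalone modules you combine to create new models" principle can be illustrated with a toy, pure-Python analogue of Keras's `Sequential` API. This is an illustrative sketch, not Keras itself: real Keras layers learn their weights during training, while the weights here are fixed so the example is reproducible.

```python
class Dense:
    """A fully connected layer: y = W @ x + b (deterministic toy weights)."""
    def __init__(self, in_dim, out_dim):
        self.w = [[0.1 * (i + j) for j in range(in_dim)] for i in range(out_dim)]
        self.b = [0.0] * out_dim

    def __call__(self, x):
        return [sum(wij * xj for wij, xj in zip(row, x)) + bi
                for row, bi in zip(self.w, self.b)]

class ReLU:
    """A standalone activation module."""
    def __call__(self, x):
        return [max(0.0, v) for v in x]

class Sequential:
    """Chains standalone modules, as keras.Sequential chains layers."""
    def __init__(self, layers):
        self.layers = layers

    def __call__(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

# Modules compose freely: swap layers or activations without touching the rest.
model = Sequential([Dense(3, 4), ReLU(), Dense(4, 2)])
output = model([1.0, 2.0, 3.0])
print(len(output))  # 2
```

In real Keras, the equivalent would be `keras.Sequential([keras.layers.Dense(4, activation="relu"), keras.layers.Dense(2)])`, with optimizers, losses, and initializers plugged in just as modularly via `model.compile`.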



How do you best talk to your board about cybersecurity?

Boards are maturing both in their interest in and understanding of cybersecurity. They are now asking much more specific questions, particularly as they wish to increase this understanding. In conducting this research, we had the pleasure of working with board members who have been privy to this security journey. We wanted to understand where the gap is for them and how we can help close it. One of the key problems in communicating security to any stakeholder group (including boards) is that we (security pros) assume that we know what our audience wants and proceed to throw information at them as per our desires. But because our expertise typically lies in the field of technology, not human psychology or communication, our assumptions about what they want are often far removed from reality. We rarely take the time to ask, for fear of appearing stupid. In this research, we did just that: We asked. As a result, we had the opportunity to understand board members’ journeys through the murky and often technical and confusing waters of cybersecurity.


The internet of human things: Implants for everybody and how we get there


Let me be clear on my motivation for wanting to make wearables -- and eventually implants -- our default method of brick-and-mortar payment: I hate physical wallets. I don't like dragging around a thick hunk of cow hide filled with a bunch of credit cards I don't use that often. Then, there are loyalty program cards and various IDs I have, such as my license, various types of permits, and medical insurance and drug plan stuff when I have to pick up prescriptions. Have you ever lost or had your wallet stolen? Or your keys? The amount of work it takes to get your life back in order is ridiculous. How many of you do the paranoid "life check" triple play every day for your wallet, keys, and smartphone? When I am traveling, I might do that three times a day, easy.  So now, let us imagine a future where you don't have to walk around carrying cow hide stuffed with plastic cards and cash. A future without losing wallets and the disruption that ensues. A future where many of us can leave our homes every day with literally nothing on our person except a smartphone and perhaps a wearable device.



Panasonic IoT strategy is all about big data analytics

It's all about pain points. What's the problem that you're trying to solve? Believe it or not, it may sound like an easy question, but the answers are really difficult. Because, A, getting your middle management or middle-ranked individuals to speak to pain points is difficult, because they see that as an admission of some guilt. Getting them out of that mode, and into a comfort zone where they can openly talk about the pain points, is really challenging. You can get a set of pain points from the top-level executives, but you need to get to some level of granularity on those pain points. Without that granularity you're unable to pinpoint the specifics and recommend a solution. So what we have done is, for instance, in our industrial manufacturing operations, we have people who walk onto manufacturing floors, talk to executives, talk to engineers, and take a third-party view on what the problems are, identify these pains, and then try to prioritize what the return would be on those pain points.


Giving algorithms a sense of uncertainty could make them more ethical


The algorithm could handle this uncertainty by computing multiple solutions and then giving humans a menu of options with their associated trade-offs, Eckersley says. Say the AI system was meant to help make medical decisions. Instead of recommending one treatment over another, it could present three possible options: one for maximizing patient life span, another for minimizing patient suffering, and a third for minimizing cost. “Have the system be explicitly unsure,” he says, “and hand the dilemma back to the humans.” Carla Gomes, a professor of computer science at Cornell University, has experimented with similar techniques in her work. In one project, she’s been developing an automated system to evaluate the impact of new hydroelectric dam projects in the Amazon River basin. The dams provide a source of clean energy. But they also profoundly alter sections of river and disrupt wildlife ecosystems. “This is a completely different scenario from autonomous cars or other [commonly referenced ethical dilemmas], but it’s another setting where these problems are real,” she says. “There are two conflicting objectives, so what should you do?”
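Eckersley's "menu of options" idea can be sketched in a few lines: rank candidate actions under each objective separately and hand the resulting menu, trade-offs included, back to a human. The treatment names and scores below are invented for illustration.

```python
# Hypothetical treatment options with competing attributes (all values made up).
treatments = [
    {"name": "A", "life_years": 10, "suffering": 7, "cost": 90},
    {"name": "B", "life_years": 8,  "suffering": 3, "cost": 60},
    {"name": "C", "life_years": 6,  "suffering": 2, "cost": 20},
]

def menu_of_options(options):
    """Return the best option under each objective, rather than one 'answer'."""
    return {
        "maximize lifespan":  max(options, key=lambda t: t["life_years"]),
        "minimize suffering": min(options, key=lambda t: t["suffering"]),
        "minimize cost":      min(options, key=lambda t: t["cost"]),
    }

# The system is explicitly unsure: it presents all three, with trade-offs visible.
for objective, choice in menu_of_options(treatments).items():
    print(f"{objective}: option {choice['name']} -> {choice}")
```

The point of the design is that no single scalar objective is baked in; the human sees that option A maximizes lifespan while option C minimizes both suffering and cost, and makes the final call.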


The Technical Case for Mixing Cloud Computing and Manufacturing

The movement in manufacturing needs to be around the growth of IaaS usage, including cloud-delivered servers, databases, data integration, and other core components needed to provide the types of services listed earlier in this article. AWS provides all of these components, as do Google and Microsoft.  That said, what keeps many manufacturing companies out of the cloud is a lack of skills and knowledge. It takes a specific skill set to properly integrate existing ‘some-time’ systems, which provide no real-time visibility or automated responses, with new cloud-based systems that provide the ability to operationalize new and existing data points. The objective is to provide a quick ROI, as well as the ability to move operations in more productive and less expensive directions.  The fundamentals are well understood and are becoming easier for manufacturing organizations to grasp. What’s missing is a stepwise approach that spells out the cloud conversion with enough detail to give a company a path to tactical and strategic success.


Beyond the Dashboard: How AI Changes the Way We Measure Business


Many BI companies see the potential of AI and have jumped on the bandwagon. Most today generate point-and-click automated insights that surface significant trends, anomalies, and clusters in the data, usually for a highly constrained data set, such as a chart or dashboard. The trick is to do this at scale and in real time. Most BI vendors don’t have the processing power to do that, let alone run it continuously in the background for multiple KPIs simultaneously. With automated insights, the dashboard becomes a jumping-off point for obtaining deep insights about business processes. These insights might pop up above or below a KPI, or upon a click; or they might be encoded in text via a natural language generation tool that displays or speaks a deep analysis of the dashboard KPIs. ... FinancialForce applies Salesforce's Einstein AI engine to sales and financial data to generate rich, action-oriented views of customers. Its dashboards display color-coded indicators of customer health, and with a single click, an analysis of under-performing health indicators along with recommendations for improvement.
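A crude stand-in for such automated anomaly surfacing, assuming a single KPI series: flag values more than two standard deviations from the mean. Production BI engines use far richer models (seasonality, trend decomposition, streaming computation), and the revenue figures below are made up.

```python
from statistics import mean, stdev

def flag_anomalies(series, threshold=2.0):
    """Return indices of points whose z-score exceeds the threshold."""
    mu, sigma = mean(series), stdev(series)
    return [i for i, v in enumerate(series)
            if sigma > 0 and abs(v - mu) / sigma > threshold]

# Hypothetical daily-revenue KPI with one obvious spike.
daily_revenue = [100, 102, 98, 101, 99, 103, 97, 100, 250, 101]
print(flag_anomalies(daily_revenue))  # [8] -- the spike on day 9
```

An automated-insight layer would run checks like this continuously across every KPI on the dashboard and attach a generated explanation to each flagged point.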


Developing Microservices with Behavior Driven Development & Interface Oriented Design


BDD involves the triad – the three perspectives of the customer, of the developer, and of the tester. It’s usually applied for the external behavior of an application. Since microservices are internal, the customer perspective is that of the internal consumer, that is, the parts of the implementation which use the service. So the triad collaboration is between the consumer developers, the microservice developers, and the testers. Behavior is often expressed in a Given-When-Then form, e.g. Given a particular state, When an action or event occurs, Then the state changes and/or an output occurs. Stateless behavior, such as business rules and calculations, simply shows the transformation from input to output. Interface Oriented Design focuses on the Design Patterns principle “Design to interfaces, not implementations”. A consumer entity should be written against the interface that a producer microservice exposes, not to its internal implementation. These interfaces should be well defined, including how they respond if they are unable to perform their responsibilities. Domain Driven Design (DDD) can help define the terms involved in the behavior and the interface.
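A minimal sketch of the Given-When-Then form applied to an internal microservice, written as plain Python tests. The `InventoryService` and its interface are invented for illustration; teams would typically use a BDD tool such as behave or Cucumber, but the structure is the same.

```python
class InventoryService:
    """Producer microservice: consumers code against this interface only."""
    def __init__(self, stock):
        self._stock = dict(stock)

    def reserve(self, sku, qty):
        # Per interface-oriented design, the interface defines its failure
        # response: it reports inability rather than leaking internal errors.
        if self._stock.get(sku, 0) < qty:
            return {"ok": False, "reason": "insufficient stock"}
        self._stock[sku] -= qty
        return {"ok": True, "remaining": self._stock[sku]}

def test_reserve_reduces_stock():
    # Given a particular state
    service = InventoryService({"widget": 5})
    # When an action occurs
    result = service.reserve("widget", 2)
    # Then the state changes and an output occurs
    assert result == {"ok": True, "remaining": 3}

def test_reserve_fails_gracefully():
    # Given insufficient stock, When a reserve is attempted,
    # Then the interface's defined failure response is returned.
    service = InventoryService({"widget": 1})
    assert service.reserve("widget", 2) == {"ok": False,
                                            "reason": "insufficient stock"}

test_reserve_reduces_stock()
test_reserve_fails_gracefully()
```

Note that both scenarios exercise only the exposed interface, never the `_stock` internals, so consumer code written against these behaviors survives changes to the implementation.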


Google petitions Supreme Court to reconsider Android Java ruling


Walker said the court initially ruled that the software interfaces were not copyrightable, but that decision was overruled. “A unanimous jury then held that our use of the interfaces was a legal fair use, but that decision was likewise overruled,” he said. “Unless the Supreme Court corrects these twin reversals, this case will end developers’ traditional ability to freely use existing software interfaces to build new generations of computer programs for consumers.” Walker added: “The US constitution authorised copyrights to ‘promote the progress of science and useful arts’, not to impede creativity or promote lock-in of software platforms.” In response to Walker’s post, Oracle executive vice-president and general counsel Dorian Daley wrote: “Google’s petition for certiorari presents a rehash of arguments that have already been thoughtfully and thoroughly discredited. The fabricated concern about innovation hides Google’s true concern: that it be allowed the unfettered ability to copy the original and valuable work of others as a matter of its own convenience and for substantial financial gain.”



Securing the Internet of Things: Government Action Likely in 2019

person taking a selfie in the lens of a security camera
The landscape is dotted with a few new laws and regulations, such as a California law requiring manufacturers of any devices that connect to the internet to include “reasonable” security features, including unique, user-set passwords for each device rather than generic default credentials that are easier for an intruder to discern. Some security experts, however, have criticized the law as too weak. Well-known consultant Robert Graham wrote, “it’s based on the misconception of adding security features. It’s like dieting …. The key to dieting is not eating more but eating less. The same is true of cybersecurity, where the point is not to add 'security features' but to remove ‘insecure features.’" That reaction shows there’s a lot more to be done. But it will be interesting to see just how aggressively governments push. Will they rely on stronger laws to force the industry to more effectively tackle IoT security? Or gentler approaches, like the United Kingdom’s government website that provides a voluntary code of practice?



Quote for the day:


"Remember teamwork begins by building trust. And the only way to do that is to overcome our need for invulnerability." -- Patrick Lencioni


Daily Tech Digest - January 27, 2019

There seems to be an obvious and somewhat necessary solution here: ensure that employees within an organisation are able to understand their IoT data and apply it to their own sector of expertise for maximum business benefit. One way to resolve this skills shortage is to train Millennials to drive IoT projects forward in the future. Millennials are our future workforce and, given that they are used to being constantly connected, they are perfectly placed to drive further connectivity. You’ll hear this described as entering the sharing economy. Therefore, as we enter this more circular economy, we need to equip employees with the necessary skills in AI, ML and deep learning (DL). By opening up the opportunity for individuals to specialise in these areas, businesses will be able to apply analytics to streaming data for deeper insights. This will enable more predictive decisions to be made and aligns with what a data scientist would be doing day to day.


Blockchain Technology: A Global Perspective!


The world is innovating without permission. The real essence of blockchain technology lies in government-based applications. Delaware stores company incorporation records on a blockchain; Sweden now runs real-estate transactions on a blockchain; Singapore issues invoices on a blockchain; and the UK uses blockchain-based monitoring for the distribution of grants. In Estonia, e-citizen records, e-payment keys and medical records are secured on a blockchain, and Ghana records its land registry using blockchain technology. Last but not least is the most promising initiative of Smart Dubai: the Dubai Blockchain Strategy. Dubai is on the fast track to implement blockchain in government operations and aims to be the first city in the world to be completely powered by blockchain by 2020. After producing a bunch of POCs, evaluating hundreds of blockchain innovations, and supporting tens of game-changing startups around the globe, I am now sharing a few easy-to-implement, high-potential blockchain use cases that can change society in a phenomenal way, for both the public and private sectors.


Is It Possible To Learn Data Science & Machine Learning Without Mathematics?


For machine learning, the real prerequisite skill that a beginner needs is data analysis; there is no need to know calculus and linear algebra in order to build a model that makes accurate predictions. The role of mathematics is particularly significant only if one is involved in machine learning research in an academic setting, or for a few subsets of more advanced data scientists. There are people in the industry at high levels who do use advanced math on a regular basis, pushing the boundaries of machine learning and working on bleeding-edge tools. People at companies like Google and Facebook are among those who certainly use calculus, linear algebra, and more advanced math routinely in their work. The bottom line is that in industry, data scientists just don’t do much higher-level math; in reality, they spend a huge amount of their time getting data, cleaning data, and exploring data. The truth is that 80% of what they do is data munging and data visualization.


This Trojan infects Chrome browser extensions, spoofs searches to steal cryptocurrency

Different infection vectors are in place depending on the type of browser found on an infected system. Razy is able to install malicious browser extensions, which is nothing new. However, the Trojan is also able to infect already-installed, legitimate extensions, by disabling integrity checks for extensions and automatic updates for browsers. In the case of Google Chrome, Razy edits the chrome.dll file to disable extension integrity checks and then renames this file to break the standard pathway. Registry keys are then created to disable browser updates. "We have encountered cases where different Chrome extensions were infected," the researchers say. "One extension, in particular, is worth mentioning: Chrome Media Router is a component of the service with the same name in browsers based on Chromium. It is present on all devices where the Chrome browser is installed, although it is not shown in the list of installed extensions."


Machine Learning on Code is actually a field of research that is just starting to materialize into enterprise products. One of the pioneers of this movement is a company called source{d}, which is building a series of open source projects that turn code into actionable data and train machine learning models to help developers respect technical guidelines. With every company quickly becoming a software company, intangible assets such as code represent a larger share of their market value. Companies should therefore strive to understand their codebase through meaningful analytic reports to inform engineering decisions and develop a competitive advantage for the business. Managers can use tools like the open source source{d} engine to easily retrieve and analyze all their Git repositories via a friendly SQL API. They can run it from any Unix system, and it will automatically parse their company’s source code in a language-agnostic way to identify trends and measure progress made on key digital transformation initiatives.


Information theory holds surprises for machine learning

Information theory provides bounds on just how optimal each layer is, in terms of how well it can balance the competing demands of compression and prediction. "A lot of times when you have a neural network and it learns to map faces to names, or pictures to numerical digits, or amazing things like French text to English text, it has a lot of intermediate hidden layers that information flows through," says Artemy Kolchinsky, an SFI Postdoctoral Fellow and the study's lead author. "So there's this long-standing idea that as raw inputs get transformed to these intermediate representations, the system is trading prediction for compression, and building higher-level concepts through this information bottleneck." However, Kolchinsky and his collaborators Brendan Tracey (SFI, MIT) and Steven Van Kuyk (University of Wellington) uncovered a surprising weakness when they applied this explanation to common classification problems, where each input has one correct output (e.g., in which each picture can either be of a cat or of a dog). In such cases, they found that classifiers with many layers generally do not give up some prediction for improved compression.


The controversies of blockchain governance and rough consensus

On-chain governance describes the manner of proposing changes to a cryptocurrency and its underlying blockchain by a certain set of processes, rather than a simple majority consensus. The core differences between blockchains can be highlighted by examining exactly how these decisions are made, and by whom. To understand the concept, it’s important to identify all the participants in the network and how they work together. Miners are a core component in a decentralized public blockchain network because they help sustain it. They are incentivized by transaction fees and block rewards. Developers create the protocol and maintain the blockchain. They are also responsible for enforcing changes such as hard or soft forks. Like miners, developers are also incentivized to keep the network going. When developers propose a change to the network, a core group is tasked with achieving consensus over whether to accept or reject it. Miners back changes by contributing their hash power to one of the blockchains born of a hard fork.


Hacking enterprise architecture and service design


It is not very common for IT architects and service designers to work together. We are usually at different ends of digital projects. We speak a different language, and often even physically sit in different buildings. As a service designer and an enterprise architect, we had the unique opportunity to work together on various projects at D9. We began to identify the strengths and weaknesses of each approach and expertise. There are clear commonalities, and in many ways our expertise complements each other’s. Government is transforming in the wider context of global and societal issues, complex systems and technological change. Our view is that the traditional role and structures of government are being challenged by new technology, human-centred design and internal and external strategic drivers of change. To us, digital transformation is about humans, strategy and technology, and it requires multidisciplinary work across all three elements. Throughout our work, we identified that a core challenge for digital transformation is a lack of active dialogue between strategy and development.


'Bitcoin will go to zero': Davos talks up the future of blockchain tech

Schumacher said the industry is now trying to create "open decentralized systems." These would essentially be next generation protocols or infrastructure that businesses could run on, similar to cloud computing today. The next generation of blockchain technology is currently being developed. Yeung said that she sees blockchain adoption happening quickly in the area of payments, particularly in Asia. "Many developing countries, where just to start with they don't even have credit cards, there's no particular infrastructure, it's almost easier to see sort of blockchain-enabled payments, to see in Asia, you will see more action happening in Asia more than U.S. and Europe," Yeung told CNBC. Ripple CEO Garlinghouse said he expects more widespread adoption of blockchain in about five years, while Schumacher said that it is three years off. However, Hutchins said that ultimately consumers will not be talking about which blockchain is being used; they will just care how good a product's use case is.


The future of code quality, security and agility lies in machine learning

Source code repository analysis can also reveal information about the developers writing the code. Team dynamics can be highlighted by analyzing commit times and content: managers can identify when software engineers are most productive, and arrange meetings and encourage cross-team collaboration accordingly. Looking at programming language and framework trends can inform hiring managers on what type of talent to hire and what upskilling resources they can provide. Adding source code as a new dataset in enterprises’ data warehouses and visualization platforms such as Power BI, Looker or Tableau will provide everyone in the engineering organization with a whole new level of source code and development process observability.  Yet the most exciting aspect of looking at code as a dataset is that it can be used to train machine learning models that automate many repetitive tasks for developers. We’re already starting to see new machine learning based applications for assisted code review or suggestions on GitHub.
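The commit-time analysis described above can be sketched in a few lines: bucket commit timestamps by hour to see when engineers are most active. The timestamps below are invented; in practice they would come from `git log` or a repository-analysis engine.

```python
from collections import Counter
from datetime import datetime

# Hypothetical commit timestamps (ISO 8601), e.g. exported via
#   git log --pretty=format:%cI
commit_times = [
    "2019-01-21T09:15:00", "2019-01-21T10:02:00", "2019-01-21T10:40:00",
    "2019-01-22T14:05:00", "2019-01-22T10:12:00", "2019-01-23T10:55:00",
]

def busiest_hours(timestamps):
    """Count commits per hour of day, most active hours first."""
    hours = Counter(datetime.fromisoformat(ts).hour for ts in timestamps)
    return hours.most_common()

print(busiest_hours(commit_times))  # the 10 o'clock hour dominates
```

The same grouping trick extends to days of the week, authors, or file extensions, which is how trend and language-adoption reports are typically built from commit metadata.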



Quote for the day:


"It is easier to act yourself into a new way of thinking, than it is to think yourself into a new way of acting." -- A.J. Jacobs


Daily Tech Digest - January 26, 2019

AI is sending people to jail—and getting it wrong


Police departments use predictive algorithms to strategize about where to send their ranks. Law enforcement agencies use face recognition systems to help identify suspects. These practices have garnered well-deserved scrutiny for whether they in fact improve safety or simply perpetuate existing inequities. Researchers and civil rights advocates, for example, have repeatedly demonstrated that face recognition systems can fail spectacularly, particularly for dark-skinned individuals—even mistaking members of Congress for convicted criminals. But the most controversial tool by far comes after police have made an arrest. Say hello to criminal risk assessment algorithms. Risk assessment tools are designed to do one thing: take in the details of a defendant’s profile and spit out a recidivism score—a single number estimating the likelihood that he or she will reoffend. A judge then factors that score into a myriad of decisions that can determine what type of rehabilitation services particular defendants should receive, whether they should be held in jail before trial, and how severe their sentences should be. 


Almost 90 percent of the business leaders surveyed as part of the study believed that cognitive diversity in the workplace is extremely important for running a successful organization. Managers in the contemporary workplace want employees to think differently and experiment with their typified ways of problem solving. While expecting such cognitive diversity was a bit difficult in the past, the role AI can play in the workforce means that organizations can expect greater rewards in the future. AI mechanisms will help augment human efforts in the workplace and stimulate cognitive diversification that benefits the organization. The study also revealed that 75 percent of respondents expected AI to create new roles for employees. This is a clear indication that AI is not going to replace human jobs, but will instead increase efficiency and shift humans’ roles and even create new positions for employees that provide meaningful work better suited to humans’ strengths.


Mondelez vs. Zurich: How watertight is cyber insurance coverage?

To put it bluntly, it appears the insurance sector has not been able to keep up with cyber threats. As new threats pop up in cyberspace, new policies typically lag behind in a confused state. Insurers are also challenged by a lack of visibility into their clients’ cyber health. This is very important for insurers: for example, if somebody wants health insurance, proving whether or not they smoke, or that there are no hereditary diseases running in their family, is vital in establishing how much their premium should be. The visibility issue isn’t just one affecting insurers. Many firms don’t have the tools to adequately assess and respond to the rising levels of cyber risk they’re exposed to. A recent report from the insurer Hiscox claimed that nearly three-quarters (73%) of global firms are “cyber-novices” when it comes to the quality and execution of their security strategy. If it’s the case (and it is) that cyber insurance policies are confusing and have room for improvement, the best thing a company can do is first understand the cyber risks it faces, and then secure a bespoke policy to meet its needs.



Collateral Damage: When Cyberwarfare Targets Civilian Data

Unfortunately, this is par for the course for private-sector businesses and NGOs. Sometimes the breach is to get a critical piece of political or military information to be used later. Sometimes it's to steal intellectual property or research so that the hacking nation can get a competitive boost in economic and/or military might. Sometimes it's to cull some personal information about someone with the right security clearance — which may mean orchestrating a super-breach, compromising several million other accounts along the way. Notably, these breaches aren't about anything so pedestrian as identity theft or credit card fraud. Instead, the goal is to use the information gleaned as a jumping-off point — to allow escalated access to yet more critical information. This is especially the case with healthcare organizations, where the right juicy health-record tidbit about a well-placed employee (or family member thereof) of a government arm can be used to extort some small amount of extra information or escalated access, turning that employee into an inside-attack threat.


How AI and Quantum Computing May Alter Humanity’s Future


König and the AI research team showed that quantum outperforms classical computing and that quantum effects can “enhance information-processing capabilities and speed up the solution of certain computational problems.” In their research, the team demonstrated that parallel quantum algorithms running in constant time outperform classical computers. The scientists showed that quantum computers required only a fixed number of steps for problem solving and were better at “solving certain linear algebra problems associated with binary quadratic forms.” Forward-thinking organizations recognize the synergistic boost that the combination of quantum computing and artificial intelligence may herald. Microsoft CEO Satya Nadella stated in a WSJ Magazine interview, “What’s the next breakthrough that will allow us to keep up this exponential growth in computing power and to solve problems—whether it’s about climate or food production or drug discovery?”


Bringing open-source rhyme and reason to edge computing: LF Edge

This isn't easy. Interoperability and standards simply don't exist in IoT or edge computing, which makes life miserable for anyone working in these areas. It is LF Edge's founders' hope that this pain will bring vendors, OEMs, and developers together to create true open standards. For the broader IoT industry to succeed, the fragmented edge technology players must work together to advance a common, constructive vision. Arpit Joshipura, the Linux Foundation general manager for Edge and IoT, said, "In order for the broader IoT to succeed, the currently fragmented edge market needs to be able to work together to identify and protect against problematic security vulnerabilities and advance a common, constructive vision for the future of the industry." LF Edge is realizing this vision with five projects, which support emerging edge applications in non-traditional video and connected things that require lower latency (up to 20 milliseconds), faster processing, and mobility.


Balancing data privacy with ambitious IT projects for digital transformation

A global organisation that produces medical devices for the healthcare market used IoT technology to monitor and record the usage of every individual device for product development and preventative maintenance. Despite the relatively benign purpose, the nature of these medical devices and the broad approach to data collection meant that the usage data the developers were collecting was inherently sensitive. Healthcare data is classified as “special category” data under the GDPR and similar regulations, which brings additional prohibitions on its use and heightened penalties for its mishandling. More concerning, neither the patients, the healthcare professionals, nor the business were aware of the collection and use of the data. No framework was in place to govern its collection, use, or storage. No processes were documented. Furthermore, the business had not yet appointed a data protection officer. Once the legal teams began their GDPR preparations, they quickly discovered this data use.


26 Regulatory Initiatives that Will Shape Fintech in Europe and Beyond

In the banking industry’s quest towards open banking, standardisation has become the key to global applicability. There is a consistent push and pull over whether these standards should come from regulators or industry players. On one hand, regulators can future-proof standards by designing them around principles that ensure safety in the ecosystem. On the other hand, industry players may be better suited to producing standards or platforms that encourage innovation and growth of the industry, as they are often instrumental in making it happen. Many of the standards listed below apply only to one region or another, but as the EU’s interchange fee regulation being implemented in Australia shows, there is something to be said for the ripple effect of regulations, particularly when regulators attempt to implement what works in other countries. The following is a list of initiatives, regulations, and standards listed in the World Payments Report 2018 by Capgemini and BNP Paribas.


With cybersecurity threats looming, the government shutdown is putting America at risk

Employees who are considered “essential” are still on the job, but the loss of supporting staff could prove costly in both the short and long term. More immediately, the shutdown places a greater burden on the employees deemed essential enough to stick around. These employees face both longer hours and expanded responsibilities, leading to a higher risk of critical oversight and mission failure as weary agents find themselves stretched beyond their capabilities. The long-term effects, however, are frankly far more alarming. There is a serious possibility that our brightest minds in cybersecurity will consider moving to the private sector following a shutdown of this magnitude. Even setting aside that the private sector pays better, furloughed staff are likely to reconsider just how valued they are in their current roles. After the 2013 shutdown, a significant segment of the intelligence community left their posts for the relative stability of corporate America. The current shutdown carries those risks as well. A loss of critical personnel could result in institutional failure far beyond the present shutdown, leading to cascading security deterioration.


Three reasons why you need to modernise your legacy enterprise data architecture

In the past, most data was of a similar breed: by and large structured and easy to collate. Not so today. Now, some data lives in on-premises databases while other data resides in cloud applications. A given enterprise might collect data that is structured, unstructured, and semi-structured, and the variety keeps widening. According to one survey, enterprises use around 1,180 cloud services, many of which produce unique data. In another example, we integrated over 400 applications for a major enterprise IT firm. Integrating all this wildly disparate data is alone too great a task for legacy systems. Within a legacy data architecture, you often have to hand-code your data pipelines, which then need repairing as soon as an API changes. You might also have to oversee an amalgam of integration solutions, ranging from limited point-to-point tools to bulky platforms that must be nurtured through scripting. These traditional approaches are slow, fraught with complexity, and ill-matched to today’s growing variety of data.
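The brittleness of hand-coded pipelines can be illustrated with a minimal sketch. The record fields and warehouse row format below are hypothetical, not drawn from any particular product; the point is that the transform is hard-wired to one API schema, so a single upstream field rename forces a manual repair:

```python
# A minimal sketch of a hand-coded data pipeline. All field names
# (id, cust_id, total) are hypothetical examples of an API schema.

def transform_order(record: dict) -> dict:
    """Flatten one raw API record into a warehouse row.

    Every field name below is hard-coded. If the upstream API renames
    'cust_id' to 'customer_id', this raises KeyError and the pipeline
    stops until someone edits the code.
    """
    return {
        "order_id": record["id"],
        "customer": record["cust_id"],              # breaks on an API rename
        "total_cents": round(record["total"] * 100),
    }

def run_pipeline(raw_records: list[dict]) -> list[dict]:
    """Apply the hard-coded transform to every incoming record."""
    return [transform_order(r) for r in raw_records]
```

A modern integration platform avoids this trap by mapping schemas declaratively rather than burying field names in hand-written transforms, which is part of the argument for modernising the architecture.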



Quote for the day:


"It is easy to lead from the front when there are no obstacles before you, the true colors of a leader are exposed when placed under fire." -- Mark W. Boyer