Daily Tech Digest - July 02, 2019

TIN coalition calls for industry action against cyber fraud


The vision for overcoming social engineering challenges is to reduce the opportunities to establish false trust and to ensure that all remaining threats are well publicised and understood. The vision also requires organisations to interact with customers and staff in a way that reinforces security, and to ensure that the security of interactions with individuals becomes less dependent on public information. To address operating in silos, the vision is to ensure that cyber fraud is understood across functions within and between organisations, that organisations are recognised for sharing useful information rather than punished for suffering an attack, and that business and law enforcement collaborate effectively to tackle cyber fraud. And to close the gap between cyber security and anti-fraud operations, the vision is to ensure that the response to cyber attacks minimises the broader impact of data loss on society, that fraud teams in business and law enforcement are fully engaged in tackling cyber attacks as a precursor to fraud, and that enforcement is globalised to tackle all forms of cyber fraud.


Big Data Is Dead. Long Live Big Data AI.

“The value of the data analytics market can’t be ignored. The Looker and Tableau acquisitions demonstrate that even the biggest tech players are snapping up data analytics companies with big price tags, clearly demonstrating the value these companies have in the larger cloud ecosystem. And in terms of what this means for the evolution of AI, we’ve reached a point where we have more than enough anonymized data to train the system, and now it’s a matter of honing how we use the AI to extract the maximum value from data”—Amir Orad, CEO, Sisense “The Google Cloud/Looker and Salesforce/Tableau acquisitions are a direct reaction to the rate at which analytics workloads have been shifting to the cloud over the past few years. The state of AI is a reflection of this shift, as machine learning, AI and analytics have become the primary growth opportunities for the cloud today. Yet it's this same growth that is causing barriers to success, as AI projects overwhelmingly face the same problem -- data quality”—Adam Wilson, CEO, Trifacta


What can you do with the Microsoft Graph?


Working with the APIs can be tricky; it can be hard to construct the right query, especially if you're looking for more complex graph queries. Microsoft offers tools to help build and test queries, as well as SDKs that can simplify adding Graph support to your apps. One, the web-based Graph Explorer, allows you to try out queries without logging in to an Office 365 account. It provides sample queries that show how to extract specific information from the service, with a library of different queries to get started. You can only use GET queries against sample data; POST requires your account details and your data. Once you're ready to start working with live data, you can log in with a Microsoft account and start using your Microsoft 365 tenant. The list of query categories is long, covering working with users, with mail and calendar, as well as files and apps. The Graph Explorer doesn't only show production queries; it also supports beta APIs, so you can experiment before adding them to your code. Queries can be cut and pasted from the Explorer, and you can see any request headers or bodies that need to be constructed and delivered with the REST HTTP query.
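A query copied from the Graph Explorer translates into an ordinary authenticated REST call. A minimal sketch, assuming you already have an OAuth access token from your tenant (the helper names here are ours, not part of any SDK):

```python
import json
import urllib.parse
import urllib.request

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def build_graph_url(resource, params=None):
    """Construct a Microsoft Graph query URL from a resource path
    and optional OData query parameters such as $top and $select."""
    url = f"{GRAPH_BASE}/{resource.lstrip('/')}"
    if params:
        url += "?" + urllib.parse.urlencode(params)
    return url

def get_graph(resource, access_token, params=None):
    """Issue a GET against Microsoft Graph; needs a valid bearer token."""
    req = urllib.request.Request(
        build_graph_url(resource, params),
        headers={"Authorization": f"Bearer {access_token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example: the ten most recent messages, returning only two fields
url = build_graph_url("me/messages", {"$top": 10, "$select": "subject,from"})
# get_graph("me/messages", token, {"$top": 10, "$select": "subject,from"})
```

The same URL works unchanged in the Graph Explorer, which is a convenient way to verify a query before wiring it into code.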



Offensive Security launches OffSec Flex, a new cybersecurity training program

Organizations can now use OffSec Flex to purchase blocks of Offensive Security’s industry-leading practical, hands-on training, certification and virtual lab offerings, allowing them to proactively increase and enhance the level of cybersecurity talent available within their organizations. With Offensive Security’s hands-on courses, labs and exams readily available, organizations are able to offer educational opportunities to new hires and non-security team members alike, improving their security posture and equipping their employees with the adversarial mindset necessary to protect modern enterprises from today’s threats. “Cybersecurity training is not just for security professionals anymore,” said Kerry Ancheta, VP of Worldwide Sales, Offensive Security. “Increasingly we see organizations recommend pentest training courses for their software development or application security teams in order to improve their understanding of how their systems and applications are attacked.”


Calculating The Cost of Software Quality in Your Organization


Basically, the costs of software quality (COSQ) are those costs incurred through both meeting and not meeting the customer’s quality expectations. In other words, there are costs associated with defects, but producing a defect-free product or service has a cost as well. Calculating these costs serves the purpose of identifying just how much the organization spends to meet the customer’s expectations, and how much it spends (or loses) when it does not.  Knowing these values allows management and team members across the company to take action in ensuring high quality at a lower cost. While analyzing the COSQ at an organization may lead to the revelation of uncomfortable truths about the state of quality management at the company, the process is important for eliminating waste associated with poor quality. This often requires a mindset and culture shift from viewing software quality defects as individual failures to seeing them as opportunities to improve as a collective team.
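The two-sided definition above maps onto the classic cost-of-quality breakdown: prevention and appraisal costs buy conformance, while internal and external failure costs are the price of nonconformance. A toy calculation (the figures and category names are illustrative, not from the article):

```python
def cost_of_software_quality(prevention, appraisal, internal_failure, external_failure):
    """Total COSQ = cost of achieving quality (conformance)
    plus cost of poor quality (nonconformance)."""
    conformance = prevention + appraisal                   # training, reviews, testing
    nonconformance = internal_failure + external_failure   # rework, incidents, support
    return {
        "conformance": conformance,
        "nonconformance": nonconformance,
        "total": conformance + nonconformance,
    }

# Illustrative quarterly figures, in USD
cosq = cost_of_software_quality(
    prevention=40_000,        # coding standards, developer training
    appraisal=60_000,         # test engineering, code review time
    internal_failure=30_000,  # defects caught and fixed before release
    external_failure=70_000,  # production incidents, customer support
)
```

Tracking the split over time shows whether spending more on prevention and appraisal is actually shrinking the failure costs.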


Machine learning has been used to automatically translate long-lost languages


It’s not hard to imagine that recent advances in machine translation might help. In just a few years, the study of linguistics has been revolutionized by the availability of huge annotated databases, and techniques for getting machines to learn from them. Consequently, machine translation from one language to another has become routine. And although it isn’t perfect, these methods have provided an entirely new way to think about language. Enter Jiaming Luo and Regina Barzilay from MIT and Yuan Cao from Google’s AI lab in Mountain View, California. This team has developed a machine-learning system capable of deciphering lost languages, and they’ve demonstrated it by having it decipher Linear B—the first time this has been done automatically. The approach they used was very different from the standard machine translation techniques. First, some background. The big idea behind machine translation is the understanding that words are related to each other in similar ways, regardless of the language involved.
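That "related in similar ways" idea is geometric: in a word-embedding space, the offset between related word pairs tends to point in the same direction, which is what lets spaces from different languages be aligned. A toy sketch with made-up 2-D vectors (real embeddings have hundreds of dimensions and are learned from corpora):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors: 1.0 means parallel."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy embeddings chosen so the king->queen offset mirrors man->woman
emb = {
    "king":  [0.9, 0.8],
    "queen": [0.9, 0.2],
    "man":   [0.5, 0.8],
    "woman": [0.5, 0.2],
}

offset_royal  = [k - q for k, q in zip(emb["king"], emb["queen"])]
offset_gender = [m - w for m, w in zip(emb["man"], emb["woman"])]
sim = cosine(offset_royal, offset_gender)  # parallel offsets give similarity near 1.0
```

Because those relational offsets recur across languages, a mapping between two embedding spaces can be estimated even without a bilingual dictionary, which is the intuition the lost-language work builds on.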


The Agile Manifesto: A Software Architect's Perspective

Specifications with an architectural impact (in the form of new user stories) should be tracked by the architect and assessed pragmatically by the whole development team, including experienced developers, test engineers, and DevOps engineers. Bad habits from the past, when the architect created the full-blown technical design on paper for the team, do not fit within modern agile environments. There are multiple flaws with this model, which I have also faced in my day-to-day work. First and most important, the architect might be wrong. This happened to me after I created a detailed upfront technical design and presented it to the development team during sprint refinements. I got questions about cases I had not thought about or had failed to take into account. In most cases, it turned out the initial design was either incomplete or impractical, and required extra work. Big upfront design limits the creativity and autonomy of the team members, since they must follow a recipe that has already been handed down. From a psychological standpoint, even the author might become biased and more reluctant to change the design afterwards, trying to prove it is correct rather than admit its flaws.


Essential tips for scaling quality AI data labeling


Data scientists are using labeled data and natural language processing (NLP) to automate legal contract review and predict patients who are at higher risk of chronic illness. The success of these systems depends on skilled humans in the loop, who label and structure the data for machine learning (ML). High-quality data yields better model performance. When data labeling is low quality, an ML model will struggle to learn. According to a report by analyst firm Cognilytica, about 80 percent of AI project time is spent on aggregating, cleaning, labeling, and augmenting data to be used in ML models. Just 20 percent of AI project time is spent on algorithm development, model training and tuning, and ML operationalization. These tasks are at the heart of AI development and require strategic thinking, along with a more advanced set of engineering or computer science skills. It’s best to deploy more expensive human resources — such as data scientists and ML engineers — on tasks that require expertise, collaboration, and analytical skills.


Effective or Not? The Real Impact of GDPR


The General Data Protection Regulation wasn’t just meant to give governments the means to enforce data security rules. Another key objective was to change how both companies and users behave when it comes to ensuring personal data remains private and protected. In this sense, GDPR seems to have had the desired impact. ... Another interesting fact the data shows is that users may have shifted some of their own responsibility onto GDPR enforcers. Two indicators led to this observation: respondents are less likely to read privacy statements than they were in 2015 (down 7 percentage points), and 17% say it is enough for them to see that the website has a privacy policy, so they choose not to read the document at all. A similar behavior pattern emerges with social media usage. Fewer users – 56% in 2019 vs 60% in 2015 – actually change the privacy settings on their personal profile. The three most common reasons social network users give for not trying to change their personal profile’s default settings are that they trust the sites to set appropriate privacy settings (29%), that they do not know how to (27%), or that they are not worried about sharing their personal data (20%).


5 steps for digital workplace transformation


Start by recognizing actionable opportunities within your business operations. Approach the prospects for digital transformation from a business instead of technology perspective. Line-of-business (LOB) teams should lead this effort, coordinating closely with senior IT staffers to identify critical barriers to success. Of course, each organization faces its own set of challenges. But, at the outset, step back and identify key themes -- accelerating innovation, enhancing productivity, improving governance or reshaping the steps in the customer journey -- that make good business sense. Consider operations as a whole, while focusing on people and processes, and determine your target audiences: employees, partners and/or customers. Then, engage a cross section of these audiences in conversations about what they are doing and how they understand the underlying business purposes. Develop both the technology and the business insights about what is happening from the participants' perspectives. Listen carefully as they describe their tasks, and be sure to observe how they do their work to determine where bottlenecks occur.



Quote for the day:


“The real voyage of discovery consists not in seeking new landscapes but in having new eyes.” -- Marcel Proust


Daily Tech Digest - July 01, 2019

Automation Is Becoming A C-Suite Priority

While automation maturity is at its highest in the US, with over 60% of organizations making extensive use of automation, there are some interesting findings from India. The country shows the highest level of enthusiasm about automation among CIOs and other senior executives: 84% believe RPA is a high or essential priority for meeting strategic business objectives at Indian businesses, against the global average of 76%. Also, 90% of C-level executives expect their company’s financial results – namely profitability, operating costs and revenue growth – to improve as a result of automation. Sector-wise, IT and manufacturing have outpaced other industries in automating business processes. By contrast, government and public sector institutions have made the least headway among surveyed sectors. Of CIOs who have implemented automation, most have automated highly repetitive back-office functions. “Automation of functions is most extensive in IT, operations and production, customer service and finance.”



FTC data privacy enforcement will threaten corporate bottom lines


Despite the mounting concerns over data security and privacy practices that put consumers’ data at risk, the U.S. Congress still has yet to adopt national legislation to address cybersecurity, and security spending will see a nominal increase given the current administration’s recent budget proposal. Consequently, organizations are subject to a patchwork of laws and regulations relevant to cybersecurity and privacy practices, including differing laws and regulations in each state and the District of Columbia, as well as from multiple federal administrative agencies. Therefore, the FTC has taken a comprehensive directive to extend its supervision over all companies operating in the United States. In fact, the FTC has assumed a leading role in policing corporate cybersecurity practices since 2002. Since that time, it has brought more than 200 cases against companies for unfair or deceptive practices that endangered the personal data of consumers.



Huge jump in cyber incidents reported by finance sector


Overall, Snaith said there remain serious vulnerabilities across some financial services businesses when it comes to the effectiveness of their cyber controls. “More needs to be done to embed a cyber resilient culture and ensure effective incident reporting processes are in place,” he said. UK law enforcement is also calling for improvements in cyber crime reporting. “It is crucial that businesses report cyber crime to us because every incident is an investigative opportunity,” Rob Jones, director of threat leadership at the UK National Crime Agency (NCA), told Computer Weekly. “Failure to report creates an unpoliced space and a situation where incident response companies just sweep up the glass, but don’t deal with the underlying issue, which emboldens criminals,” he said. “As a result, the problem will continue and the prevalence, severity and sophistication of attacks will increase.” Nigel Hawthorn, data privacy expert at security firm McAfee, said that it is widely recognised that cyber incidents were previously under-reported.


How does the CVE scoring system work?

The first thing to understand is that there are three types of metrics used in this system: Base Score Metrics, which depend on sub-formulas for the Impact Sub-Score (ISS), Impact, and Exploitability; Temporal Score Metrics, equal to a roundup of BaseScore * ExploitCodeMaturity * RemediationLevel * ReportConfidence; and Environmental Score Metrics, which depend on sub-formulas for the Modified Impact Sub-Score (MISS), ModifiedImpact, and ModifiedExploitability. The formula for the MISS is Minimum(1 - [(1 - ConfidentialityRequirement * ModifiedConfidentiality) * (1 - IntegrityRequirement * ModifiedIntegrity) * (1 - AvailabilityRequirement * ModifiedAvailability)], 0.915). Within each set of metrics are the following sub-categories: Base Score Metrics: Attack Vector, Attack Complexity, Privileges Required, User Interaction, Scope, Confidentiality Impact, Integrity Impact, Availability Impact; Temporal Score Metrics: Exploitability, Remediation Level, Report Confidence; and Environmental Score Metrics: Attack Vector, Attack Complexity, Privileges Required, User Interaction, Scope, Confidentiality Impact, Integrity Impact, Availability Impact, ...
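The MISS formula quoted above translates directly into code. A minimal sketch, using the numeric weights the CVSS v3.x specification assigns to the metric values (e.g. 0.56 for a High modified impact, 1.0 for a Medium security requirement):

```python
def modified_impact_subscore(cr, ir, ar, mc, mi, ma):
    """CVSS v3.x Modified Impact Sub-Score (MISS):
    min(1 - (1 - CR*MC) * (1 - IR*MI) * (1 - AR*MA), 0.915)."""
    return min(
        1 - (1 - cr * mc) * (1 - ir * mi) * (1 - ar * ma),
        0.915,
    )

# High (0.56) modified impacts, requirements left at Medium (1.0):
miss = modified_impact_subscore(1.0, 1.0, 1.0, 0.56, 0.56, 0.56)
# 1 - (1 - 0.56)**3 = 0.914816, just under the 0.915 cap
```

The 0.915 cap is why raising the requirement weights to High (1.5) cannot push the sub-score above that ceiling.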


AI is changing the entire nature of compute


"Hardware capabilities and software tools both motivate and limit the type of ideas that AI researchers will imagine and will allow themselves to pursue," said LeCun. "The tools at our disposal fashion our thoughts more than we care to admit." It's not hard to see how that's already been the case. The rise of deep learning, starting in 2006, came about not only because of tons of data, and new techniques in machine learning, such as "dropout," but also because of greater and greater compute power. In particular, the increasing use of graphics processing units, or "GPUs," from Nvidia, led to greater parallelization of compute. That made possible the training of vastly larger networks than in the past. The premise offered in the 1980s of "parallel distributed processing," where nodes of an artificial network are trained simultaneously, finally became a reality. Machine learning is now poised to take over the majority of the world's computing activity, some believe. During that ISSCC in February, LeCun spoke to ZDNet about the shifting landscape of computing.


SOLID Principles: Interface Segregation Principle (ISP)

A great, simple definition of the Interface Segregation Principle was given in a book you have probably already heard of, “Agile Principles, Patterns, and Practices in C#”: “The Interface Segregation Principle states that clients should not be forced to depend on methods they do not use.” ... Here is an interesting historical note about the ISP. The idea may well have been applied long before, but the first public formulation belongs to Robert C. Martin, who first applied the ISP while consulting for Xerox. Xerox had created a new printer system that could perform a variety of tasks, such as stapling and faxing. The software for this system was created from the ground up. As the software grew, making modifications became more and more difficult, so that even the smallest change required a redeployment cycle of an hour, which made development nearly impossible. The redeployment cycle took so long because languages like C# and Java, which compile quickly, did not yet exist; with C++, by contrast, a badly designed program can lead to significant compilation times.
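The principle itself is easy to show in code. A minimal sketch using the Xerox-style multifunction device as the illustration (the class and method names here are ours, not from the book):

```python
from abc import ABC, abstractmethod

# Instead of one fat device interface that forces every client to
# depend on methods it never calls, ISP splits it by client need.
class Printer(ABC):
    @abstractmethod
    def print_doc(self, doc): ...

class Stapler(ABC):
    @abstractmethod
    def staple(self, doc): ...

class SimplePrinter(Printer):
    # Depends only on printing; no dead staple() stub required.
    def print_doc(self, doc):
        return f"printed: {doc}"

class MultiFunctionDevice(Printer, Stapler):
    # Richer hardware implements both narrow interfaces.
    def print_doc(self, doc):
        return f"printed: {doc}"

    def staple(self, doc):
        return f"stapled: {doc}"
```

A client that only prints depends only on `Printer`, so changes to stapling never force it to be recompiled or redeployed, which was exactly the pain at Xerox.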


Mist Wi-Fi no longer just cloud


The Mist Edge hardware appliance avoids having access points (APs) in each office on a campus communicate directly with Mist's WxLAN technology in the cloud. Instead, WxLAN policies created through Mist's cloud-based dashboard are stored in the on-premises appliance. WxLAN policies assign resources, such as servers and printers, to groups of users. Network managers can also create a service set identifier for a select group of users and assign services or devices that only they can access. Mist Edge is available only as a stand-alone appliance. Mist plans to ship the appliance's software on a virtual machine this year. Mist Edge reflects the preference of some enterprises to split management technology between the cloud and on premises. Companies more comfortable with an on-site WLAN controller, for example, could switch to Mist Edge, said Brandon Butler, an analyst at IDC. "Overall, we see more and more enterprises gaining comfort with managing their WLAN environments from the cloud but giving customers a choice in how to manage their environments is always good," he said.


Beyond Limits: Rethinking the next generation of AI

A human profile containing digital wireframe of technology connections.
Beyond Limits evolved out of work with NASA's Jet Propulsion Laboratory (JPL) for remote rovers used to explore places like the moon and Mars. Due to the communications lag in space, real-time control is virtually impossible. Any AI solution must not only be fully autonomous, it must be able to train and, ideally, correct itself. When there is a problem it can’t correct, the bandwidth limitations for communication make full reprogramming problematic, but point patches are certainly possible. This resulted in an AI platform uniquely able to be updated, modified and, to a certain and initially limited extent, able to both teach itself and make corrections while disconnected. This unusual requirement has likely made the resulting AI nearly ideal for areas where it must often act independently of oversight, or where problems can escalate very rapidly and the AI must be able to deal with a diversity of known and unknown issues. ... Although still in its infancy, Beyond Limits represents a new class of AI. It is better able to operate fully autonomously, and it can both learn on the fly and increasingly make corrections to its own programming.


HPE promises 100% reliability with its new storage system

Primera was announced last week at HPE’s Discover event in Las Vegas. Phil Davis, chief sales officer for HPE, said in the announcement keynote, “If you think about traditional storage, it’s full of compromises and complexity. Do I want fast or reliable? Do I want agility or simplicity? But not any more. We’re going to combine the simplicity of Nimble with the intelligence of InfoSight and the mission-critical heritage of 3PAR, and we’ve created a new class of storage that eliminates the traditional compromises and truly redefines what is possible with storage.” Davis said Primera will run out of the box with just a few cable connections and can be auto-provisioning storage within 20 minutes. That means no need for IT consultants to install and configure the hardware. The more workloads you add to a storage system, the more unpredictable latency becomes. Using InfoSight’s parallelism, Primera improved the throughput and latency of an Oracle database by 122% over the prior storage system, which HPE did not identify.


Using AI-powered intelligent automation for digital transformation success

A maturity model assessment begins with evaluating automation readiness from a technology and process perspective. IT should be involved in the discussion early on because they understand how automation technologies will fit within the larger IT framework. They’re also responsible for managing the environments that these technologies operate in and for ensuring proper security protocols are followed throughout the deployment process. From a business process and operations standpoint, organizations should assess how well-documented current processes are during this stage. If there’s room to improve prior to automation, this presents an opportunity to make upfront investments in this respect. Automation is most powerful when deployed against processes that are already running properly; it isn’t intended to ‘fix’ or alleviate the pain points around broken processes. In other words, optimize first and then automate for the best results.



Quote for the day:


"One of the sad truths about leadership is that, the higher up the ladder you travel, the less you know." -- Margaret Heffernan


Daily Tech Digest - June 30, 2019

How a quantum computer could break 2048-bit RSA encryption in 8 hours


Shor showed that a sufficiently powerful quantum computer could do this with ease, a result that sent shock waves through the security industry. And since then, quantum computers have been increasing in power. In 2012, physicists used a four-qubit quantum computer to factor 143. Then in 2014 they used a similar device to factor 56,153. It’s easy to imagine that at this rate of progress, quantum computers should soon be able to outperform the best classical ones. Not so. It turns out that quantum factoring is much harder in practice than might otherwise be expected. The reason is that noise becomes a significant problem for large quantum computers. And the best way currently to tackle noise is to use error-correcting codes that require significant extra qubits themselves. Taking this into account dramatically increases the resources required to factor 2048-bit numbers. In 2015, researchers estimated that a quantum computer would need a billion qubits to do the job reliably. That’s significantly more than the 70 qubits in today’s state-of-the-art quantum computers.
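The quantum speedup in Shor's algorithm lies in finding the period r of a^x mod N; once r is known, the factors fall out with ordinary arithmetic. A purely classical toy sketch of that idea, using the 143 factored on quantum hardware in 2012 (function names are ours; a real implementation must also handle the case where a^(r/2) is congruent to -1 mod N):

```python
from math import gcd

def find_period(a, n):
    """Brute-force the multiplicative order r of a mod n, i.e. the
    smallest r with a**r % n == 1. This is the step a quantum
    computer performs exponentially faster."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_from_period(a, n):
    """Classical post-processing: for an even period r, the factors
    of n appear as gcd(a**(r//2) - 1, n) and gcd(a**(r//2) + 1, n)."""
    r = find_period(a, n)
    if r % 2 == 1:
        return None  # odd period: retry with a different base a
    y = pow(a, r // 2, n)
    return gcd(y - 1, n), gcd(y + 1, n)

factors = factor_from_period(2, 143)  # (11, 13)
```

The brute-force loop is exactly what blows up classically: for a 2048-bit N the period is astronomically large, which is why only the quantum period-finding step threatens RSA.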



How Urbanhire is disrupting HR in Indonesia

Specifically, the hiring platform allows companies to post jobs across more than 50 portals, including Google, LinkedIn and Line - a freeware app which became Japan’s largest social network in 2013. Tapping into a pool of more than one million active jobseekers, the software-as-a-service (SaaS) follows a “data-driven hiring strategy”, aligning businesses to a four-step digital strategy of “source, assess, recruit and on-board”. Three years since launching, key customers include global brands such as AIA, Zurich and The Body Shop, in addition to Indonesian organisations like Danamon, Pertamina and Djarum. “Indonesia is a fantastic opportunity given where it is at from a growth perspective,” Kamstra added. “As a tech entrepreneur, I love the fact that we can use business models that have been successful in more developed countries without a lot of the baggage that comes with historical tech implementations that are no longer sufficient. “I love to use the telecom industry as an example. Indonesia was able to go from little infrastructure to a very modern one by not having gone through all the investment steps that countries like the US were forced to do as pioneers.”


Don't Miss These 10 Cybersecurity Blind Spots

When an employee is terminated, it’s important to shut down their access to all work-related accounts — immediately. Ideally, you might want to try to automate as much of the account-termination process as possible and ensure that the process covers all accounts for all employees. This can be easier said than done, but it's important to get a process or automated solution nailed down before that employee's access causes an unwanted breach. ... Any application that uses third-party software components, including open-source components, takes on the risk of potential vulnerabilities in those dependencies. These vulnerabilities should be identified, tracked and accounted for in the same way as every other software component. ... Service accounts are used by machines, and user accounts are used by humans. The trouble with service accounts is that sometimes they have access to a lot of different systems, and their passwords aren’t always managed well. Poorly managed passwords make for easy compromise by attackers.
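The account-termination automation described above amounts to a loop over every system a user can touch, with failures surfaced rather than swallowed. A minimal sketch; the system names and disable calls are hypothetical stand-ins for whatever directory and SaaS APIs an organization actually uses:

```python
def offboard(user, systems):
    """Disable a departing user's access across every registered system.
    `systems` maps a system name to a disable function."""
    results = {}
    for name, disable in systems.items():
        try:
            disable(user)
            results[name] = "disabled"
        except Exception as exc:
            # Surface failures instead of silently leaving access open
            results[name] = f"FAILED: {exc}"
    return results

# Toy in-memory "systems" standing in for real directory/SaaS calls
active = {"vpn": {"jdoe"}, "email": {"jdoe"}, "wiki": {"jdoe"}}
systems = {name: (lambda u, s=accts: s.discard(u)) for name, accts in active.items()}
report = offboard("jdoe", systems)
```

Keeping the registry of systems in one place is the hard part in practice; any account store missing from it is exactly the blind spot the article warns about.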


Business needs to see infosec pros as trusted advisers

The first issue clouding communication between security professionals and the board or senior business leaders is the misunderstanding that IT risk is separate from business risk. Nothing could be further from the truth, especially considering that in most organisations today, the separation between what is IT and what is business is hard to identify because technology is the backbone of everything the business does. The second issue relates to how the message is packaged. Is the language full of technical jargon, or is it simple to understand and gets the message across in business terms? Does it highlight the loss to the business in terms understood by the board and senior business leaders? Take the example of when business downtime is required when a patch needs to be applied. Instead of talking in terms of the technical threats and the outcomes of poor patching, security professionals would be more effective explaining it in terms of loss to the business, such as lost opportunities or losses from an attack that may occur because of the unpatched status.


MongoDB CEO explains where the company has an edge over database giant Oracle


Cramer noted that Oracle, which has a nearly $195 billion market cap, has recently bought back billions in stock and has a big war chest. Despite that, MongoDB's architecture sets the younger company apart, Ittycheria said. The firm's database is built for the modern world, he added. "[Oracle] built an architecture designed in the late '70s for the world then, and they just tried to make it better over time," he said. "We built an architecture design for today's high performance mobile cloud computing world." Ittycheria explained how MongoDB helped Cisco address an order management application issue in which they receive tens of billions of orders from different sales channels a year. The platform serves more than 14,000 customers, including some of the most "sophisticated, demanding customers in the world." The list ranges from big media to telecom to gaming media to financial services, he said. Start-ups are also developing their business on MongoDB, Ittycheria said.


Fix your cloud security

Enterprises are either not willing to use the right technology, or they don’t understand that the technology exists. It’s not that the database is unencrypted, it’s that nobody has any idea how to turn on encryption in flight or at rest. Also at fault are the “it was not on-premises” folks out there. They cling to the fact that since some security feature was not a part of the original on-premises system, it shouldn’t be needed in the cloud. The time to deal with security issues is when you move from on-premises to the public cloud. You need to spend at least a couple of weeks looking at identity access management, encryption, auditing, proactive security, and more, and then evaluating its viability to your enterprise. Otherwise, you could miss the cloud security boat as you make the migration.  In my opinion, this is the single most important step in migration. It allows you to reflect on what your security needs really are and how to solve them using cloud computing technology which, these days, is better than anything you can find on-premises.


Can Apple compete on privacy?


Apple's privacy campaign has already had an impact in terms of forcing the competition to pay closer attention to their disclosure and controls. It is unlikely to move the needle in terms of market share, but Apple can only gain as awareness of the great data tradeoff of targeted advertising grows and missteps in executing it continue. It should also be more effective as a retention tool for anyone who has not already been locked into Apple's milieu through its self-reinforcing portfolio of devices and growing family of services. Furthermore, while the smartphone market is mature, whatever challenges it as an emerging platform will likely raise even more profound privacy concerns. Already, wearables measure our pulse and assess whether we've fallen, and the kind of personal data that could be generated by measuring exactly what you're looking at via augmented reality gear could make smartphone-generated data seem crude by comparison. And there's another potential benefit to Apple's privacy campaign, one that the company has developed since it first stepped up its advocacy.


Serverless: applications only when you need them - no more, no less

Traditional IT architectures use a server infrastructure, whether on-premises or cloud-based, that requires managing the systems and services required for an application to function. The application must always be running, and the organization must spin up other instances of the application to handle more load which tends to be resource-intensive. Serverless architecture focuses instead on having the infrastructure provided by a third party, with the organization only providing the code for the applications broken down into functions that are hosted by the third party. This allows the application to scale based on function usage and is more cost-effective since the third-party charges for how often the application uses the function, instead of having the application running all the time. ... Serverless computing is constrained by performance requirements, resource limits, and security concerns, but excels at reducing costs for compute. That being said, where feasible, one should gradually migrate over to serverless infrastructure to make sure it can handle the application requirements before phasing out the legacy infrastructure.
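The "code broken down into functions" model can be sketched as a single handler in the AWS Lambda style; the platform invokes it per request and bills per invocation, with no always-on server process of yours in between. The event shape below is illustrative:

```python
import json

def handler(event, context=None):
    """Entry point the platform calls for each request.
    Nothing of ours runs between invocations."""
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a fake event, as the platform would deliver it
resp = handler({"queryStringParameters": {"name": "dev"}})
```

Because each invocation is independent, the platform can scale the function out under load and scale it to zero when idle, which is where the cost advantage over an always-running application comes from.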


Four Myths of Digital Transformation: What Only 8% of Companies Know


New research by Bain & Company finds that only 8% of global companies have been able to achieve their targeted business outcomes from their investments in digital technology. Said another way, more than 90% of companies are still struggling to deliver on the promise of a technology-enabled business model. What secret formula do the 8% deploy? Unsurprisingly, there are no shortcuts or silver bullets. But successful transformations do share some common themes. One of the most important is understanding that this is really a business transformation, supported by investments in new technology—not new technology in search of opportunities. Many executives pay lip service to this idea, but in practice, they delegate too much responsibility to the tech team, hoping the business can watch from the sidelines. At the 8%, executive teams understand that the core of a digital transformation is a business transformation, changing the way of engaging customers across channels, simplifying business processes, and redesigning products or services.


“We need to up our game”—DHS cybersecurity director on Iran and ransomware

Both the Iranian malicious activities and ransomware attacks depend largely on exploiting the same sorts of security issues. Both rely on the same tactics: malicious attachments, stolen credentials, or brute-force credential attacks to gain a foothold on targeted networks, then typically using readily available malware to leverage those credentials and move across a network. When asked if the recent ransomware attacks on cities across the US (including three recent attacks in Florida with dramatically larger ransom demands) were indicative of a new, more targeted set of campaigns against US local governments, Krebs said that the attacks were likely not targeted—at least not initially. "I still think these [ransomware campaigns] are fairly expansive efforts, where [the attackers] are initially scanning, looking for certain vulnerabilities, and when they find one that's when they start to target," he said. "Again, I'm not sure we have the information right now saying they were specifically targeted."



Quote for the day:


"Leaders stuck in old cow paths are destined to repeat the same mistakes. Change leaders recognize the need to avoid old paths, old ideas and old plans." -- Reed Markham


Daily Tech Digest - June 29, 2019

India gears up for historic data protection law

India is getting ready for the law after the Narendra Modi government listed it as one of the bills in the Parliament last week, in the first session after the general elections. The election in April gave Modi a second term with a massive majority. The bill will create a legal regime for how data can be shared, stored and used in India. The proposed law, once passed by Parliament, will have major consequences for technology companies hoping to build businesses that access user data. Most technology companies like Google, Facebook and others now thrive on data generated by their users to earn billions of dollars worldwide. Mukesh Ambani, who heads Reliance Industries, India’s biggest firm working in the energy, telecommunications and retail space, pointed out in January this year that “data is the new oil”. However, while the cabinet is yet to clear the final draft of the data protection bill, technology and privacy experts are concerned about the implications of the proposed new law. Many companies, including the technology giants hoping to tap into India’s massive markets, are apprehensive about what this entails for their core business models.



“You have to be careful when there is sensitivity around personal data,” Kampman said. Whether it’s AI or any identity-related effort, “you need governance over this to be clear about what can be used and what can’t be used for a given purpose. You are a custodian of data and when you aggregate that data your responsibilities increase exponentially.” Broadly, the looking-before-leaping paradigm is in full force here. As government IT leaders and their business-line peers seek to better manage access and identity in an emerging cloud-driven enterprise, they’ll need to be thoughtful not just about the how, but about the why behind their efforts. “There needs to be a strategy,” Kampman said. “What is the outcome going to be? The technology world can solve these problems but it needs to be done with a viewpoint toward how it will appear to the end user. You want to have control over the technologies but you also want all the stakeholders to have an opportunity to contribute toward governance.”



Google has more deep data knowledge than any company in the world, and it is no slouch in the discipline of design. It’s only natural that the company would combine this expertise. Initially, the audience for the new data design guidelines was Google itself, but much as it did for Material Design, the company decided to publicize these best practices and encourage others to adopt them—anyone from app developers to everyday people who are left wondering why their PowerPoint chart sucks. “We started doing this internally as a way to guide [employees] through the do’s and don’ts of chart creation,” Lima tells Fast Company. “After conducting various research studies and partnering with teams across the company, the do’s and don’ts evolved into a set of high-level principles that were strongly rooted in Google-wide tenets crucial to the company’s growth, brand, and culture. These principles are meant to be generative and not prescriptive. We hope they can help any chart creator during ideation and evaluation.” The six principles read something like an introductory data design course.


Blockchain Technology: Enabling Enterprise Innovation

The most satisfying finding from Deloitte is that business leaders are taking blockchain as seriously as we’d hoped. Deloitte found that “53 percent of respondents say that blockchain technology has become a critical priority for their organizations in 2019—a 10-point increase over last year.” That more than half of the respondents name blockchain technology as a critical priority is, in my eyes, the first tremor in what promises to be a substantial shake up of the business technology landscape. Accordingly, when the authors report that many leaders are focusing less on whether blockchain works (spoiler: it does) and more on what business models it might disrupt, they quote Deloitte Consulting LLP Principal Linda Pawczuk, Deloitte consulting leader for blockchain and cryptocurrency. She says, “We believe executives should no longer ask a single question about blockchain but, rather, a broad set of questions reflecting the role blockchain can play within their organization.”


Here Is A Look At Where Fintech Is Leading Us And Why


The business consultancy powerhouse reports that there is an abundance of fintech enterprises entering the market using novel business models and delivering fresh consumer offerings. Furthermore, says E&Y, the emerging fintech revolution is driving information sharing and the development of open-source Application Programming Interfaces (APIs) as well as recent technological breakthroughs in artificial intelligence (AI) and biometrics. Around the world, lawmakers are following the example of Europe by promoting open-access APIs. By doing so, legislators desire to enhance consumer choice by increasing competition between banks and fintech enterprises. For new fintech firms, open-source APIs streamline the launch of new products and services and decrease costs customarily used for research and development. New fintech banks that build their organization around a digital business model represent the fastest growing segment of startups nurtured by this movement.


Image Classification Using Neural Networks in .NET

Image classification is one of the most common use cases for non-recurrent neural networks. The basic concept is that a neural network is given an input image, and its input layer has the same number of neurons as there are pixels in the image (assuming the image is grayscale). Similarly, depending on the number of classifications to be made, the network should have a matching number of output neurons. The neural network could use convolutional layers, fully connected layers, or a combination of both. Convolutional networks are faster because they downsample the input image, convolving it with multiple kernels to extract the important features. More details on convolution can be found here. Convolution greatly reduces the size of the fully connected layers that classify the image after a series of convolutions and pooling. Because a network using typical activation functions can only accept inputs and produce outputs as values ranging from 0 to 1, feeding an image into the network requires some pre-processing on the input end to normalize the pixel values into this form.
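The normalization step mentioned above can be sketched in plain Python (a real pipeline would use NumPy or a framework's own data loaders; the function name here is illustrative):

```python
# Sketch of the pre-processing step: 8-bit grayscale pixel values
# (0-255) are mapped to floats in [0, 1] before being fed to the
# network's input layer, one value per input neuron.

def normalize_pixels(pixels):
    """Map 8-bit grayscale values to the 0.0-1.0 range the network expects."""
    return [p / 255.0 for p in pixels]

# A tiny 2x2 "image" flattened into one input vector.
image = [0, 64, 128, 255]
inputs = normalize_pixels(image)
print(inputs)  # approximately [0.0, 0.251, 0.502, 1.0]
```

The same idea applies in reverse on the output side: the network's 0-to-1 output activations are interpreted as per-class scores.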


Fortune 100 passwords corporate secrets left exposed on unsecured Amazon S3 server


Some of the world’s biggest companies have had 750GB worth of their innermost secrets revealed on unsecured Amazon S3 buckets, available for anybody to download – no password required. The startling revelation came from researchers at UpGuard, who discovered three publicly accessible Amazon S3 buckets related to Attunity, a leading provider of data integration and big data management software solutions, on May 13th 2019. The fact that Attunity is at the centre of the security breach is a concern, simply because of its impressive list of customers. On its website, the company boasts that it counts more than 2,000 enterprises and half the Fortune 100 in its customer base. According to screenshots published on UpGuard’s blog, Fortune 100 companies such as Netflix, Ford, and TD Bank were amongst those who had their data recklessly exposed. For instance, the researchers discovered files containing the usernames and passwords of Netflix database systems, and internal Ford presentations.



NotPetya Retrospective

Each of the companies impacted by NotPetya (and WannaCry before it) had some degree of security protection in place—the usual stuff like firewalls, antivirus, and patch management. That defense obviously wasn’t perfect or the attack would have been thwarted, but a perfect defense costs $∞ and is therefore impractical. As we deal with the realities of an imperfect defense, it becomes necessary to choose between preventative and reactive measures. Security expert Bruce Schneier makes the point on his resilience tag: ‘Sometimes it makes more sense to spend money on mitigation than it does to spend it on prevention.’ An investment in mitigation can also pay off in all kinds of ways that have nothing to do with attacks: that change that was just accidentally made to production when it should have been in test—fixed in seconds, by reverting to the last snapshot. NotPetya is unlikely to keep its ‘most devastating cyber attack’ title for long. There will be another attack, and we should expect it to be worse. Moving away from a trusted network model to a zero-trust model is the most effective way to defend against such attacks. But, effort should also focus on measures that allow speedy recovery.


Managing Machine Learning Models The Uber Way

With access to the rich dataset coming from the cabs, drivers, and users, Uber has been investing in machine learning and artificial intelligence to enhance its business. Uber AI Labs consists of ML researchers and practitioners that translate the benefits of the state of the art machine learning techniques and advancements to Uber’s core business. From computer vision to conversational AI to sensing and perception, Uber has successfully infused ML and AI into its ride-sharing platform. Since 2017, Uber has been sharing the best practices of building, deploying, and managing machine learning models. Some of the internal tools and frameworks used at Uber are built on top of popular open source projects such as Spark, HDFS, Scikit-learn, NumPy, Pandas, TensorFlow and XGBoost. Let’s take a closer look at Uber’s projects in the ML domain. Michelangelo is a machine learning platform that standardized the workflows and tools across teams through an end-to-end system. It enabled developers and data scientists across the company to easily build and operate machine learning systems at scale.


Are You Choosing Fintech—or Is Fintech Choosing You?


The type of financial technology solutions that are best suited for any particular institution vary tremendously. Some institutions may be looking to digitize or modernize processes from within, others may be looking to add on a single solution such as mobile payments. Fintech solutions could also involve data aggregation or lead generation activities as well as arrangements to buy assets, such as small business loans, from leading online lenders. Once the fintech solution is identified, each institution needs to identify the best strategy for itself to either compete or collaborate with emerging players—and capitalize on trends and capabilities to position itself with the most competitive advantage going forward. There are generally two broad strategies that financial institutions can pursue: invest in or build emerging technologies on your own, or buy, partner or network with fintech companies. ... Consider building in-house if there are sufficient internal resources, expertise and scale to innovate and customize unique capabilities. These strategies may be more appropriate for regional or larger banks than smaller community banks.



Quote for the day:


"There are some among the so-called elite who are overbearing and arrogant. I want to foster leaders, not elitists." - Daisaku Ikeda


Daily Tech Digest - June 28, 2019

MIT: We're building on Julia programming language to open up AI coding to novices

The system allows coders to create a program that, for example, can infer 3-D body poses and therefore simplify computer vision tasks for use in self-driving cars, gesture-based computing, and augmented reality. It combines graphics rendering, deep-learning, and types of probability simulations in a way that improves a probabilistic programming system that MIT developed in 2015 after being granted funds from a 2013 Defense Advanced Research Projects Agency (DARPA) AI program. The idea behind the DARPA program was to lower the barrier to building machine-learning algorithms for things like autonomous systems. "One motivation of this work is to make automated AI more accessible to people with less expertise in computer science or math," says lead author of the paper, Marco Cusumano-Towner, a PhD student in the Department of Electrical Engineering and Computer Science.  "We also want to increase productivity, which means making it easier for experts to rapidly iterate and prototype their AI systems."


Distributed Agile teams can help global enterprises reach their deployment and cost-reduction goals. Distributed teams reduce the overhead costs for an organization, and let it build a bigger pool of talented people than if the organization eschewed remote job candidates. In essence, location independence makes an organization much more agile and productive. However, these global teams must address some inherent collaboration challenges, such as differences in time zones, cultures and language barriers. For a distributed Agile team to succeed, each worker must make some extra effort. Project managers should strive to: arm the team with the right tools for communication and collaboration; understand personnel strengths and weaknesses; encourage transparency; hold regular meetings; set clear expectations for stakeholders and team members; adhere to engineering best practices and standards; focus on achievable milestones; and build awareness of different cultures. Let's look at some of the best practices that can help distributed Agile teams address these specific challenges.


Certain Insulin Pumps Recalled Due to Cybersecurity Issues

In a statement, the FDA says it is warning patients and healthcare providers that certain Medtronic MiniMed insulin pumps have potential cybersecurity risks. "Patients with diabetes using these models should switch their insulin pump to models that are better equipped to protect against these potential risks," the FDA says. The potential risks are related to the wireless communication between Medtronic's MiniMed insulin pumps and other devices such as blood glucose meters, continuous glucose monitoring systems, the remote controller and CareLink USB device used with these pumps, the FDA warns. "The FDA is concerned that, due to cybersecurity vulnerabilities identified in the device, someone other than a patient, caregiver or healthcare provider could potentially connect wirelessly to a nearby MiniMed insulin pump and change the pump's settings. This could allow a person to over deliver insulin to a patient, leading to low blood sugar (hypoglycemia), or to stop insulin delivery, leading to high blood sugar and diabetic ketoacidosis (a buildup of acids in the blood)," the agency's statement says.


Determining data value by measuring return on effort

I'm always very encouraged when professionals from different architectural disciplines can converge on common ground. This can be a rare event, so when it does happen, I like to call it out. Such an event has happened recently with a contact coming from the business architecture discipline, namely Robert DuWors, with us both trying to put some metrics around the measurement of data value in our respective areas of expertise. I believe that business architecture and information architecture are the two core pillars of the architecture of an enterprise. But practitioners of these interconnected disciplines can frequently rub badly against each other, each side devaluing the other's methods and approaches. So, to reach agreement across the two on what constitutes enterprise value of our efforts is a happy place to be. ...and together, we agreed that this represents the value of a specific item of data to the enterprise from both information and business perspectives. Now, of course, this may be refined over time, but it already contains most of the aspects that together, Robert and I believe are key to this metric... So, what does this equation give us? It's in 3 major sections, which I will call ‘Horizons.’


SoftBank plans drone-delivered IoT and internet by 2023

Why the stratosphere? Well, one reason is that lower altitudes often have strong winds to deal with, including straight after storms. The companies say disaster communications could be a primary use case for the drones, and the stratosphere has a steady current. Also, because of the altitude, LTE and 5G coverage could be much more widespread than any alternative delivery mechanism implemented at a lower altitude. One High Altitude Platform Station (HAPS), as the HAWK30’s genre of base stations are called, could provide service over an area about 125 miles in diameter, and about 40 HAPS could cover the entire Japanese archipelago, whereas a set of older, tethered balloons (SoftBank developed one in 2011) might cover just six miles, SoftBank says. Others are aiming for the stratosphere, too. Newer balloons, such as Alphabet’s Loon, using tennis court-sized balloons also fly there. Softbank is a major provider of telecommunications in Japan, a country on the Pacific rim and prone to earthquakes. It is thus keen to find backup alternatives to wired, or even radio-based, ground assets that can get destroyed in natural disasters.


Five Facts on Fintech


From artificial intelligence to mobile applications, technology helps to increase your access to secure and efficient financial products and services. Since fintech offers the chance to boost economic growth and expand financial inclusion in all countries, the IMF and World Bank surveyed central banks, finance ministries, and other relevant agencies in 189 countries on a range of topics and received 96 responses. A new paper details the results of the survey alongside findings from other regional studies, and also identifies areas for international cooperation—including roles for the IMF and World Bank—and in which further work is needed by governments, international organizations, and standard-setting bodies. ... Awareness of cyber risks is high across countries and most jurisdictions have frameworks in place to protect financial systems. Most jurisdictions—79% of those with higher incomes according to the survey results—identified cyber risks in fintech as a problem for the financial sector.


The Windows 10 security guide: How to safeguard your business


Using the Windows Update for Business features built into Windows 10 Pro, Enterprise, and Education editions, you can defer installation of quality updates by up to 30 days. You can also delay feature updates by as much as two years, depending on the edition. Deferring quality updates by seven to 15 days is a low-risk way to avoid a flawed update that can cause stability or compatibility problems. You can adjust Windows Update for Business settings on individual PCs using the controls in Settings > Update & Security > Advanced Options. In larger organizations, administrators can apply Windows Update for Business settings using Group Policy or mobile device management (MDM) software. You can also administer updates centrally by using a management tool such as System Center Configuration Manager or Windows Server Update Services. Finally, your software update strategy shouldn't stop at Windows itself. Make sure that updates for Windows applications, including Microsoft Office and Adobe applications, are installed automatically.


Use event processing to achieve microservices state management


Unfortunately, it's often confusing whether a process close to the event source actually maintains the state itself, or whether that state is somehow provided from outside the service. For this reason, it's essential to create unique transaction IDs for all state-dependent operations. You can use that ID to retrieve specific state information from a database or to drive transaction-specific orchestration. This ID also helps carry data from one phase of a transaction, or flow, to another. This setup is essentially a back-end approach to microservices state management. Front-end state management is another option. Control from the front end means that a user or edge process maintains the state data. This state information passes along through an event flow, and each successive process accumulates more information from the previous ones. Since this state information queues along with the events, you won't lose state data during a failure as long as the event stays intact. Also, when processes get scaled with more instances of the microservices involved, the event flow can provide the state data those processes need.
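The back-end pattern described above can be sketched in a few lines: every state-dependent operation carries a unique transaction ID, and each processing stage accumulates its results onto the event rather than holding them locally. All names and the event shape below are illustrative, not a specific framework's API.

```python
# Sketch of event-carried state: the transaction ID ties all stages of
# one flow together, and state rides along with the event, so any
# scaled-out instance of a stage needs nothing beyond the event itself.
import uuid

def new_event(payload):
    # Unique transaction ID for all state-dependent operations in this flow.
    return {"txn_id": str(uuid.uuid4()), "payload": payload, "state": {}}

def validate(event):
    event["state"]["validated"] = True
    return event

def price(event):
    # State accumulated by earlier stages is read from the event, not
    # from any local store; losing an instance loses no state.
    assert event["state"].get("validated"), "must validate first"
    event["state"]["total"] = event["payload"]["qty"] * 5  # toy unit price
    return event

event = price(validate(new_event({"qty": 3})))
print(event["txn_id"], event["state"]["total"])
```

Because the state queues along with the event, a failed instance can be replaced and the event replayed without losing where the transaction stood.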


How to build disruptive strategic flywheels


Capabilities-driven strategy suggests that companies that have a clear way to play (WTP) that aligns with market demands, and that invest in a system of four to six differentiating capabilities that enable the company to excel at the WTP, are better positioned for success. But increasing clock speed changes the calculation. Today, the half-life of a competitive advantage may be fleeting. As industries are disrupted, players that have been successful within the context of one business cycle might need to rethink their differentiating capabilities, their investment portfolios, and possibly even their WTP more frequently and dynamically. Ford no longer just makes cars; it focuses instead on mobility solutions. Big oil companies are investing in renewable energy as a hedge against constraints on emissions. Amazon is competing with…everyone. As a result, it behooves organizations and managers to continually assess competitive moves, regulatory and technology evolution, and consumer preferences — and to adapt decisions in a dynamic fashion.


Smart Lock Turns Out to be Not So Smart, or Secure


Researchers are warning a keyless smart door lock made by U-tec, called Ultraloq, could allow attackers to track down where the device is being used and easily pick the lock – either virtually or physically. Ultraloq is a Bluetooth fingerprint and touchscreen door lock sold for about $200. It allows a user to use either fingerprints or a PIN for local access to a building. Ultraloq also has an app that can be used locally or remotely for access. When Pen Test Partners, with help from researchers identified as @evstykas and @cybergibbons, took a closer look, they found Ultraloq was riddled with vulnerabilities. For starters, researchers found that the application programming interface (API) used by the mobile app leaked enough personal data from the user account to determine the physical address where the Ultraloq device was being used. ... API has no authentication at all,” researchers wrote. “The data is obfuscated by being base64 twice but decoding it exposes that the server side has no authentication or authorization logic. This leads to an attacker being able to get data and impersonate all the users,” researchers wrote.
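Encoding is not authentication, which is the heart of the finding above. The round trip below shows why double base64 encoding offers no protection: anyone can reverse it with two decode calls and no credentials (the payload here is made up for illustration, not actual Ultraloq data).

```python
# Demonstration that base64-encoding data twice is obfuscation, not
# security: the transformation is trivially reversible by anyone.
import base64

secret = b'{"user": "alice", "addr": "10 Example St"}'  # fictional payload

# What an API might transmit: the payload wrapped in base64 twice.
obfuscated = base64.b64encode(base64.b64encode(secret))

# An eavesdropper recovers the plaintext with two decode calls.
recovered = base64.b64decode(base64.b64decode(obfuscated))
print(recovered == secret)  # True
```

Real protection requires server-side authentication and authorization checks on every request; no amount of client-side encoding substitutes for them.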



Quote for the day:

"Humility is a great quality of leadership which derives respect and not just fear or hatred." -- Yousef Munayyer

Daily Tech Digest - June 27, 2019

Tracking down library injections on Linux

The linux-vdso.so.1 file (which may have a different name on some systems) is one that the kernel automatically maps into the address space of every process. Its job is to find and locate other shared libraries that the process requires. One way that this library-loading mechanism is exploited is through the use of an environment variable called LD_PRELOAD. As Jaime Blasco explains in his research, "LD_PRELOAD is the easiest and most popular way to load a shared library in a process at startup. This environmental variable can be configured with a path to the shared library to be loaded before any other shared object." ... Note that the LD_PRELOAD environment variable is at times used legitimately. Various security monitoring tools, for example, could use it, as might developers while they are troubleshooting, debugging or doing performance analysis. However, its use is still quite uncommon and should be viewed with some suspicion. It's also worth noting that osquery can be used interactively or be run as a daemon (osqueryd) for scheduled queries. See the reference at the bottom of this post for more on this.
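One simple way to hunt for suspicious LD_PRELOAD usage is to inspect each process's environment under /proc. The sketch below is a minimal, Linux-only illustration of that idea, not a substitute for a monitoring tool like osquery; the function name is my own.

```python
# Walk /proc and flag any process whose environment sets LD_PRELOAD.
# Run as root to see other users' processes.
import os

def processes_with_ld_preload(proc="/proc"):
    """Return (pid, entry) pairs for processes whose environment sets LD_PRELOAD."""
    hits = []
    for pid in filter(str.isdigit, os.listdir(proc)):
        try:
            # /proc/<pid>/environ holds the environment as NUL-separated
            # KEY=VALUE entries.
            with open(os.path.join(proc, pid, "environ"), "rb") as f:
                entries = f.read().split(b"\0")
        except OSError:
            continue  # process exited, or permission denied
        for entry in entries:
            if entry.startswith(b"LD_PRELOAD="):
                hits.append((pid, entry.decode(errors="replace")))
    return hits

if os.path.isdir("/proc"):  # Linux only
    for pid, entry in processes_with_ld_preload():
        print(pid, entry)
```

Any hit deserves a closer look: as noted above, legitimate uses exist (debugging, performance tooling), but LD_PRELOAD in a long-running production process is unusual.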



Responsible Data Management: Balancing Utility With Risks

To mitigate risks relating to data sharing, good protocols for information exchange need to be in place. Currently these exist bilaterally between certain organisations, but these should extend to apply multilaterally, to an entire sector or to an entire response to maximise impact. Another way to improve inter-agency data sharing is to use contemporary cryptographic solutions, which allows for data usage without giving up data governance. In other words, one organisation can run analyses on another organisation’s data and get aggregate outputs, without ever accessing the data directly.  There are a number of other data-management practices that can reduce the risks of the data falling into the wrong hands, such as ensuring that all computers in the field are password protected, and have firewalls and up-to-date antivirus software, operating systems and browsers. Additionally, the data files themselves should be encrypted. There are open-source programs that solve all of these tasks, so addressing them may be a matter of competence inside organisations rather than funding.
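The "aggregate outputs without direct data access" idea above can be illustrated with a toy interface: the data holder exposes only aggregate queries and never returns raw rows. Real deployments rely on cryptographic techniques (such as secure multi-party computation) rather than a trusted class boundary; this sketch, with invented names, only shows the interface shape.

```python
# Toy illustration of aggregate-only access: callers can compute
# statistics over the dataset but can never retrieve individual rows.
# This is an interface sketch, not a cryptographic guarantee.

class AggregateOnlyDataset:
    def __init__(self, rows):
        self._rows = rows  # held privately; never returned to callers

    def count(self):
        return len(self._rows)

    def mean(self, field):
        return sum(r[field] for r in self._rows) / len(self._rows)

# One organisation queries another's data and sees only aggregates.
beneficiaries = AggregateOnlyDataset([{"age": 20}, {"age": 30}, {"age": 40}])
print(beneficiaries.count(), beneficiaries.mean("age"))
```

In a genuine deployment the enforcement would come from cryptography or a hardened query service, so the data never leaves the owning organisation's governance.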


Insurer: Breach Undetected for Nine Years

But despite the common challenges in detecting data breaches, the nine-year lag time at Dominion National is unusually high, some experts note. "Dominion National's notification of a breach nine years after the unauthorized access may be an unenviable record for detection," says Hewitt of CynergisTek. "This is unusual because it strongly suggests that they may not have been performing comprehensive security audits or performing system activity reviews." Tom Walsh, president of the consultancy tw-Security, notes: "I am surprised that they detected it dating that far back. Most organizations do not retain audit logs or event logs for that long. "Most disturbing is that an intruder or a malicious program or code could be into the systems and not previously detected. Nine years is beyond the normal refresh lifecycle for most servers. I would have thought that it could have been detected during an upgrade or a refresh of the hardware." Walsh adds that it is still unclear whether the incident is reportable under the HIPAA Breach Notification Rule. "They were careful in stating that there is no evidence to indicate that data was even accessed," he notes.


Going Beyond GDPR to Protect Customer Data

GDPR was something of a superstar in 2018. Searches on the regulation hit Beyoncé and Kardashian territory periodically throughout the year. In the United States, individual states began either exploring their own version of the GDPR or, in the case of California, enacting their own regulations. Other states that either enacted or strengthened existing data governance laws similar to the GDPR include Alabama, Arizona, Colorado, Iowa, Louisiana, Nebraska, Oregon, South Carolina, South Dakota, Vermont and Virginia. At this point, there is also a growing number of companies operating outside the EU that are ceasing operations with the EEA rather than taking on expensive changes to their business applications and practices and becoming subject to possible fines assessments. GDPR prosecutions continue, as do the filing of complaints and investigations. Each member country has its own listing of court cases in progress, so it’s a bit difficult to quantify just how many investigations and cases are active. 


Juniper’s Mist adds WiFi 6, AI-based cloud services to enterprise edge

“Mist's AI-driven Wi-Fi provides guest access, network management, policy applications and a virtual network assistant as well as analytics, IoT segmentation, and behavioral analysis at scale,” Gartner stated. “Mist offers a new and unique approach to high-accuracy location services through a cloud-based machine-learning engine that uses Wi-Fi and Bluetooth Low Energy (BLE)-based signals from its multielement directional-antenna access points. The same platform can be used for Real Time Location System (RTLS) usage scenarios, static or zonal applications, and engagement use cases like wayfinding and proximity notifications.” Juniper bought Mist in March for $405 million for this AI-based WIFI technology. For Juniper the Mist buy was significant as it had depended on agreements with partners such as Aerohive and Aruba to deliver wireless, according to Gartner.  Mist, too, has partners and recently announced joint product development with VMware that integrates Mist WLAN technology and VMware’s VeloCloud-based NSX SD-WAN. “Mist has focused on large enterprises and has won some very well known brands,” said Chris Depuy.


DevSecOps Keys to Success

The most important elements of a successful DevSecOps implementation are automation and collaboration. 1) With DevSecOps, the goal is to embed security early on into every phase of the development/deployment lifecycle. By designing a strategy with automation in mind, security is no longer an afterthought; instead, it becomes part of the process from the beginning. This ensures security is ingrained at the speed and agility of DevOps without slowing business outcomes. 2) Similar to DevOps where there is close alignment between developers and technology operations engineers, collaboration is crucial in DevSecOps. Rather than considering security to be “someone else’s job,” developers, technology operations and security teams all work together on a common goal. By collaborating around shared goals, DevSecOps teams make informed decisions in a workflow where there is the biggest context around how changes will impact production and the least business impact to take corrective action.


Microsoft beefs up OneDrive security

The new feature - dubbed OneDrive Personal Vault - was trumpeted as a specially protected partition of OneDrive where users could lock their "most sensitive and important files." They would access that area only after a second step of identity verification, ranging from a fingerprint or face scan to a self-chosen PIN, a one-time code texted to the user's smartphone or the use of the Microsoft Authenticator mobile app. The idea behind OneDrive Personal Vault, said Seth Patton, general manager for Microsoft 365, is to create a failsafe so that "in the event that someone gains access to your account or your device," the files within the vault would remain sacrosanct. Access to the vault will also be on a timer, Patton said, which locks the partition after a user-set period of inactivity. Files opened from the vault will also close when the timer expires. As the feature's name implies, the vault is only for OneDrive Personal, the consumer-grade storage service, not for the OneDrive for Business available to commercial customers. Although OneDrive Personal is a free service - albeit with a puny 5GB of storage - many come to it through an Office 365 subscription.
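The two mechanisms described above - a second verification step to unlock, and an inactivity timer that relocks the partition - can be modeled in a few lines. The sketch below is a toy Python model under stated assumptions; the class name, API, and default timeout are illustrative and are not Microsoft's implementation.

```python
import time

class PersonalVault:
    """Toy model of a vault that relocks after a period of inactivity.

    Assumption: the caller has already performed the second identity
    check (PIN, biometric, one-time code) and passes us the result.
    """

    def __init__(self, timeout_seconds=20 * 60):
        self.timeout = timeout_seconds      # user-set inactivity period
        self._unlocked_at = None            # None means locked

    def unlock(self, second_factor_ok):
        # Access requires the second verification step to have succeeded.
        if second_factor_ok:
            self._unlocked_at = time.monotonic()
        return second_factor_ok

    def touch(self):
        # Any user activity resets the inactivity timer.
        if self.is_unlocked():
            self._unlocked_at = time.monotonic()

    def is_unlocked(self):
        if self._unlocked_at is None:
            return False
        if time.monotonic() - self._unlocked_at > self.timeout:
            self._unlocked_at = None        # timer expired: relock
            return False
        return True
```

Open files closing when the timer expires falls out of the same check: anything reading from the vault asks `is_unlocked()` first, so an expired timer denies further access.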


Disaster planning: How to expect the unexpected


Larger organisations will typically have the advantages of a greater resource reserve and multiple premises in different regions, but they can be slow to react and their communications can struggle. Smaller organisations can be much more adaptable and swifter to react, but rarely have resources in reserve and are usually based in one fixed location. As with all things, preparation is key, so it is worth taking the time to develop business continuity strategies and disaster plans. Rather than being scenario-specific – having dedicated plans for different eventualities – organisations should take an agnostic approach with their business continuity strategies. “If your business recovery plan is written strictly for recovery, regardless of scenario, then you will be in the best shape, as you will know that whatever happens, the plan has been tested,” says Dan Johnson, director of global business continuity and disaster recovery at Ensono. “You will go to your backup procedures that keep your daily processes moving and make sure business is flowing.”


An IoT maturity model and tips for IoT deployment success


Nemertes compiled these results into an IoT maturity model that companies can use as their roadmap to success (see figure). The maturity model comprises four levels: unprepared -- the organization lacks the tools and processes to address the IoT initiative; reactive -- the organization has platforms and processes to respond to, but not proactively address, the issue; proactive -- the organization has the tools and processes to proactively deliver on the issue as it is currently understood; and anticipatory -- the organization has the tools, processes and people to handle future requirements. The roughly one-third of organizations in the survey with successful IoT deployments were likely to be at Level 2 or Level 3 IoT maturity, and in a couple of areas -- namely executive sponsorship, budgeting and architecture -- successful organizations far outshone those that were less successful. ... In addition, key IoT team members include the IoT architect, IoT security architect, IoT network architect and IoT integration architect. Most large organizations have architects whose roles encompass these responsibilities, though their job titles may not reflect it.
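The four levels above form a simple ordered ladder, which can be sketched as a classifier. This is a minimal illustration, assuming the boolean capability flags below are a reasonable proxy for Nemertes' criteria; the function and flag names are hypothetical.

```python
# Sketch of a four-level IoT maturity ladder, per the model described above.
# The capability flags are illustrative assumptions, not Nemertes' rubric.

LEVELS = ["unprepared", "reactive", "proactive", "anticipatory"]

def maturity_level(has_tools, has_processes, proactive, future_ready):
    """Return an index into LEVELS for an organization's IoT maturity."""
    if not (has_tools and has_processes):
        return 0  # unprepared: lacks tools and processes for the initiative
    if not proactive:
        return 1  # reactive: can respond to, but not get ahead of, the issue
    if not future_ready:
        return 2  # proactive: delivers on the issue as understood today
    return 3      # anticipatory: tools, processes and people for future needs
```

Treating the model as an ordered scale like this makes it easy to benchmark teams and track movement up the ladder over time.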


AI presents host of ethical challenges for healthcare

With respect to ethics, she observed that the massive volumes of health data being leveraged by AI must be carefully protected to preserve privacy. “The sheer volume, variability and sensitive nature of the personal data being collected require newer, extensive, secure and sustainable computational infrastructure and algorithms,” according to Tourassi’s testimony. She also told lawmakers that data ownership and use when it comes to AI continues to be a sensitive issue that must be addressed. “The line between research use and commercial use is blurry,” said Tourassi. To maintain a strong ethical AI framework, Tourassi believes fundamental questions need to be answered, such as: Who owns the intellectual property of data-driven AI algorithms in healthcare? The patient? The medical center that collects the data while providing health services? Or the AI developer? “We need a federally coordinated conversation involving not only the STEM sciences but also social sciences, economics, law, public policy and patient advocacy stakeholders” to “address the emerging domain-specific complexities of AI use,” she added.



Quote for the day:


"Nobody in your organization will be able to sustain a level of motivation higher than you have as their leader." -- Danny Cox