
Daily Tech Digest - July 30, 2020

The Challenges of Building a Reliable Real-Time Event-Driven Ecosystem

Building a dependable event-driven architecture is by no means an easy feat. There is an entire array of engineering challenges you will have to face and decisions you will have to make. Among them, protocol fragmentation and choosing the right subscription model (client-initiated or server-initiated) for your specific use case are some of the most pressing things you need to consider. While traditional REST APIs all use HTTP as the transport and protocol layer, the situation is much more complex when it comes to event-driven APIs. You can choose between multiple different protocols. Options include the simple webhook, the newer WebSub, popular open protocols such as WebSockets, MQTT or SSE, or even streaming protocols, such as Kafka. This diversity can be a double-edged sword—on one hand, you aren’t restricted to only one protocol; on the other hand, you need to select the best one for your use case, which adds an additional layer of engineering complexity. Besides choosing a protocol, you also have to think about subscription models: server-initiated (push-based) or client-initiated (pull-based). Note that some protocols can be used with both models, while some protocols only support one of the two subscription approaches. Of course, this brings even more engineering complexity to the table.
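To make the two subscription models concrete, here is a minimal in-memory sketch. The `Broker` class, topic names, and payloads are invented for illustration; a real deployment would sit behind MQTT, Kafka, SSE, or a similar protocol rather than a Python object.

```python
import queue

class Broker:
    """Toy in-memory event broker illustrating both subscription models."""

    def __init__(self):
        self.subscribers = {}   # topic -> list of queues (server-initiated push)
        self.log = {}           # topic -> ordered event log (client-initiated pull)

    def subscribe(self, topic):
        """Push model: the broker delivers events into the subscriber's
        queue as they arrive, like a WebSocket or SSE stream."""
        q = queue.Queue()
        self.subscribers.setdefault(topic, []).append(q)
        return q

    def publish(self, topic, event):
        self.log.setdefault(topic, []).append(event)
        for q in self.subscribers.get(topic, []):
            q.put(event)  # push to every live subscriber

    def poll(self, topic, offset=0):
        """Pull model: the consumer asks for events past a given offset,
        the way a Kafka-style consumer tracks its own position."""
        return self.log.get(topic, [])[offset:]

broker = Broker()
inbox = broker.subscribe("orders")       # push subscriber
broker.publish("orders", {"id": 1})
broker.publish("orders", {"id": 2})

print(inbox.get())                       # pushed event: {'id': 1}
print(broker.poll("orders", offset=1))   # pulled events: [{'id': 2}]
```

Note how the same `publish` call feeds both models; the engineering complexity the article describes comes from choosing which side (broker or client) drives delivery.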


Successful Digital Transformation Requires a Dual-track Approach

The first part of the dual-track approach focuses on identifying and implementing new digital technology throughout an organization, while also working to change the cultures and business workflows affected by the transformation, according to the report. While this step is critical, it is also complex and time-consuming. The benefits may take time to come to fruition, which is why many executives are dissatisfied with current transformation results. Not only are executives impatient, but they also lack the second part of the dual track to tide them over, the report found. The second portion is a parallel track that homes in on areas overlooked in large-scale transformation tactics. These include the organization's ability to quickly connect and modernize hundreds of crucial processes that cross both business workflows and work groups, according to the report. This goal can be achieved through rapid-cycle innovation, which encourages business professionals outside of IT to propose and create new apps that update existing workflow processes, with the goal of achieving quick wins for the company and supporting long-term transformation, the report found.


How deploying new-age technologies has changed the role of leadership amid COVID-19

Circumstances created by a pandemic such as COVID-19 have been hugely disruptive and could even leave organizations paralyzed if they are far removed from any understanding that technology is an imperative, not an optional add-on. This is why it is critical to take a proactive, rather than reactive, approach to technology. Proactive investment in technology is helping organizations reap maximum benefits, as this approach allows leaders to prepare their people to embrace and become comfortable using technology, so that it becomes spontaneously embedded in an organization at a fundamental level. The investments we proactively made many years ago, whether in secure virtual platforms or AI-driven due diligence processes that help automate how we finalize our contracts, have helped us adapt seamlessly, with minimal disruption. The biggest asset has been the spontaneous comfort of our people in adapting to working from home, thanks to their high degree of familiarity with the technology platforms and processes we have used at work over the years, ensuring our ability to optimize productivity.


Anatomy of a Breach: Criminal Data Brokers Hit Dave

At the moment, however, some evidence points to ShinyHunters having phished Dave employees. The group has previously advertised - and has been suspected of being behind - the sale of millions of stolen records obtained from Indonesian e-commerce firm Tokopedia, Indian online learning platform Unacademy, Chicago-based meal delivery outfit HomeChef, online printing and photo store ChatBooks, university news site Chronicle.com, as well as Microsoft's private GitHub repositories, according to Baltimore-based security firm ZeroFox. How does ShinyHunters steal so much data? Cyble says that in a post to a hacking forum, a user called "Sheep" says of the Dave breach: "This database was dumped through sending GitHub phishing emails to Dave.com employees. The employees were found by searching for developers in the organization on LinkedIn/Crunchbase/Angel. All of the databases sold by ShinyHunters were obtained through this method. In some cases, [the] same method was used but for GitLab, Slack and Bitbucket."


IoT Security: How to Search for Vulnerable Connected Devices

Researchers offer many tools and methods for finding hacker-friendly IoT devices. The most effective methods have already been tested by botnet creators. In general, the use of certain vulnerabilities by botnets is the most reliable criterion for assessing the security level of IoT devices and the possibility of their mass exploitation. When searching for vulnerabilities, some attackers rely on the firmware (in particular, on errors discovered during firmware analysis using reverse-engineering methods). Other attackers begin by searching for the manufacturer's name. In any case, a successful search requires some distinctive feature of a vulnerable device, and it helps to find several such features. ... There are many vulnerabilities in IoT devices, but not all of them are easy to exploit. Some require a physical connection, or being nearby or on the same local network. Exploiting others is complicated by quick security patches. On the other hand, manufacturers are often in no hurry to patch firmware, and often admit it. Getting an accurate list of vulnerable IoT devices requires significant effort; it is not just a one-time query.
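As a hedged sketch of what searching for such "distinctive features" looks like in practice, the snippet below matches service banners against a couple of invented fingerprints. The regexes and advisory labels here are illustrative only; real signatures would come from firmware analysis or vendor advisories.

```python
import re

# Hypothetical fingerprints: (regex over a service banner, advisory label).
# These patterns are invented for illustration, not real advisories.
FINGERPRINTS = [
    (re.compile(r"GoAhead-Webs", re.I), "legacy GoAhead embedded web server"),
    (re.compile(r"Dropbear sshd 0\.(4[0-9]|5[0-2])", re.I), "outdated Dropbear SSH"),
]

def classify(banner):
    """Return the advisory labels whose fingerprint matches a banner."""
    return [label for pattern, label in FINGERPRINTS if pattern.search(banner)]

banners = [
    "Server: GoAhead-Webs",
    "SSH-2.0-Dropbear sshd 0.52",
    "Apache/2.4.41 (Ubuntu)",
]
for b in banners:
    print(b, "->", classify(b))
```

The last banner matches nothing, which is the point the article closes on: a single query never yields an accurate list, because each vulnerable device family needs its own distinguishing feature.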


Security: This nasty surprise could be waiting for retailers when they open up again

"A lot of retailers, when they come back online, they're going to be focused on business processes and getting employees back to work. They're not necessarily thinking, 'maybe I need to update Windows on my computer terminal', or update POS terminal firmware." In retail, where surges in online transactions during the pandemic have forced retailers to quickly transform their ecommerce capabilities, hackers have shifted their focus to make the most of this opportunity. This includes changing up well-known types of attacks by using them in different ways, such as exploiting credit cards within a different type of merchant platform, and targeting parts of retailers' systems that might otherwise slip through the cracks. We've already seen new forms of attacks on retailers during the pandemic. In late June, researchers at security software firm Malwarebytes identified a new web-skimming attack, whereby cybercriminals concealed malware on ecommerce sites to steal information typed into payment input fields, including customers' names, addresses and card details.


Finland government funds work on potential quantum leap

The Finnish government has allocated €20.7m to the venture, which will be run as an innovation partnership open to international bidding. Closer to home, VTT-TRCF plans to cooperate with Finnish companies across the IT and industrial sphere during the various phases of the project's implementation and application. The rapid advances in quantum technology and computing have the potential to give societies the tools to overcome major future problems and challenges, such as the Covid-19 pandemic, that remain out of the reach of contemporary supercomputers. Quantum technologies have the potential to complete complex calculations, which currently take days, orders of magnitude faster. If they prove practical for calculations that traditional computers are fundamentally unable to perform, they would mark a leap forward in computing capability far greater than that from the abacus to the modern computer. Antti Vasara, the CEO of VTT-TRCF, said: “The quantum computers of the future will be able to accurately model viruses and pharmaceuticals, or design new materials in a way that is impossible with traditional methods.”


What the CCPA means for content security

Simply installing an ECM system will not yield a secure content ecosystem. If there is one thing that all ECM experts agree on, it's that installing an ECM system by itself will accomplish nothing aside from consuming resources. People need to use the system to manage content -- and want to use it -- even after setting up the security controls necessary to meet the requirements of the CCPA. Deploying an ECM system that is so secure that people do not want to use it is a waste of resources. The ECM system does not need to be complicated. Setting up a secure desktop sync of content is an important first step toward ease of use and adoption. Instead of just rolling the software out, companies need to work with each group using it first. The business must help users organize their content and set up a basic structure for storing it so that the system doesn't become disorganized. Depending on the system a business is using, setting up a basic structure may include a basic taxonomy, content types, standard metadata or a combination of these. If a business implements its ECM system correctly, its largest challenge will be securing mobile devices and laptops.


How blockchain could play a relevant role in putting Covid-19 behind us

Covid-19 has revealed the weaknesses of global supply chains, with countless reports of PPE issues, a lack of food in impoverished areas, and a breakdown of business-as-normal, even in places where demand has remained constant. Trust has always been the keystone of trade. But how can you trust supply chain partners to deliver in times of widespread failure? Owing to their decentralised nature, blockchain-based applications create a transparent ecosystem in which you can trust — and see — that the mechanisms in place are fair to all. They can provide instant overviews of entire supply chains to highlight issues as soon as they arise. What's more, it is possible to implement live failsafes with smart contracts that can ensure the smooth continuation of the supply chain and remove the very need for trust in the first place. To this end, the World Economic Forum developed the Blockchain Deployment Toolkit, a set of high-level guidelines to help companies implement best practices across blockchain projects – especially those helping solve supply chain issues. The Forum worked with more than 100 organisations for more than a year, delving into 40 different blockchain use cases, including traceability and automation, to help guide organisations in their efforts to solve real-world problems with blockchain.
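The tamper-evidence that underpins this transparency can be sketched in a few lines. The following is an illustrative hash-chained log only, not a distributed blockchain (no consensus, no peers, no smart contracts), and the shipment payloads are invented for the example.

```python
import hashlib
import json

def add_block(chain, payload):
    """Append a block whose hash covers both the payload and the
    previous block's hash, so any later edit to history is detectable."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    chain.append({"payload": payload, "prev": prev_hash,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})
    return chain

def verify(chain):
    """Recompute every hash in order; a tampered block breaks the chain."""
    prev = "0" * 64
    for block in chain:
        body = json.dumps({"payload": block["payload"], "prev": prev},
                          sort_keys=True)
        if block["prev"] != prev or \
           block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = block["hash"]
    return True

chain = []
add_block(chain, {"shipment": "PPE-001", "status": "dispatched"})
add_block(chain, {"shipment": "PPE-001", "status": "received"})
print(verify(chain))                      # True
chain[0]["payload"]["status"] = "lost"    # quietly rewrite history
print(verify(chain))                      # False
```

Every supply chain partner can rerun `verify` for themselves, which is the sense in which the mechanism, rather than any single party, carries the trust.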


The growing trend of digitization in commercial banking

“Technology has absolutely been at the forefront of all the changes we have seen and will see in upcoming years,” explained Rao. Even so, the business of banking has not changed on a fundamental level. Rather, products have become more commoditized; similar business products are being offered, but customers are using them in different ways. In Rao’s words, “the ‘what’ component has not changed, but the ‘how’ has.” This is where digitization has had the biggest impact. For example, commercial banking capabilities like making a payment or collecting a receivable have long been available for corporate entities. But today, the same capability can be offered in a way that emphasizes a great user experience—something that hasn’t always been a focal area in the commercial banking space. ... Large traditional banks are frequently riddled with outdated legacy systems on the back end of operations, which dilutes their offerings even with modern digital technology at the front end. These legacy systems make it costly to create the ideal customer experience, leading many banks to focus on implementing strategies that pave the path towards modernization. In certain cases, this means opening up and modernizing selective pieces of back-end systems to improve operations overall.



Quote for the day:

"Leadership has a harder job to do than just choose sides. It must bring sides together." --

Daily Tech Digest - March 10, 2020

How can companies thrive under CCPA regulation?

One major challenge that Manley says companies deal with when it comes to data management under CCPA is handling consumer data that’s located across a number of devices and software infrastructures. He explained: “I think the biggest challenge we see a lot of people having is that they often don’t understand how many places are holding customer data. “They were thinking very much about the data that they had on their premises, maybe things that are in file servers, databases, corporate laptops, that sort of thing, and it takes a while to then realise that they’ve got a number of SaaS applications, whether it’s Salesforce, Slack or Office 365.” ... According to Manley, the companies that manage to succeed while staying within CCPA boundaries use the regulation as an opportunity to reflect on their operations. “Regulations like CCPA are a good baseline for what your company should be doing anyway,” he said. “For a lot of the better organisations, we see them saying that the goal isn’t just to hit the baseline, but to use this as a starting point for a discussion about what we want to be as a business.”



Impactful, but Overhyped AI

Many companies struggle with how to successfully integrate AI into their businesses. Lux Research released a report called “Artificial Intelligence: A Framework to Identify Challenges and Guide Successful Outcomes” that analyzes the market, outlines several challenges companies face in integrating AI, and homes in on several factors businesses should consider before investing in AI. The four factors the research firm suggests to help businesses make wise AI investments and decisions include: clearly understanding the outcomes implementing AI will provide for their businesses; focusing on an AI product’s capabilities instead of flashy marketing; knowing when the technology is mature enough to mitigate risk; and identifying practical challenges to both implementation and maintenance of the technology once it is in place. There’s no doubt that AI technologies can be impactful in helping companies achieve digital transformation, but there is also a lot of hype that is not necessarily helping the space and the players within it.


Multiple nation-state groups are hacking Microsoft Exchange servers

These state-sponsored hacking groups are exploiting a vulnerability in Microsoft Exchange email servers that Microsoft patched last month, in the February 2020 Patch Tuesday. The vulnerability is tracked under the identifier CVE-2020-0688. ... This Exchange vulnerability is not, however, straightforward to exploit. Security experts don't see this bug being abused by script kiddies (a term used to describe low-level, unskilled hackers). To exploit the CVE-2020-0688 Exchange bug, hackers need the credentials for an email account on the Exchange server -- something that script kiddies don't usually have. The CVE-2020-0688 security flaw is a so-called post-authentication bug: hackers first need to log in, and then run the malicious payload that hijacks the victim's email server. But while this limitation will keep script kiddies away, it will not stop APTs and ransomware gangs, experts said. APTs and ransomware gangs often spend most of their time launching phishing campaigns, through which they obtain email credentials for a company's employees.


3 cloud architecture problems that need solutions


Many enterprises push as much as they can to the edge, but moving to the edge means moving away from a centralized system (the public cloud) to many decentralized systems (the edge devices or servers). You need to understand that you must maintain these edge systems, and they are much more difficult to monitor, govern, secure, update, and configure. Multiply that effort by hundreds of edge computing devices and you've got an operational nightmare. Second, what should you containerize? Many enterprises say containers are their strategy, not just an enabling technology. This almost religious belief in the power of containers has pushed many an application to the cloud in containers, but that's really not how they should be moving there. The issue is that there are no hard and fast rules as to what can—and should—exist in a container. Legacy applications that would take a great deal of effort to refactor (rewrite) for containers are not likely candidates; however, in many instances, the cloud migration team attempts to move them first. This means that enterprises will fail to find value in containers for some of the applications they move to the cloud.


How to break down data silos: 4 obstacles and solutions

With the growth of shadow IT, vendor software and databases can come through virtually any departmental door. Systems from different vendors that departments buy independently don't necessarily interact well with each other. When this occurs, systemic data silos can arise because of cross-system and data integration failures. The best way to address this issue is to require interoperability and a full set of application programming interfaces (APIs) in the requests for proposal (RFPs) that IT and individual business departments issue to vendors. One way to ensure that system and data interoperability is a front-page requirement on RFPs is for IT to create a standard RFP that is required by purchasing or whichever department authorizes tech purchases. This standardized form can then be used by IT and end-user departments. Most systems and databases sold by vendors have some type of API for data integration; however, totally seamless integration and the ability to easily aggregate data from disparate systems can never be assumed.
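Why seamless aggregation can never be assumed is easy to see in code. In this hedged sketch (the vendor names, field names, and payloads are all invented), two vendors' APIs return the same concept in different shapes, so a per-vendor adapter is needed before the data can be combined:

```python
# Hypothetical payloads from two vendors' APIs describing the same entity;
# every field name here is invented for illustration
vendor_a = {"cust_name": "Acme", "rev": 1200}
vendor_b = {"customer": {"name": "Globex"}, "revenue_usd": 3400}

def normalize_a(payload):
    """Adapter for vendor A's flat schema."""
    return {"name": payload["cust_name"], "revenue": payload["rev"]}

def normalize_b(payload):
    """Adapter for vendor B's nested schema."""
    return {"name": payload["customer"]["name"],
            "revenue": payload["revenue_usd"]}

# Aggregating across systems means one adapter per vendor -- exactly the
# glue work that "seamless integration" claims would make unnecessary
records = [normalize_a(vendor_a), normalize_b(vendor_b)]
print(records)
```

Each new vendor adds another adapter to maintain, which is why interoperability requirements belong in the RFP rather than being patched in after purchase.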


How CIOs can limit the business disruption of the coronavirus — Gartner

With various quarantine measures and travel restrictions undertaken by organisations, cities and countries, uncertainties and disruptions are beginning to have more of an impact on businesses and their workforces. This increases the chance of business operations either being suspended or run at limited capacity. In response, Gartner is promoting the use of AI to automate some tasks, particularly basic customer service protocols and candidate screenings. In its report, Gartner also recommends that in organisations where remote working capabilities have not yet been established, CIOs work out interim solutions, including using instant messaging for general communication, file sharing/meeting solutions, and access to enterprise applications such as enterprise resource planning (ERP) and customer relationship management (CRM). ... If it isn’t possible for organisations to meet their clients face to face, Gartner recommends using digital channels such as video calls and live streaming solutions that can serve various customer engagement and selling scenarios.


What Does A Typical Day Of A Data Scientist Look Like?

Like every other professional, a data scientist will have a day dotted with emails to answer and meetings to attend. But this is where the similarities end. Unlike most jobs, each day throws up new challenges and unique problems for a data scientist. These come in the form of varied projects, which in turn change with the industry they operate in. But despite the flux, what ties each workday together is data-related tasks. Depending on your profile, you will -- broadly speaking -- be pulling data, shaping it, merging it, or analysing it, all with the end goal of solving problems for businesses. This is accomplished using a wide variety of tools that look for patterns or trends within a given data set, and by trying to simplify data problems. ... As emphasised in the first point, data scientists are first and foremost problem-solvers, and that cannot be achieved in silos. A typical day would involve engaging with stakeholders at multiple levels to determine the questions that need pointed answers. Not just that, it is their job to come up with different approaches to solve these problems.
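The pull/merge/analyse loop described above can be sketched with nothing but the standard library; the transaction and customer records below are toy data invented for the example:

```python
import statistics

# "Pulled" data from two toy sources (invented for illustration)
transactions = [
    {"customer_id": 1, "amount": 120.0},
    {"customer_id": 1, "amount": 80.0},
    {"customer_id": 2, "amount": 200.0},
]
customers = {1: {"segment": "retail"}, 2: {"segment": "enterprise"}}

# Merge: enrich each transaction with the customer's segment
merged = [
    {**t, "segment": customers[t["customer_id"]]["segment"]}
    for t in transactions
]

# Analyse: average spend per segment
by_segment = {}
for row in merged:
    by_segment.setdefault(row["segment"], []).append(row["amount"])
avg = {seg: statistics.mean(vals) for seg, vals in by_segment.items()}
print(avg)   # {'retail': 100.0, 'enterprise': 200.0}
```

In practice the same shape of work is done at far larger scale with tools like SQL or dataframe libraries, but the pull-shape-merge-analyse rhythm is the same.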


Introducing Alpine.js: A Tiny JavaScript Framework

Ever built a website and reached for jQuery, Bootstrap, Vue.js or React to achieve some basic user interaction? Alpine.js is a fraction of the size of these frameworks because it involves no build steps and provides all of the tools you need to build a basic user interface. Like most developers, I have a bad tendency to over-complicate my workflow, especially if there’s some new hotness on the horizon. Why use CSS when you can use CSS-in-JS? Why use Grunt when you can use Gulp? Why use Gulp when you can use Webpack? Why use a traditional CMS when you can go headless? Every so often, though, the new hotness makes life simpler. Recently, the rise of utility-based tools like Tailwind CSS has done this for CSS, and now Alpine.js promises something similar for JavaScript. In this article, we’re going to take a closer look at Alpine.js and how it can replace jQuery or larger JavaScript libraries to build interactive websites. If you regularly build sites that require a sprinkling of JavaScript to alter the UI based on some user interaction, then this article is for you.


Job Trends For Data Scientists In The Next 5 Years

Data scientist
A trend that has emerged in recent times is that companies which earlier identified themselves as ‘non-tech’ are beginning to position themselves as tech companies, and this is likely to continue. A case in point is banks. For instance, an ‘analyst’ in this industry might now be called a ‘data scientist’, as long as they are seeking to monetise the company’s data assets. One of the main drivers of this trend is the copious amount of data available today, which has been increasing exponentially. What is more, fuelled by the rise of the Internet of Things (IoT) and social media, this growth is not expected to slow down anytime soon. The IoT market in India alone is reportedly likely to reach a whopping 2 billion connections by 2022. This is buttressed by the fact that not only are more devices coming online, but with greater improvements in hardware, the type of data delivered will be more diverse. The same goes for social media. According to Hootsuite, the number of social media users worldwide in 2019 rose to 3.484 billion — an increase of 9% year-on-year.


Huawei P40 Pro expected to have 7 cameras, 10x optical zoom, and 5G support


According to known Apple leaker Ming-Chi Kuo, a 10x optical zoom camera could be included as one of the sensors in the P40 Pro's camera system, making it the world's first phone to achieve such a feat. The Mate 30 Pro featured a quad-camera set-up, including a 50x digital zoom and a 5x optical zoom, which catapulted it into the mobile hall of fame. Optical zoom is achieved by switching from a wide-angle camera to a telephoto camera; the magnification number reflects the ratio of those two focal lengths. Using the telephoto camera without "pinching in" results in a higher-quality image than digital zoom, which is what happens when you pinch the screen of your phone while using the main camera -- or when you try to zoom in beyond the telephoto camera's capabilities. According to GizChina, the P40 Pro's rear camera will come with a 52-megapixel Sony IMX700 sensor, 10 megapixels more than the P30 Pro's rear camera. The 52-megapixel sensor is significantly lower in resolution than the 108-megapixel sensor in Samsung's Galaxy S20 Ultra, but reports suggest the new sensor can bring bigger pixels and better low-light image quality.
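As a quick aside on the arithmetic, the optical zoom factor is simply the ratio of the telephoto and wide-angle equivalent focal lengths. The focal lengths below are assumptions chosen for illustration, not the P40 Pro's actual camera specs:

```python
def optical_zoom(wide_mm, tele_mm):
    """Optical zoom factor: ratio of the telephoto to the wide-angle
    equivalent focal length (both arguments are illustrative values)."""
    return tele_mm / wide_mm

# e.g. a 24 mm-equivalent wide-angle paired with a 240 mm-equivalent
# periscope telephoto would give 10x optical zoom
print(optical_zoom(24, 240))   # 10.0
```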



Quote for the day:


"The captain of a ship can run a great ship, but he can't do anything about the tides." -- Matthew Norman


Daily Tech Digest - December 30, 2019

Doing the right thing: The rise of ethics in tech

"Culture means a lot of things," Schlesinger continued. "Culture in the broadest terms—in terms of tools, processes, norms, narratives—is bringing all of the things that ladder up to creating the kind of organization that is needed to then build the kind of products, features, and tools that society can benefit from." Today, we're at the point with ethics in technology that we were with automobiles in 1966, he noted, after Ralph Nader's 1965 book, "Unsafe at Any Speed," exposed and heightened awareness around the dangerous engineering practices involved in building cars at the time, resulting in new safety initiatives. "We've all awakened, and it's kind of unique that we're even having this conversation at a mainstream tech conference," Schlesinger said. "We're at the beginning stages of this evolution toward that kind of informed, just, rewarding culture in tech that holds itself accountable for the kinds of things we want to build. And ultimately, that is about showing our moral math." However, Paula Goldman, chief ethical and humane use officer at Salesforce, argued that we're not in 1966 but the early 1900s, with its waves of innovation and new norms.



Financial Services Could Never Do This Before

The cloud offers a tremendous new opportunity to scale your infrastructure on demand and offload some of the expense of data management, especially as it relates to new workloads or testbed environments. Yet the reality is that for most financial services institutions, much of the data resides on-premises in data centers and will continue to for a long time, dictated by regional jurisdictions, data security concerns or just a historical preference to control the data. Financial services organizations need a new approach. Flexibility to manage data across environments is critical. Today, organizations need an enterprise data cloud that offers the ability to ingest, process, store, analyze, and model any type of data (structured, unstructured, or semi-structured), regardless of where it lands: at the edge, on premises, in the data center, or in any public, private, or hybrid cloud.


GDPR: Moving Beyond Compliance


While more organisations move to develop a senior leadership approach to data privacy, in the year and a half since GDPR, a growing number of businesses are trying to put data privacy on the radar of their entire employee base. In these organisations, it is becoming everyone’s mission to have an understanding of provenance and the use of information, with everyone taking accountability for how the organisation collects, uses, and shares personal information. The idea of accountability is that “we say what we do and we do what we say” and, importantly, “we stand by doing what we do.” This culture of accountability is something that is also being extended to how organisations talk to their customers about data privacy. Increasingly, businesses are being open and inclusive, telling customers about what they are doing with personal information and how they are protecting it. In doing so, they recognise the need to close the gap in terms of the expectations, responsibilities, and actions relevant to privacy protections and information ethics. With big data breaches, such as recent ones that exposed the data of almost 400 million people, it is no wonder that the general public is becoming wary about parting with their personal information.


Cisco 2020: Challenges, prospects shape the new year


Cisco is attacking the cloud provider market by addressing its hunger for higher bandwidth and lower latency. At the same time, the vendor will offer its new technology to communication service providers. Their desire for speed and higher performance will grow over the next couple of years as they rearchitect their data centers to deliver 5G wireless services to businesses. For the 5G market, Cisco could combine Silicon One with low-latency network interface cards from Exablaze, which Cisco plans to acquire by the end of April 2020. The combination could produce exceptionally fast switches and routers to compete with other telco suppliers, including Ericsson, Juniper Networks, Nokia and Huawei. Startups are also targeting the market with innovative routing architectures. "Such a move could give Cisco an edge," said Tom Nolle, president of networking consultancy CIMI Corp., in a recent blog.


Don’t Let Impostor Syndrome Derail Your Next Interview


Even when you’re well prepared for an interview and know that you’re perfectly qualified for the job, it can still be a nerve-racking experience to walk into a room full of strangers and prepare to be judged. To manage your jitters, start by controlling the controllable elements of your interview experience. If you’re worried about arriving punctually, for example, try taking multiple routes to your destination before the day of the interview to see which one gets you there fastest, with the least amount of traffic. Managing nervousness around the interview itself is another area where you can be proactive. In Cliff’s case, he decided to build in extra time before the interview for a 10-minute walk around the block. During this scheduled pre-meeting stroll, Cliff planned to focus on deep breathing to help ratchet down his stress response. I recommended that while walking, he take a minute or two to inhale for a count of four seconds, hold his breath for two seconds, and then exhale for a count of four seconds. He found this process deeply calming, and it allowed him to enter the interview setting feeling more confident and settled.


5 Lessons George Lucas Taught Us About Innovation

Experimentation is an important part of any innovative team. In 1979, Lucas created The Graphics Group as part of Lucasfilm’s computer division and hired Edwin Catmull to lead it. The goal of this group was to invent new digital production tools for use in live-action films. They were successful in this goal and even created software used in medical and satellite imagery. However, Catmull’s team really longed to create full-length computer-generated imagery (CGI) animated films. As they struggled to build a profitable business, neither was achieving his goals. Lucas put it this way: “I didn’t want to run a company that sold software, and John [Lasseter] and Ed wanted to make animated films.” Eventually it became clear that… Someone had to be the first to push the boundaries of blending CGI and live action beyond short special-effects shots. No longer was storytelling constrained by the limitations of a human actor. This risk gave other filmmakers a platform to build from, slowly crafting new characters to bring us to where we are today, where CGI characters are nearly indistinguishable from human ones.


The Evolution Of Data Protection

DLP is only as good as the classification regime the organization enforces, and classification is always too rigid to keep up with fluid data movement. For DLP to prevent data egress, data must be classified correctly, but classification is complicated and fragile: what is sensitive today is not sensitive tomorrow, and vice versa. Classification turns into an endless battle of users trying to manage the classification of data, and ultimately both classification and DLP deteriorate over time. DLP also adds extremely high operational overhead, as it requires users to be classification superstars, and even then, mistakes will happen. Desjardins Group, a Canadian bank, recently made news for a malicious insider who obtained information on 2.7 million customers and over 170,000 businesses. The exact details of the breach haven't been made public yet, but DLP solutions are standard in all financial institutions. PGP's encryption is a privacy tool: users can encrypt their data so others can't access it, but PGP fails once users try to share data with other users.
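A toy pattern-based classifier shows why this kind of classification is so fragile. The patterns below are invented for illustration and are far cruder than any real DLP rule set:

```python
import re

# Illustrative DLP-style patterns only; real products use far richer rules
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def classify(text):
    """Tag text with every sensitive category whose pattern matches."""
    return sorted(tag for tag, rx in PATTERNS.items() if rx.search(text))

print(classify("SSN 123-45-6789 on file"))   # ['ssn']
# Fragility: trivially reformatting the value evades the pattern
print(classify("SSN 123 45 6789 on file"))   # []
```

The second call is the whole problem in miniature: the data is just as sensitive, but the rule no longer fires, so either the rules grow endlessly or the DLP silently misses egress.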


California’s privacy law means it’s time to add security to IoT

California law requires IoT to have security.
If you think about the evolution of the marketplace, we’re at a state now where the technology has gotten us to a certain point. We have connectivity. Wi-Fi has gotten to a point where it’s ubiquitous. Access to the internet is pretty pervasive around the world. That’s spun this billions-of-units vision, saying that everything is going to be connected. That’s interesting, and from a technology perspective, we’re seeing that in our houses. We see the ubiquity of these connected devices in our homes. But what quickly happens is you get what I call a normative period where societal issues come to the fore, the biggest one being privacy. You go into this normative period now where everyone says we need privacy, and then you have to have some sort of governance over the devices to create an environment where you can deliver that capability in a cost-effective way. What I’m saying specifically, as it relates to privacy today, is that there are no standards. There is no threshold. Therefore, these devices can be anywhere from having zero security capability to everything in between, across the spectrum.


IoT vendor Wyze confirms server leak

Song confirmed that the leaky server exposed details such as the email addresses customers used to create Wyze accounts, nicknames users assigned to their Wyze security cameras, WiFi network SSIDs, and, for 24,000 users, Alexa tokens used to connect Wyze devices to Alexa devices. First, the Wyze exec denied that Wyze API tokens were exposed via the server; in its blog post, Twelve Security claimed it found API tokens that would have allowed hackers to access Wyze accounts from any iOS or Android device. Second, Song denied Twelve Security's claim that Wyze was sending user data back to an Alibaba Cloud server in China. Third, Song clarified Twelve Security's claim that Wyze was collecting health information: the company only collected health data from 140 users who were beta-testing a new smart scale product. Song didn't deny that Wyze collected height, weight, and gender information. He did, however, deny other details. "We have never collected bone density and daily protein intake," the Wyze exec said. "We wish our scale was that cool."


How one bizarre attack laid the foundations for the malware taking over the world


The first instance of what we now know as ransomware was called the AIDS Trojan because of who it was targeting – delegates who'd attended the World Health Organization AIDS conference in Stockholm in 1989. Attendees were sent floppy discs containing malicious code that installed itself onto MS-DOS systems and counted the number of times the machine was booted. When the machine was booted for the 90th time, the trojan hid all the directories and encrypted the names of all the files on the drive, making it unusable. Victims instead saw a note claiming to be from 'PC Cyborg Corporation' which said their software lease had expired and that they needed to send $189 by post to an address in Panama in order to regain access to their system. It was a ransom demand for payment in exchange for the victim regaining access to their computer: that made this the first ransomware. Fortunately, the encryption used by the trojan was weak, so security researchers were able to release a free decryption tool – and so started a battle that continues to this day, with cyber criminals developing ransomware and researchers attempting to reverse engineer it.



Quote for the day:


"The leader has to be practical and a realist, yet must talk the language of the visionary and the idealist." -- Eric Hoffer


Daily Tech Digest - December 27, 2019

Exposed databases are as bad as data breaches, and they're not going anywhere


If your data is exposed in an unsecured database, experts say you have to treat the situation the same way you would if the data had been stolen. "You need to engage proactively in minimizing your risk," said Eva Velasquez, president of the Identity Theft Resource Center. Medical service provider Tu Ora Compass Health said the same thing to nearly 1 million patients when it revealed that its poorly configured website had exposed patient health insurance data. Patients should "assume the worst" and act as though hackers had accessed the data, the company said. What's the worst that can happen? Stolen information makes it easier for identity thieves to pretend to be you. When combined with what you share on social media, for example, your medical record number could allow someone else to use your health insurance. The Identity Theft Resource Center hosts a service called Breach Clarity that helps you decide what steps to take after your data is compromised. The advice depends on what kind of information was involved. If your log-in credentials are exposed, you'll want to reset your passwords. If it's your Social Security number, you'll want to watch your credit report for signs that someone's opening up new lines of credit in your name.



Introduction to ELENA Programming Language

Methods in ELENA are similar to methods in C# and C++, where they are called "member functions". Methods may take arguments and always return a result (if no result is provided, the "self" reference is returned). The method body is a sequence of executable statements. Methods are invoked from expressions, just as in other languages. There is an important distinction between "methods" and "messages". A method is a body of code while a message is something that is sent. A method is similar to a function; in this analogy, sending a message is similar to calling a function. An expression which invokes a method is called a "message-sending expression". ELENA terminology makes a clear distinction between "message" and "method". A message-sending expression sends a message to the object; how the object responds to the message depends on the class of the object. Objects of different classes respond to the same message differently, since they invoke different methods. Generic methods may accept any message with the specified signature.
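The message/method distinction is loosely analogous to dynamic dispatch in other object-oriented languages. A rough Python analogy (this is not ELENA syntax; the classes and names below are invented for illustration):

```python
# Rough analogy: the "message" is the name being sent; which "method"
# (body of code) actually runs depends on the receiving object's class.

class Duck:
    def speak(self):            # a method: a body of code
        return "quack"

class Dog:
    def speak(self):            # a different method behind the same message
        return "woof"

def send(receiver, message):
    """Send a message; the receiver's class picks the method to invoke."""
    return getattr(receiver, message)()

send(Duck(), "speak")   # returns "quack"
send(Dog(), "speak")    # returns "woof" - same message, different method
```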


Amazon now allows developers to combine tools such as Amazon QuickSight, Aurora, and Athena with SQL queries and thus access machine learning models more easily. In other words, developers can now access a wider variety of underlying data without any additional coding, which makes the development process faster and easier. Amazon’s Aurora is a MySQL-compatible database that can automatically pull data into an application to run whatever machine learning model the developer assigns to it. Then, developers can use the company’s serverless query service, Athena, to obtain additional sets of data more easily. Finally, the last piece of the puzzle is QuickSight, Amazon’s tool for creating visualizations based on available data. The combination of these three tools provides a far more efficient approach to the development of machine learning models. During the announcement, Wood also mentioned a lead-scoring model that developers can use to pick the sales targets most likely to convert.


Ranking the obstacles involved in firewall management, 67% of those surveyed pointed to the initial deployment and tuning measures, 67% cited the process of implementing changes, and 61% referred to the procedure for verifying changes. Cost is another hurdle with firewalls. Depending on the size of the organization and the type of firewall, a single unit can cost anywhere from hundreds to thousands to tens of thousands of dollars and up. Some 68% of the respondents said they have a hard time receiving the necessary initial budget to purchase firewalls, while 66% bump into difficulty getting the funding to operate and maintain them. Tweaking the rules on a firewall is yet another taxing task. Changes to code, applications, and processes can occur fast and furiously, requiring frequent updates to firewall rules. But a single firewall update can take one to two weeks, according to the survey. And such changes can sometimes be trial and error. More than two-thirds of the respondents cited the difficulty of testing changes to firewall rules before deploying them. The lack of a proper testing platform can lead to misconfigured rules that break applications.
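The pre-deployment check that respondents say they lack can be as simple as replaying the application flows the business depends on against a candidate rule set before it ships. A minimal Python sketch of that idea (the rule format, addresses, and flows are all invented for illustration):

```python
# Sketch: dry-run a candidate firewall rule set against known flows and
# fail fast if any expected behavior breaks, instead of discovering a
# misconfigured rule after it takes down an application.

def evaluate(rules, src_ip, dst_port):
    """Return the action of the first matching rule; default-deny."""
    for action, port, src_prefix in rules:
        if port is not None and port != dst_port:
            continue
        if not src_ip.startswith(src_prefix):
            continue
        return action
    return "deny"

# Candidate rules: (action, dst_port or None for any, source prefix).
candidate_rules = [
    ("allow", 443, ""),          # HTTPS from anywhere
    ("allow", 5432, "10.0."),    # database only from the internal net
]

# Flows the applications depend on, with the verdicts they require.
expected_flows = [
    ("203.0.113.7", 443, "allow"),
    ("10.0.1.9", 5432, "allow"),
    ("203.0.113.7", 5432, "deny"),   # external DB access must stay blocked
]

for src, port, want in expected_flows:
    got = evaluate(candidate_rules, src, port)
    assert got == want, f"rule change breaks flow {src}:{port}: {got}"
print("candidate rule set preserves all expected flows")
```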


Hugh Owen, Executive Vice President, Worldwide Education at MicroStrategy asserts "Enterprise organizations will need to focus their attention not just on recruiting efforts for top analytics talent, but also on education, reskilling, and upskilling for current employees as the need for data-driven decision making increases—and the shortage of talent grows." Skills shortages show up everywhere, especially in AI. John LaRocca, Managing Director for Europe/NA Operations at Fractal Analytics, comments that "The demand for AI solutions will continue to outpace the availability of AI talent, and businesses will adapt by enabling more applications to be developed by non-AI professionals, resulting in the socialization of the process."  In that same vein, noted industry expert Marcus Borba, at Borba Consulting, remarks, in a report from MicroStrategy, that "the demand for development in machine learning has increased exponentially. This rapid growth of machine learning solutions has created a demand for ready-to-use machine learning models that can be used easily and without expert knowledge."


Google Publishes Its BeyondProd Cloud-native Security Model

In zero-trust networking, protection of the network at its outer perimeter remains essential. However, going from there to full zero-trust networking requires a number of additional provisions. This is by no means easy, given the lack of standard ways to do it, adds Brunton-Spall: "You can understand [it] from people who've done this, custom-built it. If you want to custom-build your own, you should follow the same things they do. Go to conferences, learn from people who do it." Filling this gap, Google's white paper sets out a number of fundamental principles which complement the basic idea of no trust between services. Those include running code of known provenance on trusted machines, creating "choke points" to enforce security policies across services, defining a standard way to roll out changes, and isolating workloads. Most importantly, these controls mean that containers and the microservices running inside them can be deployed, communicate with one another, and run next to each other, securely, without burdening individual microservice developers with the security and implementation details of the underlying infrastructure.


What if we’re leading change all wrong? The book “Make it Stick: The Science of Successful Learning,” by Peter C. Brown, Henry L. Roediger III and Mark A. McDaniel highlights stories and techniques based on a decade of collaboration among eleven cognitive psychologists. The authors claim that we’re doing it all wrong. For example, we attempt to solve the problem before learning the techniques to do so successfully. Using the right techniques is one of the concepts the authors suggest makes learning stickier. Rolling out data-management initiatives is complex and usually involves a cross-functional maze of communications, processes, technologies, and players. Our usual approach is to push information onto our business partners. Why? Well, of course, we know best. What if we changed that approach? This would be uncomfortable, but we are talking about getting other people to change, so maybe we should start with ourselves. Business relationship managers stimulate, surface, and shape demand. They’re evangelists for IT, building organizational convergence to deliver greater value. There’s one primary method to accomplish this: collaboration.


Setting Management Expectations in Machine Learning

Business leaders often forget that machine learning algorithms are not a panacea that can be thrust into a given use case and expected to magically deliver value on their own. Algorithms rely on large, accurate datasets to train and generate predictions. Data science is just the end result of a long process of data collection, cleansing, and tagging that requires significant investment. That’s why it’s important to have a robust Data Governance strategy in place at your business. Unfortunately, management often forgets this. Having failed to make the necessary investments in Data Governance, they nonetheless expect their data scientists to “figure it out.” Even where management has made the necessary investments in Data Governance and you have access to a large, healthy internal dataset, there are certain functions you will still have difficulty performing. These most prominently include anything that requires you to leverage customer data. The frequency of widespread breaches and scandals involving the misuse of data, along with the accompanying rise in government regulation, has made it more difficult than ever to leverage customer data within businesses’ ML systems.



"As more states follow California's lead and push forward with new privacy laws, we'll likely see increased pressure on the federal government to take a more proactive role in the privacy sphere," said Mary Race, a privacy attorney in California. The Senate Commerce Committee held a hearing in December to discuss two potential frameworks, both of which seek to set a federal standard and designate regulators to enforce the law. Lawmakers expressed bipartisan support for privacy laws, though no legislation has moved forward. Still, several key aspects of a prospective law were up for debate at the hearing. The Republican framework, submitted by Sen. Roger Wicker of Mississippi, would preempt state data privacy laws and would limit enforcement to the FTC. Sen. Maria Cantwell of Washington, who submitted the Democratic bill, has said she's considering letting consumers directly sue companies; her bill would not supersede state laws. While federal law supersedes state law in general, many federal laws leave room for states to enact tougher requirements on top of the baseline set by US legislators.



How Data Subject Requests are at the heart of protecting privacy

Not only has data proliferated, but it’s also mutated into derivative forms. Customer data is often collected across multiple channels without being linked to a master identifier, and the definition of what is considered PII is continuing to change. The other reason the DSR search process is difficult is that many organizations still rely on questionnaires and spreadsheets for data discovery. These manual processes are inefficient at best, and incredibly inaccurate at worst. Consider that a single bank transaction might be replicated across 100 systems. Successfully fulfilling a DSR for that customer could require multiple people to manually search all those systems, and the accuracy and completeness may be questionable. Not only would the individual’s privacy be compromised, but the bank would also have to defend the results with regulators. In an age of big data and automation, relying on manual processes to fulfill privacy laws seems unbelievably arcane, if not impossible given the sheer volume of data companies have. Fortunately, many organizations are beginning to realize the complexity and importance of the DSR process and are looking to automate it.
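What such automation replaces can be seen in a toy sketch: instead of people manually searching each system with spreadsheets, a single routine fans the request out across every system of record and collects the matches. The system names and record layouts below are invented stand-ins, and real implementations must also handle data that is not linked to a master identifier:

```python
# Sketch: fulfill a Data Subject Request by searching every system of
# record for the subject's identifier, rather than relying on manual,
# questionnaire-driven discovery.

systems = {
    "crm":     [{"email": "ana@example.com", "name": "Ana"}],
    "billing": [{"email": "ana@example.com", "last4": "4242"},
                {"email": "bob@example.com", "last4": "1111"}],
    "support": [{"email": "bob@example.com", "ticket": 812}],
}

def fulfil_dsr(subject_email):
    """Return every record tied to the subject, keyed by source system."""
    found = {}
    for name, records in systems.items():
        hits = [r for r in records if r.get("email") == subject_email]
        if hits:
            found[name] = hits
    return found

result = fulfil_dsr("ana@example.com")
# result covers the crm and billing systems; support holds nothing on Ana
```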



Quote for the day:


"People not only notice how you treat them, they also notice how you treat others." -- Gary L. Graybill


Daily Tech Digest - December 11, 2019

Segment Routing uses a routing technique known as source packet routing. In source packet routing, the source or ingress router specifies the path a packet will take through the network, rather than the packet being routed hop by hop based upon its destination address. Source packet routing is not a new concept; in fact, it has existed for over 20 years. As an example, MPLS is one of the most widely adopted forms of source packet routing, using labels to direct packets through a network. In an MPLS network, when a packet arrives at an ingress node, an MPLS label is prepended to the packet, which determines the packet’s path through the network. While SR and MPLS are similar, in that they are both source-based routing approaches, there are a few differences between them. One key difference lies in a primary objective of SR, documented in RFC 7855: “The SPRING [SR] architecture MUST allow putting the policy state in the packet header and not in the intermediate nodes along the path.”
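The core idea, that the ingress node writes the entire path into the packet header while intermediate nodes hold no per-flow state, can be sketched in a few lines of Python. The node names and encapsulation format below are invented; real SR encodes segments as MPLS labels or IPv6 segment identifiers:

```python
# Sketch of source packet routing: the ingress router prepends the whole
# path (a stack of segment labels) to the packet; each node pops the top
# label and forwards. The policy lives in the packet header, not in the
# intermediate nodes (cf. the RFC 7855 requirement quoted above).

def ingress(payload, path):
    """Encapsulate the payload with the segment list chosen at the source."""
    return {"segments": list(path), "payload": payload}

def forward(packet, here):
    """At each node: pop the top segment, return the next hop (or None)."""
    assert packet["segments"][0] == here, "packet arrived at the wrong node"
    packet["segments"].pop(0)
    return packet["segments"][0] if packet["segments"] else None

pkt = ingress("hello", ["R1", "R4", "R7"])   # explicit path chosen at source
hop, trail = "R1", []
while hop is not None:
    trail.append(hop)
    hop = forward(pkt, hop)
# trail == ["R1", "R4", "R7"]: every hop came from the header the
# ingress router wrote, and no intermediate node stored any path state
```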


Never Mind Consumers, This Was a Year of Steady Infrastructural Progress

Much of the traction that does not come from exchanges or trading has been generated decidedly in infrastructure layers in 2019. Node infrastructure provider Blockdaemon, having recognized the market’s propensity to proliferate new decentralized networks, is generating revenue across an impressive 22 such networks today and continues to grow month over month. The Graph is serving over 400 public smart contract subgraphs, with request volume clocking millions of daily data queries. Meanwhile, 3Box’s self-sovereign identity and data solution is rapidly integrating across the Ethereum ecosystem, within wallets like MetaMask and many of the new user onboarding solutions, like Portis and Authereum, and even governance experiment MolochDAO.  Blockchain’s road to mainstream adoption depends on institutional backing of businesses that support blockchain infrastructure and enable traditional investors both to capitalize and participate in digital asset networks. As such, the compliance levels of exchanges have been increasing to support institutional clients.


5G and Me: And the Golden Hour


The connected ambulance 5G network slicing concepts were demonstrated at the Mobile World Congress (MWC) in Barcelona, Spain in Feb 2019 by Dell EMC Cork Centre of Excellence (CoE). Network slicing is a type of virtual networking architecture similar to software-defined networking (SDN) and network functions virtualization (NFV) whose goal is software-based network automation. This technology allows the creation of multiple virtual networks on a shared physical infrastructure. ... The goal for the future of connected care in emergencies would be to identify the conditions for Stroke, CHF & MI; measure and score at site, predictively collect Electronic Medical Record (EMR) metadata in conjunction with specific image studies via DICOM (Digital Imaging and Communications in Medicine) and combine this with the metadata from disease-specific epidemiological studies for that geographic region — all within the “golden hour”. This combinatorial analysis at the “point of care” is the future and can prevent disability and death at scale — especially since not all the ambulance visits are emergencies.


Google proposes hybrid approach to AI transfer learning for medical imaging


In transfer learning, a machine learning algorithm is trained in two stages. First, there's pretraining, where the algorithm is generally trained on a benchmark data set representing a diversity of categories. Next comes fine-tuning, where it is further trained on the specific target task of interest. The pretraining step helps the model to learn general features that can be reused on the target task, boosting its accuracy. According to the team, transfer learning isn’t quite the end-all, be-all of AI training techniques. In a performance evaluation that compared a range of model architectures trained to diagnose diabetic retinopathy and five different diseases from chest x-rays, a portion of which were pretrained on an open source image data set, they report that transfer learning didn’t “significantly” affect performance on medical imaging tasks. Moreover, a family of simple, lightweight models performed at a level comparable to the standard architectures. In a second test, the team studied the degree to which transfer learning affected the kinds of features and representations learned by the AI models. They analyzed and compared the hidden representations in the different models trained to solve medical imaging tasks, computing similarity scores for some of the representations between models trained from scratch and those pretrained on ImageNet.
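The two-stage procedure can be illustrated with a toy sketch: a frozen "pretrained" feature extractor plus a tiny head trained only on the target task. Everything below (the features, data, and labels) is an invented stand-in, not the setup from the Google study:

```python
# Toy sketch of two-stage transfer learning: stage 1 produces a feature
# extractor that is then frozen; stage 2 fine-tunes only a small head
# on the target task.

def features(x):
    """Stand-in for representations learned during pretraining (frozen)."""
    return (x[0] + x[1], x[0] - x[1])

def predict(w, x):
    f = features(x)
    return 1 if w[0] * f[0] + w[1] * f[1] > 0 else 0

# Tiny target task: label is 1 when the second input exceeds the first.
data = [([1, 2], 1), ([2, 1], 0), ([0, 3], 1),
        ([3, 0], 0), ([1, 3], 1), ([3, 1], 0)]

# Fine-tuning: perceptron updates touch only the two-parameter head,
# never the feature extractor.
w = [0.0, 0.0]
for _ in range(20):
    for x, label in data:
        if predict(w, x) != label:
            f = features(x)
            step = 1 if label == 1 else -1
            w = [w[0] + step * f[0], w[1] + step * f[1]]

accuracy = sum(predict(w, x) == label for x, label in data) / len(data)
# accuracy == 1.0: the reused features make the target task linearly
# separable, so the tiny head suffices
```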


Robotic exoskeletons: Coming to a factory, warehouse or army near you, soon 

Ford is thought to be one of the bigger users of exoskeletons to date, but other car makers are deploying exoskeletons, although several have opted for build-your-own rather than off the shelf systems. Hyundai debuted its own exoskeleton vest, the VEX, earlier this year. The back-worn exoskeleton "is targeted at production-line workers whose job is primarily overhead, such as those bolting the underside of vehicles, fitting brake tubes, and attaching exhausts", Hyundai said, and is expected to be rolled out at Hyundai plants. GM meanwhile has teamed up with NASA to create a robotic glove that can help increase the amount of force a wearer can exert when gripping an object or lifting up a piece of equipment for long periods, cutting the likelihood of strain or injury. Closer to home, the construction industry is also shaping up to be another significant user of exoskeletons. Builder Wilmott Dixon, for example, started piloting the ExoVest at a Cardiff site last year. One factor driving the rollout of exoskeletons in both the construction and auto industries is the possibility of cutting worker injuries as well as enabling skilled staff to work for longer.


What does it mean to think like a data scientist?

Art is a very important part of that, because what we find in a lot of our data science engagements is there's a lot of exploration of what might be possible, the realm of what's possible. So, we tried to empower the power of ‘might,’ right? That might be a good idea, that might be something, because if you don't have enough might ideas, you never have anything, any breakthrough ideas. And so, this art of thinking like a data scientist, this kind of says, 'Yeah, there's a data science process.' But think about it as guardrails, not railroad tracks. And we're going to bounce in between these things. And oh, by the way, it's really important that your business stakeholders, your subject matter experts, also understand how to think like a data scientist in this kind of non-linear creative kind of fashion, so you come up with better ideas. Because we're all in search of variables and metrics that might be better predictors of performance, right? And the data science team will have some ideas from their past experience. 



Teams are struggling to implement these new tools and 71 percent said that they are adding security technologies faster than they are adding the capacity to proactively use them. This added complexity is also compromising their threat response, with 69 percent of security decision makers surveyed saying that their security team currently spends more time managing security tools than effectively defending against threats. To make matters worse, a majority of enterprises are less secure today as a result of security tool sprawl, and over half (53%) say their security team has reached a tipping point where the excessive number of security tools in place adversely impacts their organization's security posture. ReliaQuest's CEO, Brian Murphy, provided further insight on the report's findings, saying: "Cyber threats continue to rise and require companies to mitigate risk. While it's tempting to think another piece of technology will solve the problem, it's far from true -- in fact, this survey proves more tools can worsen enterprise security by adding complexity without improving outcomes."


There’s No Opting Out of the California Consumer Privacy Act

For starters, GDPR applies to all European data but is a minimum requirement; individual countries in the EU have their own laws that are often more restrictive. CCPA, by contrast, applies to California data only and excludes any data that is already covered by a federal law, such as HIPAA or GLBA. While GDPR protects personal information (PI) that could potentially identify a specific individual -- including name, address, telephone number and Social Security number (SSN) -- CCPA goes further to include product purchase history, social media activity, IP addresses, and household information. Under CCPA, companies are required to include a single, clear and conspicuous "Do Not Sell My Personal Information" link on their homepages; GDPR, by contrast, offers various opt-out rights, each of which requires individual action. Under GDPR, administrative fines can reach 20 million euros or 4% of annual global revenue, whichever is greater. For CCPA, the California Attorney General can fine companies $2,500 per violation or up to $7,500 for each intentional violation.


Google Chrome can now warn you in real time if you're getting phished


Between July and September, Google sent more than 12,000 warnings about state-sponsored phishing attacks targeting its users in the US. According to Verizon's annual cybersecurity report, phishing is the leading cause of data breaches, and Google said in August that it blocked about 100 million phishing emails every day. But phishing links don't just come in emails: They can also appear in malicious advertisements, or through direct messages on chat apps. For those of you using a Chrome browser, Google is launching an extra level of protection against phishing through real-time checks on site visits. You can turn it on by enabling "Make searches and browsing better" in your Chrome settings. This protection was already available for Chrome's Safe Browsing mode, which checked the URL of every website visited and made sure it was not on Google's block list. The block list is saved on devices and only synced every 30 minutes, allowing savvy hackers to bypass the filter by creating a new phishing URL before the list updates.
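The 30-minute sync gap described above is easy to see in a toy sketch: the client answers from a local copy of the block list and re-syncs only periodically, so a URL created after the last sync slips through until the next refresh. Hashing and the real Safe Browsing protocol are omitted, and all names below are invented:

```python
# Sketch of the cached-blocklist check described above: a locally saved
# block list is consulted for every URL, and re-synced only every 30
# minutes, leaving a window in which brand-new phishing URLs pass.
import time

SYNC_INTERVAL = 30 * 60          # 30 minutes, as in the article

class LocalBlocklist:
    def __init__(self, fetch_fn, now=time.time):
        self.fetch_fn = fetch_fn     # pulls the server's current list
        self.now = now               # injectable clock, for testing
        self.cache = set()
        self.last_sync = float("-inf")

    def is_blocked(self, url):
        if self.now() - self.last_sync >= SYNC_INTERVAL:
            self.cache = set(self.fetch_fn())   # periodic re-sync
            self.last_sync = self.now()
        return url in self.cache                # answered from the cache

# Simulated clock and server-side list.
clock = [0.0]
server_list = {"https://phish.example/login"}
bl = LocalBlocklist(lambda: server_list, now=lambda: clock[0])

bl.is_blocked("https://phish.example/login")   # True: caught on first sync
server_list.add("https://new-phish.example")   # attacker spins up a new URL
bl.is_blocked("https://new-phish.example")     # False: local cache is stale
clock[0] += SYNC_INTERVAL                      # 30 minutes pass
bl.is_blocked("https://new-phish.example")     # True: caught after re-sync
```

A real-time check, by contrast, consults the server on each visit, closing that window at the cost of a network round trip per lookup.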


Big Changes Are Coming to Security Analytics & Operations

Nearly two-thirds (63%) of survey respondents claim that security analytics and operations are more difficult today than they were two years ago. This increasing difficulty is being driven by external changes and internal challenges. From an external perspective, 41% of security pros say that security analytics and operations are more difficult now due to rapid evolution in the threat landscape, and 30% claim that things are more difficult because of the growing attack surface. Security teams have no choice but to keep up with these dynamic external trends. On the internal side, 35% of respondents report that security analytics and operations are more difficult today because they collect more security data than they did two years ago, 34% say that the volume of security alerts has increased over the past two years, and 29% complain that it is difficult to keep up with the volume and complexity of security operations tasks. Security analytics/operations progress depends upon addressing all these external and internal issues.



Quote for the day:


"Growth is painful. Change is painful. But nothing is as painful as staying stuck somewhere you don't belong." -- Mandy Hale