
Daily Tech Digest - December 04, 2023

Proactive, not reactive: the path to ensuring operational resilience in cybersecurity

Operational resilience goes beyond ensuring business continuity by mitigating disruptions as and when they occur. Resilience requires a proactive approach to maintaining stable and reliable digital systems, regardless of the severity of threat incidents. This "bankability" (excuse the pun) is critical to preserving public trust and confidence in the global financial system. Given the interconnectedness of financial firms with external third parties, any plan for operational resilience needs to address multiple lines of communication, automated systems of interaction and information sharing, and a growing attack surface. ... The dependence of the financial sector on the telecom and energy industries, and the increasingly global nature of the sector, means that operational resilience exercises need to be not just cross-border but cross-sector too. Today, national or even global-level threats are a reality, emphasizing the need to include government partners in the exercises. After all, protecting critical private infrastructure safeguards a nation's financial stability.


Black-Box, Gray Box, and White-Box Penetration Testing: Importance and Uses

Gray-box penetration testing can simulate advanced persistent threat (APT) scenarios in which the attacker is highly sophisticated and operates on a longer time scale (CISA, 2023). In these types of attacks, the threat actor has collected a good deal of information about the target system—similar to a gray-box testing scenario. Gray-box penetration testing allows many organizations to strike the right balance between white-box and black-box testing. ... The main disadvantage of gray-box testing is that it can be too “middle-of-the-road” when compared with black-box or white-box testing. If organizations do not strike the right balance during gray-box testing, they may miss crucial insights that would have been found with a different technique. ... Black-box, gray-box, and white-box testing are all valuable forms of penetration testing, each with its own pros, cons, and use cases. Penetration testers need to be familiar with the importance and use cases of each type of test to execute them most efficiently, using the right tools for each one.


The arrival of genAI could cover critical skills gaps, reshape IT job market

While genAI offers the promise of clear business benefits, education is key, and collaboration with cybersecurity and risk experts is needed to help establish an environment where the technology can be used safely, securely, and productively, according to Emm. Hurdles to adopting AI persist. Those issues include high costs, uncertain return on investment (ROI), the need to upskill entire staffs, and potential exposure of sensitive corporate data to unfamiliar automation technology. Few organizations, however, have put appropriate safeguards in place to guard against some of genAI's most well-known flaws, such as hallucinations, exposure of corporate data, and data errors. Most are leaving themselves wide open to the acknowledged risks of using genAI, according to Kaspersky. For example, only 22% of C-level executives have discussed putting rules in place to regulate the use of genAI in their organizations — even as they eye it as a way of closing the skills gap. Cisco CIO Fletcher Previn, whose team is working to embed AI in back-end systems and products, said it's critical to have the policies, security, and legal guardrails in place to be able to "safely adopt and embrace AI capabilities other vendors are rolling out into other people’s tools."


State of Serverless Computing and Event Streaming in 2024

Traditional stream processing usually involves an architecture with many moving parts managing distributed infrastructure and using a complex stream processing engine. For instance, Apache Spark, one of the most popular processing engines, is notoriously difficult to deploy, manage, tune and debug (read more about the good, bad and ugly of using Spark). Implementing a reliable, scalable stream processing capability can take anywhere between a few days and a few weeks, depending on the use case. On top of that, you also need to deal with continuous monitoring, maintenance and optimization. You may even need a dedicated team to handle this overhead. All in all, traditional stream processing is challenging, expensive and time consuming. In contrast, serverless stream processing eliminates the headache of managing a complex architecture and the underlying infrastructure. It’s also more cost effective, since you pay only for the resources you use. It’s natural that serverless stream processing solutions have started to appear. 


The Glaring Gap in Your Cybersecurity Posture: Domain Security

Because domain names are used for marketing and brand initiatives, security teams may feel that protecting online domain names falls under the marketing or legal side of the business. Or, they may have left domain protection in the hands of their IT department. But, if organizations are unfamiliar with who their domain registrars even are, chances are they are unaware of the policies the registrars use and the security measures they have in place for branded, trademarked domains. Domain security should be an essential branch of cybersecurity, protecting brands online, but it is not always the highest priority for consumer-grade domain registrars. Unfortunately, adversaries are well aware of the growth in businesses’ online presence and the often minimal attention given to domain security, leading them to take a special interest in targeting corporate and/or government domain names that are left exposed. Organizations will continue to find themselves in the path of a perfect storm for domain and DNS attacks and potential financial or reputational devastation if they continue to allow the build-up of blind spots in their security posture.


Put guardrails around AI use to protect your org, but be open to changes

While a seasoned CISO might recognize that the output from ChatGPT in response to a simple security question is malicious, it’s less likely that another member of staff will have the same antenna for risk. Without regulations in place, any employee could be inadvertently stealing another company’s or person’s intellectual property (IP), or they could be delivering their own company’s IP into an adversary’s hands. Given that LLMs store user input as training data, this could contravene data privacy regulations, including GDPR. Developers are using LLMs to help them write code. When that code is ingested as training data, it can reappear in response to a prompt from another user. There is nothing that the original developer can do to control this, and because the LLM was used to help create the code, it is highly unlikely that they can prove ownership of it. This might be mitigated by using a GenAI license that helps enterprises guard against their code being used as an input for training. However, in these circumstances, imposing a “trust but verify” approach is a good idea.


Why Generative AI Threatens Hospital Cybersecurity — and How Digital Identity Can Be One of Its Greatest Defenses

Writing convincing deceptive messages isn’t the only task cyber attackers use ChatGPT for. The tool can also be prompted to build mutating malicious code and ransomware by individuals who know how to circumvent its content filters. It’s difficult to detect and surprisingly easy to pull off. Ransomware is particularly dangerous to healthcare organizations as these attacks typically force IT staff to shut down entire computer systems to stop the spread of the attack. When this happens, doctors and other healthcare professionals must go without crucial tools and shift back to using paper records, resulting in delayed or insufficient care which can be life-threatening. Since the start of 2023, 15 healthcare systems operating 29 hospitals have been targeted by a ransomware incident, with data stolen from 12 of the 15 healthcare organizations affected. This is a serious threat that requires serious cybersecurity solutions. And generative AI isn’t going anywhere — it’s only picking up speed. It is imperative that hospitals lay thorough groundwork to prevent these tools from giving bad actors a leg up.


15 Essential Data Mining Techniques

The essence of data mining lies in the fundamental technique of tracking patterns, a process integral to discerning and monitoring trends within data. This method enables the extraction of intelligent insights into potential business outcomes. For instance, upon identifying a sales trend, organizations gain a foundation for taking strategic actions to leverage this newfound insight. When it’s revealed that a specific product outperforms others within a particular demographic, this knowledge becomes a valuable asset. Organizations can then capitalize on this information by developing similar products or services tailored to the demographic or by optimizing the stocking strategy for the original product to cater to the identified consumer group. In the realm of data mining, classification techniques play a pivotal role by scrutinizing the diverse attributes linked to various types of data. By discerning the key characteristics inherent in these data types, organizations gain the ability to systematically categorize or classify related data. This process proves crucial in the identification of sensitive information.
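
To make the classification idea concrete, here is a minimal sketch using scikit-learn's DecisionTreeClassifier. The article names no tool, so the library choice, the features and the labels are all invented for illustration: a model learns from customer attributes and then categorizes new records.

```python
# Hypothetical example: classify customers by demographic attributes.
from sklearn.tree import DecisionTreeClassifier

# Invented training data: [age, region_code] -> observed outcome.
X = [[23, 1], [31, 1], [45, 2], [52, 2], [29, 1], [61, 3]]
y = ["buys", "buys", "skips", "skips", "buys", "skips"]

clf = DecisionTreeClassifier(max_depth=2, random_state=0)
clf.fit(X, y)

# Classify two new customers by the same attributes.
print(clf.predict([[27, 1], [58, 3]]))  # e.g. ['buys' 'skips']
```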


SolarWinds lawsuit by SEC puts CISOs in the hot seat

Without ongoing, open dialogue between these leaders, it’s impossible to guarantee complete awareness of the range of complications associated with potential cyber risks. Now that we’ve seen how these risks can easily extend beyond security concerns and into catastrophic financial and legal issues, it’s important that conversations about these risks are not taking place exclusively among CISOs. The roles and responsibilities of CISOs and other C-Suite executives vary dramatically, which can naturally result in siloed processes and priorities. However, to ensure alignment and effectively protect an organization from data breaches and legal recourse alike, it’s imperative that business leaders learn to “speak the same language” and share information to align their efforts and goals. CFOs and CISOs must collaborate to evaluate the relationships between cybersecurity incidents and legal risks. We can facilitate this by leveraging cyber risk quantification and management tools, which aggregate data to calculate, quantify and translate information about threats and vulnerabilities into lay terms and easily digestible data.


CTO interview: Greg Lavender, Intel

“Our confidential computing capability is also a privacy-ensuring capability,” says Lavender. “Europe is ahead in this area, with the notion of sovereign clouds. Intel partners with some of the European governments on sovereign cloud using Intel’s platforms for confidential computing. The privacy-preserving capabilities are built into these platforms, which beyond government, will also be useful in regulated industries like financial services, healthcare and telcos.” “We also see a convergence in AI that will open up a big market for our privacy-ensuring software and hardware,” says Lavender. “You spend a lot of time prepping your data, tagging your data, getting your data ready for training, usage or inference usage. You want to do that securely in a multi-tenant environment. Our platforms give you the opportunity to do your training securely between the CPU and the GPU, and then you can deploy it securely in the cloud or at the edge.” “I’m talking with a lot of CIOs about this technology, because data is now such a valuable thing. It’s what you use to train your models. You don’t want somebody else to get access to that data because then they can use it to train their models and offer competing services.”



Quote for the day:

"Success is the progressive realization of predetermined, worthwhile, personal goals." -- Paul J. Meyer

Daily Tech Digest - April 04, 2019

Is it too soon for AI in the education landscape?


Even if schools did have enough money, not only is their choice of software limited, but many heads and teachers are neither trained nor qualified to select or use even basic educational technology, let alone AI tools. There is also a widespread fear of the unknown, part of which includes the much-discussed issue of jobs being automated away. Another major concern relates to ethics, believes Elena Sinel, who is a member of the All-Party Parliamentary Group on AI and also founder of Acorn Aspirations and Teens in AI, which provide various forums for young people to learn tech skills. A key challenge in this context is in ensuring AI does not end up doing “more harm than good”, she says. “So it’s about looking at who is accountable if things go wrong – for example, what happens if there’s a data leak and who is ultimately in charge of the data? Or what happens if AI doesn’t assess students fairly or accurately in exams, for instance?” says Sinel. Such questions also fit into a wider debate around whether schools are currently set up to provide young people with the skills required for the workplace of the future, or whether fundamental change is required.



Prepare Now for Next-Generation Cyber Threats

Impacts will be felt across a range of industries. Malicious attacks may result in automated vehicles changing direction unexpectedly, high-frequency trading applications making poor financial decisions, and airport facial recognition software failing to recognize terrorists. Where machine learning systems are compromised, organizations will face significant financial, regulatory, and reputational damage, and lives will be put at risk. Nation states, terrorists, hacking groups, hacktivists, and even rogue competitors will turn their attention to manipulating machine learning systems that underpin products and services. Attacks that are undetectable by humans will target the integrity of information. Widespread chaos will ensue for those dependent on services powered primarily by machine learning. Companies should assess their offerings and dependency on machine learning systems before attackers exploit related vulnerabilities.



Rethinking reskilling: How to find key hidden talent within your organization  


To overcome the talent gap and foster adaptive workforces able to keep up with ongoing transformations in tech and industry, there is a clear need to shift from traditional L&D techniques like seminars and online training sessions to leveraging existing experts within the organization, so that we harness the collective intelligence of individuals and teams. These are employees who often already have the skills and knowledge that others need and who follow the development of those fields closely. As a result, they can curate and contextualize that knowledge better than any external teacher, making it easier for others to absorb it. Companies also need to tap what can be a hidden resource of knowledge, identifying “invisible” go-to resources; i.e., knowledgeable employees who may be currently unrecognized or perhaps are not even hierarchically high in the company structure, but seem to be go-to people for large networks of employees. Organizations can consider practices similar to Genpact’s Genome reskilling initiative, which uses advanced human network analysis techniques to identify these invaluable knowledge leaders outside of the usual suspects of widely known company subject matter experts.


A Framework for High-Value Big Data

More and more companies are achieving the monetization of data by improving efficiencies, developing new products, growing new markets, and reducing risks. Saxena talked about Netflix's original series like Orange is the New Black that are a direct result of data-driven innovation. She elaborated on the big data framework elements. Organization maturity is about the hard assets in an organization, like its strategy, data, and data quality. Every organization should have a business strategy as well as a data strategy. The internal competencies are about people, and focus on soft assets like leadership, engagement, and adaptability. Health care organizations in the field of precision health like Geisinger are taking advantage of big data and genomic sequencing to transform healthcare practices, in order to prevent people from becoming sick and to treat people more as individuals (customers), rather than just patients. Data governance initiatives should include aspects of data integration, quality, accessibility and data security.


Leading DevOps program Chef goes all in with open source


What does that mean for Chef's customers? Jacob said, "Chef Software produces only open-source software projects, in the commons. It distributes that software as an enterprise product. For current Chef Software customers, nothing changes. For enterprise users of Chef products who are not customers, they can decide to either pay for Chef's distribution, or they can make or consume an alternative." Going deeper in the new Chef FAQ, Chef stated: "We will begin to attach commercial license terms to our software distribution (binaries) with the next major release." So, if you download and compile the code yourself, you're welcome to use it. But if you download the binaries, you'll have to pay for them. If that sounds familiar, it should. It's a variation of how Red Hat and SUSE, for example, release their enterprise Linux distributions. ... "For existing commercial customers there will be no immediate changes until their next renewal, when they will get licensed onto new SKUs representing the same core products."


Bitcoin, BlackRock And The Rise Of Alternatives

As an alternative asset, the appeal of crypto is that its movements are uncorrelated with the rest of the market, says Mark Yusko, CEO of Morgan Creek Capital Management, which oversees $1.5 billion in assets, including a $40 million blockchain-focused VC fund. “Stocks or bonds derive their value from factors like GDP growth, profitability and interest rates. A cryptocurrency network derives its value from usage growth, adoption, regulation and technology. All of those things are uncorrelated with traditional measures of stocks and bonds.” ... Yusko claims inbound interest from institutional investors is growing. This week, he’s meeting with a California municipal pension fund. He adds that more institutional-investor conferences are including talks on cryptocurrencies. Teddy Fusaro, chief operating officer of Bitwise, a San Francisco digital asset manager and creator of the first crypto index fund, says institutional investors are showing increasing sophistication. “A year ago,” he says, “the conversation might have been, ‘How do we know bitcoin is going to survive?’ Or ‘Who is the CEO of bitcoin?’”


6 Essential Skills Cybersecurity Pros Need to Develop in 2019

On their face, these stats may engender a bit of complacency from cybersecurity professionals. It would only be natural to figure that anybody with a pulse and some security experience has got it made. But here's the rub. Many disruptive forces are at play that are set to drastically change the way security duties are carried out in the coming years. New security automation platforms, new architectures, and complex hybrid cloud implementations require major shifts in bread-and-butter security technical knowledge. Not only is security technology changing rapidly, but so are many of the fundamental roles held by cybersecurity professionals. Tons of emerging technologies and the pervasive use of the Internet of Things are touching every aspect of business operating models, and software delivery is becoming more agile and embedded into lines of business. As a result, security pros are being asked to take on positions that require more consultative leadership and more enablement of democratized security across the organization.


What Is a Scaleup Company and How Is It Different from a Startup?

From a venture capital and entrepreneurial perspective, a scaleup company is considered to be in a later growth phase, after successfully maneuvering through the period of being a startup and having established a sustainable business model with a positive outlook on organizational growth and improving profitability. For additional information on this aspect, you can also have a look at “How to Upscale Like a Boss”. It does not take much to “found” a startup company. Anybody with an interesting idea can register a company, which could then be considered a startup. It then either fails or becomes successful after a lot of hard work. The real question is: when does a company stop being a startup? As soon as the startup company has finished an MVP (minimum viable product) and has a stable monthly income, which is hopefully more than the company’s expenses, the organization ceases to be a startup. And that’s a good thing. Being a startup is neither an end in itself nor something to aspire to. To read more about the exit of this phase, you can also read our article “When Does a Company Stop Being a Startup?”.


Joining Human And Artificial Intelligence

Although the aim of AI is to imitate HI to the point where the two are indistinguishable, AI and HI are fundamentally different. Humans learn via the senses and past experience, and they are emotionally intelligent, which is something that AI has yet to crack. But AI is analytical and logical in a way that humans aren’t, and with this, it is capable of formulating and processing in ways that humans can’t. AI can take huge datasets and whittle them down to snippets of relevant information quickly. It can complete tasks in minutes as opposed to days, and it can identify data discrepancies that humans would never spot. Artificial and human intelligence are a match made in business heaven. The AI-HI model is already in practice across a number of sectors. In healthcare, clinical decisions are aided by artificially intelligent systems that search through historical data at a pace that human professionals never could. But, that said, getting a diagnosis directly from AI would be a very different experience from getting it from a doctor or nurse. Naturally you need both – AI augmenting human intelligence can lead to increased efficiency and accuracy.


How the data mining of failure could teach us the secrets of success

Since learning should reduce the number of attempts required before achieving success, it should lead to a narrower distribution of failure streaks than the exponential form predicted by the chance model. But to the surprise of Yin and co, failure streaks do not follow this pattern either. In fact, they have a much fatter-tailed distribution. “These observations demonstrate that neither chance nor learning alone can explain the empirical patterns underlying failures,” the researchers say. So what other factors are important? To find out, Yin and co modeled the way people learn from experience and how this influences their next attempt. In particular, they modeled whether people take into account all their previous experiences or just some of them. The resulting model considers a complete range of learning—from agents who take all their past experience into account to those who do not take any of their past experience into account, and everything in between. The team say the model predicts a phase change in the behavior that matches the empirical data.
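
A toy simulation makes the chance baseline concrete. Assuming a fixed per-attempt success probability (a deliberate simplification; the researchers' learning model is richer), failure streaks come out geometric, which is exactly the exponential-tailed distribution the chance model predicts:

```python
# Simulate failure streaks under pure chance (hypothetical parameters).
import random

random.seed(0)
p_success = 0.2  # assumed constant probability of success per attempt

def failure_streak() -> int:
    """Count failed attempts before the first success."""
    streak = 0
    while random.random() > p_success:
        streak += 1
    return streak

streaks = [failure_streak() for _ in range(100_000)]

# Under chance alone, P(streak >= k) = (1 - p) ** k: an exponential tail,
# much thinner than the fat tails observed in the empirical data.
for k in (0, 5, 10, 15):
    empirical = sum(s >= k for s in streaks) / len(streaks)
    print(k, round(empirical, 4), round((1 - p_success) ** k, 4))
```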



Quote for the day:


"Coaching isn't an addition to a leader's job, it's an integral part of it." -- George S. Odiorne


Daily Tech Digest - March 09, 2019

Misconceptions about the term RPA: would removing a letter from the acronym help?

Removing the ‘robotic’ term may help to alleviate fears of robots taking over; but according to Jon Clark, proposition development at ActiveOps, it is the word ‘process’ which is the problem. “A process can be very wide-ranging and complex, and the type of robots we are seeing automate ‘tasks’ within a ‘process’, so I think the ‘P’ in RPA is part of the problem, not the ‘R’. This is a subtle distinction but creates a challenge in terms of perception,” he says. The process of a credit card application, for example, is made up of a series of steps such as checking details, credit scores, updating systems, sending confirmation emails and instructing the card printer. “That’s important because people tend to hear ‘process automation’ and think the whole thing will be automated. Unfortunately, it’s not that simple because robots aren’t yet able to do every task in the process,” he states. However, many within the industry believe that the RPA term should remain, and that changing any of the words could cause more problems than it solves.


Online voting: Now Estonia teaches the world a lesson in electronic elections

Voting online, or i-voting, as it is often called in Estonia, takes place during the advance voting period that runs from the 10th until the fourth day before the election. It is not possible to i-vote on election day. The voting process itself is fairly simple. The voter needs a computer with an internet connection and a national ID card or a mobile ID with valid certificates and PIN codes. Once the voting application is downloaded, the software automatically checks if the voter is eligible to cast a ballot and displays the list of candidates according to the region where the voter is registered. After voters make their decision, the application encrypts their vote and it is securely sent to the vote-collecting server. Every vote also receives a timestamp, so if necessary, it is possible to verify later whether the vote was forwarded to the collecting server. As i-voting doesn't take place in a controlled environment like a polling station, the authorities have to ensure that the vote has been freely cast. So, voters can change their choice during the advance voting period, digitally or at a polling station, and then the last vote given is the one that counts.
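
The re-voting rule is easy to sketch. The toy model below is entirely hypothetical (the "encryption" is a placeholder, not Estonia's actual cryptographic scheme); it shows how a later ballot from the same voter simply replaces the earlier one, so only the last vote counts:

```python
# Schematic sketch of "the last vote counts"; not the real i-voting system.
from datetime import datetime, timezone

ballot_box = {}  # voter_id -> (timestamp, encrypted_vote)

def encrypt(vote: str) -> bytes:
    return vote.encode("utf-8")[::-1]  # placeholder, NOT real encryption

def cast_vote(voter_id: str, vote: str) -> None:
    # A later ballot from the same voter replaces the earlier one.
    ballot_box[voter_id] = (datetime.now(timezone.utc), encrypt(vote))

cast_vote("EE-12345", "candidate A")
cast_vote("EE-12345", "candidate B")  # the voter changes their mind

timestamp, final_vote = ballot_box["EE-12345"]
print(timestamp, final_vote)  # only the second ballot remains
```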


Triton is the world’s most murderous malware, and it’s spreading


The malware made it possible to take over these systems remotely. Had the intruders disabled or tampered with them, and then used other software to make equipment at the plant malfunction, the consequences could have been catastrophic. Fortunately, a flaw in the code gave the hackers away before they could do any harm. It triggered a response from a safety system in June 2017, which brought the plant to a halt. Then in August, several more systems were tripped, causing another shutdown. The first outage was mistakenly attributed to a mechanical glitch; after the second, the plant's owners called in investigators. The sleuths found the malware, which has since been dubbed “Triton” (or sometimes “Trisis”) for the Triconex safety controller model that it targeted, which is made by Schneider Electric, a French company. In a worst-case scenario, the rogue code could have led to the release of toxic hydrogen sulfide gas or caused explosions, putting lives at risk both at the facility and in the surrounding area. Gutmanis recalls that dealing with the malware at the petrochemical plant, which had been restarted after the second incident, was a nerve-racking experience.


Blockchain marches steadily into global financial transaction networks

SWIFT is among a groundswell of financial services firms testing blockchain as a more efficient and transparent way of conducting cross-border financial transactions, unhampered by much of the regulatory oversight to which current networks must adhere. SWIFT may also be feeling pressure as more and more firms in financial services pilot, or outright adopt, DLT technology. "There is a lot of competition now," said Avivah Litan, Gartner vice president of research. "If you think about SWIFT, it was just a big banking network that moved money quickly and authenticated users, but it costs a lot to do that. And now there are competing initiatives using blockchain." Litan pointed to J.P. Morgan Chase, CLS Group and Ripple, a permissioned blockchain ledger that moves money using a proprietary cryptocurrency, as prime examples of those developing blockchain for cross-border financial transfers. "Ripple is a competitor in the sense that they are trying to set up a bank-to-bank network," Litan said.


GDPR: Still Plenty of Lessons to Learn

During the RSA panel, security expert Ariel Silverstone reported that as of the end of January, there were 41,000 breaches reported under GDPR that fell within the 72-hour notification window. Additionally, there have been about 250 investigations by the various data protection authorities. Silverstone noted that while GDPR involves all 28 countries of the EU, variations in how each country is implementing the law mean companies could face different penalties. For instance, he said that Germany's interpretation of the law makes a violation nearly a criminal case, while other nations have been reducing fines. Silverstone also pointed out that the California Consumer Privacy Act, which adheres to some of the same principles as GDPR, is offering some of the same consumer protections that Europeans now enjoy. Mark Weatherford, the global information security strategist at Booking Holdings, told the audience that while complying with the GDPR rules is difficult, it's not impossible. Before his current job, he worked at a startup that needed to come into compliance.



A Practical Intro to Kotlin Multiplatform

Kotlin has enjoyed an explosion in popularity ever since Google announced first-class support for the language on Android, and Spring Boot 2 offered Kotlin support. You’d be forgiven for thinking that Kotlin only runs on the JVM, but that’s no longer true. Kotlin Multiplatform is an experimental language feature that allows you to run Kotlin in JavaScript, iOS, and native desktop applications, to name but a few. And best of all, it’s possible to share code between all these targets, reducing the amount of time required for development. This blog post will explore the current state of Kotlin Multiplatform by building a simple app that runs on Android, iOS, Browser JS, Java Desktop, and Spring Boot. Maybe in a few years, Kotlin will be a popular choice on all these platforms as well. ... To share Kotlin code between platforms, we’ll create a common module that has a dependency on the Kotlin standard library. For each platform we support, we’ll need to create a separate module that depends on the common module and the appropriate Kotlin language dependency.


How Daimler is using graph database technology in HR


For us, we could see advantages to using graph technology in HR projects because HR data is not isolated, so you don't normally have one person working without a connection to another person. If you look at a company, every time you look at the people working in the company you will see that they all have a connection to other people working in the company; you won't see anybody who is completely isolated. That is one of the reasons why we thought that HR data might be a very good fit with a graph data model. We started by trying to understand what graph and HR data have in common. ... The second reason, and it's a concrete reason why we created this structured application, is that we created our Leadership 2020 programme at Daimler. We are transforming as a company from the classical, hierarchical structure to a mixture of classic hierarchies and what is called a 'swarm': people working on the same project but coming from different departments and different hierarchies.


Blockchain boosters warn that regulatory uncertainty is harming innovation

Businesses and consumers are reluctant to develop and use blockchain applications in the face of uncertainty over whether they might violate outdated financial laws, the Chamber of Digital Commerce argues in its “National Action Plan” (PDF). Among other things, it calls for “clearly articulated and binding statements from regulators regarding the application of law to blockchain-based applications and tokens.” On Wednesday at the DC Blockchain Summit, SEC commissioner Hester Peirce warned industry advocates to be careful what they wish for. Peirce called the action plan “helpful” and agreed that clear regulatory guidelines are needed. But she cautioned against expecting the government to try to foster innovation, which she said could do more harm than good. Peirce urged patience and cooperation. Regulators are slow, she said, and this technology is complicated: “There’s a learning curve. People at the SEC are trying to learn about this space, and trying to understand where the pressure points are.”


2 reasons a federated database isn’t such a slam-dunk

First, performance. You can certainly mix data from an object-based database, a relational database, and even unstructured data, using a centralized and virtualized metadata-driven view. But your ability to run real-time queries on that data, in a reasonable amount of time, is another story. The dirty little secret about federated database systems (cloud or not) is that unless you’re willing to spend the time it takes to optimize the use of the virtual database, performance issues are likely to pop up that make the use of a federated database, well, useless. By the way, putting the federated database in the cloud won’t help you, even if you add more virtual storage and compute to try to brute-force the performance. The reason is that so much has to happen in the background just to get the data in place from many different database sources. These issues are typically fixed by getting the federated database design right, tuning the database, and placing limits on how many physical databases can be involved in a single pattern of access. I’ve found that the limit is typically four or five.
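
The data-movement problem is easy to show in miniature. In the sketch below, two SQLite files stand in for separate physical databases (an illustrative stand-in, not a real federation product): SQLite's ATTACH allows a cross-database join, but every joined row still has to be pulled together at query time, which is exactly the background work that erodes performance as more sources are added.

```python
# Two separate database files standing in for two "physical" databases.
import sqlite3

con1 = sqlite3.connect("orders.db")
con1.execute("CREATE TABLE IF NOT EXISTS orders (id, customer_id, total)")
con1.execute("INSERT INTO orders VALUES (1, 42, 99.0)")
con1.commit()
con1.close()

con2 = sqlite3.connect("crm.db")
con2.execute("CREATE TABLE IF NOT EXISTS customers (id, name)")
con2.execute("INSERT INTO customers VALUES (42, 'Acme')")
con2.commit()
con2.close()

con = sqlite3.connect("orders.db")
con.execute("ATTACH DATABASE 'crm.db' AS crm")  # the second source

# The "federated" join: trivial here, but each attached source adds
# cross-database data movement to every query that touches it.
rows = con.execute(
    "SELECT c.name, o.total "
    "FROM orders o JOIN crm.customers c ON o.customer_id = c.id"
).fetchall()
print(rows)  # [('Acme', 99.0)]
```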


How to use process data mining to improve DevOps

Process mining is the data-driven improvement of business processes, and data scientists often use it to suggest ways to enhance performance. Process data mining works for companies and DevOps teams with processes in place, as well as those that still need to create processes. In the first case, people can compare the best practices for their process with what regularly happens within the team. But, individuals at the enterprise level can also use process data mining to establish their processes. Information sources such as event logs give details about how and when people use tools. Process data mining shows people how far away they are from the target of an ideal process, which can also mean it helps people solidify the processes a DevOps team follows. Then, it’s possible to know how to make the most meaningful process-related improvements and discover the things going wrong. ... Process data mining allows for real-time data collection. The companies that successfully use DevOps rely on release cycle metrics that tell them about progress and quality levels.
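
As a rough illustration, event-log analysis of this kind can start very simply. The sketch below uses pandas on an invented release log (the columns and values are hypothetical; the article names no specific tool) to measure how long each step of a process actually takes:

```python
# Mine step durations from a hypothetical DevOps event log.
import pandas as pd

log = pd.DataFrame({
    "case":      ["rel-1", "rel-1", "rel-1", "rel-2", "rel-2", "rel-2"],
    "activity":  ["commit", "build", "deploy", "commit", "build", "deploy"],
    "timestamp": pd.to_datetime([
        "2019-03-01 09:00", "2019-03-01 09:20", "2019-03-01 11:00",
        "2019-03-02 10:00", "2019-03-02 10:05", "2019-03-02 10:30"]),
})

log = log.sort_values(["case", "timestamp"])
# Time elapsed since the previous event in the same case (release).
log["step_minutes"] = (log.groupby("case")["timestamp"].diff()
                          .dt.total_seconds() / 60)

# Average duration per activity: where does the process actually stall?
print(log.groupby("activity")["step_minutes"].mean())
```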



Quote for the day:


"Strong convictions precede great actions." -- James Freeman Clarke


Daily Tech Digest - February 21, 2019

Data Mining — What, Why, How?


Data mining sits at the intersection of statistics (the analysis of numerical data), artificial intelligence / machine learning (software and systems that perceive and learn like humans based on algorithms), and databases. Translating these into technical skills leads to requiring competency in Python, R, and SQL, among others. In my opinion, a successful data miner should also have business context and knowledge, plus other so-called soft skills (teamwork, business acumen, communication, etc.) in addition to the technical skills mentioned above. Why? Remember that data mining is a tool with the sole purpose of achieving a business objective by accelerating predictive capabilities. A purely technical skill set will not accomplish that objective without some business context. The following article from KDnuggets proves my point, with data mining job advertisements very frequently mentioning team skills, business acumen, and analytics, among other terms. The same article also has SQL, Python and R at the top of the list of technical skills.



Two Sides of a Coin: Blockchain, Ethics and Human Rights

What does it mean to say that a technology is evil? Given Krugman’s arguments, it’s easy to see what he meant: bitcoin is used exclusively for acts which are morally bad; hence, bitcoin is itself evil. As an ethical argument, this is willfully ignorant; you don’t need a Nobel Prize to find examples of blockchain being used for social good. But, interestingly, the underlying thought pattern – that bitcoin is evil because it brings about bad consequences – is an example of a legitimate moral theory known as consequentialism. If Krugman was arguing along consequentialist lines, his error lies in disregarding bitcoin’s positive aspects and in the failure to make the assumption of this ethical framework explicit. Intrigued, we started searching the academic databases for ethical frameworks applied to blockchain, but found nothing. Yet we kept finding controversies surrounding certain blockchain use cases which relied implicitly on the ethical frameworks that philosophers have developed over thousands of years.


Zuckerberg Eyeing Blockchain For Facebook Login And Data Sharing


In the interview, Zuckerberg said that authentication was a use of blockchain that he is potentially interested in. However, he caveated it by saying: “I haven’t found a way for this to work.” He added: “You basically take your information, you store it on some decentralized system, and you have the choice of whether to log in in different places, and you’re not going through an intermediary.” “There’s a lot of things that I think would be quite attractive about that. For developers, one of the things that is really troubling about working with our system, or Google’s system for that matter, or having to deliver services through Apple’s App Store is that you don’t want to have an intermediary between serving the people who are using your service and you.” “Where someone can just say 'hey, we as a developer have to follow your policy and if we don’t, then you can cut off access to the people we are serving'. That’s kind of a difficult and troubling position to be in.”


Power over Wi-Fi: The end of IoT sensor batteries?

Power over Wi-Fi: The end of IoT sensor batteries?
The researchers believe that harvesting 150 microwatts of power (the power level of a typical Wi-Fi signal) with one of the rectennas could produce around 40 microwatts of electricity—enough to power a chip. Scaling the system to a vehicle, data center hall, or similar-sized setup, which they say is possible in part because their MoS2 material is thin and flexible, would conceivably generate commensurate power. The researchers also say the non-rigid, battery-free system is better than others’ attempts at rectennas because they capture “daily” signals such as “Wi-Fi, Bluetooth, cellular LTE, and many others," says Xu Zhang, of collaborator Carnegie Mellon University, in the article. The other Radio Frequency-to-power converters, which are thick and non-flexible, aren’t wideband enough, the groups say. Of course, radio waves already power some chips. RFID tags are an example. But those solutions are limited in their power and, therefore, range and bandwidth, which is why the search is on for something better.
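
For context, the quoted figures imply a conversion efficiency of roughly a quarter. A quick back-of-the-envelope check, using only the article's numbers:

```python
# Values quoted in the article, not measured here.
harvested_uW = 150  # power level of a typical Wi-Fi signal
delivered_uW = 40   # electricity the rectenna is said to produce

print(f"implied conversion efficiency: {delivered_uW / harvested_uW:.0%}")
# implied conversion efficiency: 27%
```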


UK committed to working with EU cyber security partners


Within the cyber security sphere, Martin said it was “objectively true” that nearly all the functions of the NCSC fall outside the scope of EU competence. “It follows that our enhanced cooperation with European partners, and the EU as a whole, in cyber security over recent years is not automatically affected by the UK’s changing relationship with the EU,” he said. “Pretty much everything we do now to help European partners, and what you do to help us, on cyber security can, should, and I am confident will, continue beyond 29 March.” In the past, said Martin, the UK has shared classified and other threat data with EU member states and institutions and played a role in the development of European thinking in areas such as standards and incident response.


What organizations can do to mitigate threats to data management

Adding granular encryption with BYOK (Bring Your Own Key) is an effective weapon in breach prevention. If even an administrator or engineer who manages data in an organization cannot read that data, a hacker will be stopped cold – he may be effective in stealing the data, but not in using it for his own gain. Threats to cybersecurity are considerable and are becoming worse with the proliferation of big data and its use in AI. Good practices raise awareness of cybersecurity risks and help organizations create robust, reliable and fast disaster recovery plans (DRPs) in advance. And organizations can gain by using AI to monitor systems, detect vulnerabilities, and bridge those vulnerabilities, turning AI into a strategic asset. Many organizations' cloud data environments lack the technology for the effective automation of data privacy compliance, and they find it challenging to meet the requirements of the most stringent regulation for data protection, GDPR.
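
As a minimal sketch of the BYOK idea, the snippet below uses the `cryptography` package's Fernet recipe: the customer generates and holds the key, so an administrator or a thief who sees only the stored ciphertext learns nothing useful. This is illustrative only; real BYOK deployments hand keys to a key-management service rather than juggling raw key bytes.

```python
# Illustrative BYOK: the customer keeps the key; the platform keeps only
# ciphertext it cannot read.
from cryptography.fernet import Fernet

customer_key = Fernet.generate_key()  # generated and kept by the customer
cipher = Fernet(customer_key)

stored = cipher.encrypt(b"account 4411, balance 1204.50")
print(stored)  # opaque bytes: useless to anyone without the key

# Only the key holder can recover the plaintext.
print(cipher.decrypt(stored))
```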


How to recruit top tech talent: Do's and don'ts

Dice Editor Nate Swanner said they were surprised that remote work rated so highly on the list and added that "tech pros can see through the pizazz: A flashy job title, dedicated parking spot and a fresh MacBook Pro won't cumulatively overcome great health benefits or remote work." Research firm Gartner has found that things may not be so simple, though: Benefits like healthcare may be highly desired, but they're also basic expectations for job seekers. "Instead, candidates want to know which benefits set the organization apart," Gartner said, noting that educational benefits, well-being initiatives, and innovative perks are far more likely to attract top talent. Giving credence to Gartner's argument is its research on the types of benefits mentioned in a job posting versus how much time that posting remains up. Mentions of medical care, employee well-being, and work-life balance had zero impact on how long a posting went unfilled, while dental/vision coverage, financial benefits, family programs, and disability/life insurance all significantly reduced the amount of time it took to fill a job.


Move over HR: Why tech is taking charge of company culture

The key lesson, says Lewis, is that the broader organisation sees the plus-points that a new way of working brings and then demands similar benefits. "In the same way that it happened in the IT industry in terms of Scrum and Agile, I think people have started to realise that smaller, cross-functional teams can add value in other areas of the business," he says. Lewis, therefore, posits a change in perception, one that holds that non-IT executives are recognising that digital chiefs have broad expertise that can help change the business for the better. Board members who call on their CIOs for advice on people and processes find new ways to overcome the cultural challenges associated with transformation. That view resonates with Brad Dowden, interim CIO and director at Intercor Transformations. He says the experience digital leaders have of running transformation programmes definitely leaves them well-placed to advise the rest of the organisation — including HR chiefs — about the best ways to pursue successful culture change initiatives.


Breaking the chains: How FUD is holding the cyber sector hostage


The biggest cyber danger for companies is not the CFO getting hacked by Chinese wizard-class hackers using an offensive AI-driven quantum virus via blockchain – it’s someone from the accounts team, clicking on that phishing email link because he did his mandatory corporate security training seven months ago and has forgotten to double-check the URL. It could also be someone from the development team facing a tight deadline and nabbing some code from GitHub, without having the time to really read through it and find that remote shell buried in line 2,361. Suppliers can hype and sensationalise the capabilities of their products, and the scale of the threat, but ultimately all they are doing is damaging customers’ trust – the trust that is vital for a company to know that its cyber security strategy is based on a proportional and relevant response to the threats it faces as an organisation.


Using Contract Testing for Applications With Microservices

What makes contract testing awesome is that it does this in a way which really fits well into a microservice workflow, said Groeneweg. The most important thing is that it decouples the test between the service that's using the API (the consumer) and the API itself (the provider). This allows you to bring them both to production without needing the other. It's especially useful when they are maintained by different teams, because it enables them to be autonomous in testing and releasing.
Groeneweg stated that contract testing is a way of reducing the risk of integration bugs. Also, contract testing is a lot faster than other ways of integration testing. That’s important as it allows you to decrease lead time and kill waste which is caused by slow feedback from tests, he said. As the consumer defines the contract, contract testing also leads to better interfaces and APIs that are actually used.
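
A dependency-free sketch shows the consumer-driven idea (real tooling such as Pact automates this, including recording and replaying the interactions): the consumer writes down the response shape it relies on, and the provider is verified against that record in its own test run, so neither side needs the other deployed. All names here are hypothetical.

```python
# The consumer declares what it depends on; the provider verifies it.
CONSUMER_CONTRACT = {
    "request": {"method": "GET", "path": "/users/42"},
    "response_keys": {"id", "name", "email"},  # fields the consumer reads
}

def provider_handler(method: str, path: str) -> dict:
    """Stand-in for the provider's real implementation."""
    return {"id": 42, "name": "Ada", "email": "ada@example.com", "plan": "pro"}

def verify_contract(contract: dict) -> None:
    req = contract["request"]
    response = provider_handler(req["method"], req["path"])
    missing = contract["response_keys"] - response.keys()
    assert not missing, f"provider broke the contract, missing: {missing}"

verify_contract(CONSUMER_CONTRACT)  # passes: extra response fields are fine
print("contract verified")
```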



Quote for the day:


"The key to successful leadership today is influence, not authority." -- Ken Blanchard


Daily Tech Digest - December 29, 2018

Facebook's and social media's fight against fake news may get tougher

Filippo Menczer, a professor of informatics and computer science at Indiana University who's studied how automated Twitter accounts spread misinformation, said that because of the lack of available data, it's hard to tell if fake news is being spread through ephemeral content.  "Even the platforms themselves don't want to look inside that data because they're making promises to their customers that it's private," Menczer said. "By the time someone realizes that there's some terrible misinformation that's causing a genocide, it may be too late." Snapchat, which started the whole ephemeral content craze, appears to have kept itself mostly free of fake news and election meddling. The company separates news in a public section called Discover. Snapchat's editors vet and curate what shows up in that section, making it difficult for misinformation to go viral on the platform.


Google's E-Money License And The 8 Reasons Why Bankers Are Relaxed

Thinking of the bank as a provider of products makes it seem like a big deal that the e-money providers can't offer loans or interest on balances, but in effect, when you think of the endless possibilities of contextual MoneyMoments, it is only payments and transfers that enable them, and those are firmly possible without a full banking license. ... "But that's not their core business" is one of the most thrown-around phrases of soothing consolation when it comes to discussing any big technology giant entering the financial services arena. It just seems firmly outside the realm of possibility that they would be interested in anything other than search, Prime delivery or spying on our private conversations, but it's a healthy exercise to recall at times that the "core business" purpose of any of these companies is, as it is for the banks themselves, turning a profit.


Microsoft’s ML.NET: A blend of machine learning and .NET


The ultimate tech giant, Microsoft, recently announced a top-tier open source and cross-platform framework. ML.NET is built to support model-based machine learning for .NET developers across the globe. It can also be used for academic and research purposes. And that isn’t even the best part. You can also integrate Infer.NET as a part of ML.NET, serving as its foundation for statistical modeling and online learning. This famous machine learning engine – used in Office, Xbox and Azure – is available for free on GitHub under the permissive MIT license, which allows commercial use. Infer.NET enables a model-based approach to machine learning, which lets you incorporate domain knowledge into the model. The framework is designed to build a bespoke machine learning algorithm directly from that model. That means that instead of having to map your problem onto a pre-existing learning algorithm, Infer.NET actually constructs a learning algorithm based on the model you have provided.


An Intro to Data Mining, and How it Uncovers Patterns and Trends

Data mining is essential for finding relationships within large amounts and varieties of big data. This is why everything from business intelligence software to big data analytics programs utilize some form of data mining. Because big data is a seemingly random pool of facts and details, a variety of data mining techniques are required to reveal different insights. Our example from earlier explains how data mining can segment customers, but data mining can also determine customer loyalty, identify risks, build predictive models, and much more. One data mining technique is called clustering analysis, which essentially groups large amounts of data together based on their similarities. This mockup below shows what a clustering analysis may look like. Data that is sporadically laid out on a chart can actually be grouped in strategic ways through clustering analysis.
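
To make that concrete, here is a hedged sketch of a clustering analysis using scikit-learn's KMeans on invented customer data (annual spend and visits per month, both made up for illustration):

```python
# Group customers into segments by similarity (hypothetical data).
from sklearn.cluster import KMeans

customers = [[120, 2], [150, 3], [130, 2],     # low spend, infrequent
             [900, 12], [950, 14], [880, 11]]  # high spend, frequent

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(customers)
print(km.labels_)           # e.g. [0 0 0 1 1 1]: two customer segments
print(km.cluster_centers_)  # the "average" customer in each segment
```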


Microsoft Announces a Public Preview of Python Support for Azure Functions

According to Asavari Tayal, program manager of the Azure Functions team at Microsoft, the preview release will support bindings to HTTP requests, timer events, Azure Storage, Cosmos DB, Service Bus, Event Hubs, and Event Grid. Once configured, developers can quickly retrieve data from these bindings or write back using the method attributes of their entry point function. Developers familiar with Python do not have to learn any new tooling; they can debug and test functions locally using a Mac, Linux, or Windows machine. With the Azure Functions Core Tools (CLI), developers can get started quickly using trigger templates and publish directly to Azure, while the Azure platform will handle the build and configuration. Furthermore, developers can also use the Azure Functions extension for Visual Studio Code, including a Python extension, to benefit from auto-complete, IntelliSense, linting, and debugging for Python development, on any platform.
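
A minimal HTTP-triggered function in the preview Python programming model might look like the sketch below; this is my reading of the announcement rather than official sample code, and the trigger binding itself is declared separately in a function.json file:

```python
# __init__.py of an HTTP-triggered function (binding set up in function.json).
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Data arrives through the binding; there is no server to manage.
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```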


This type of vulnerability --known as a side-channel attack-- isn't new, but it's been primarily utilized for recovering cleartext information from encrypted communications. However, this new side-channel attack variation focuses on the CPU shared memory where graphics libraries handle rendering the operating system user interface (UI). In a research paper shared with ZDNet, which will be presented at a tech conference next year, a team of academics has put together a proof-of-concept side-channel attack aimed at graphics libraries. They say that through a malicious process running on the OS they can observe these leaks and guess with high accuracy what text a user might be typing. Sure, some readers might point out that keyloggers (a type of malware) can do the same thing, but the researchers' code has the advantage that it doesn't require admin/root or other special privileges to work.


Top 10 overlooked cybersecurity risks in 2018

Most cyber attacks injure either the confidentiality or availability of data. That is to say, they are either spying on or disabling some system. But there is of course another option: attacks on integrity. If you found out your bank records were, even in some small way, remotely altered say… 18 months ago? How would that change your perception of the safety of keeping your money in the bank? What if 1 percent of the bottles of some over-the-counter medication had the formula altered to change efficacy? How would that affect your trust in the medical system? Subtle, these operations are hard to detect, harder to prove, and leave a lasting stigma of distrust and conspiracy even if caught. Already we see some criminal groups engaging in this sort of activity to modify gift cards and other forms of petty cyber larceny, which means that more sophisticated operations and nation-state challenges won't be far behind.


You’ve Heard of IoT and AI, but What is Digital Twin Technology?


A digital twin is a highly advanced simulation that’s used in computer-aided engineering (CAE). It’s a digital duplicate that represents a physical object or process, but it is not intended to replace a physical object; it is merely to inform its optimization. Other terms used to refer to digital twin technology include virtual prototyping, hybrid twin technology, and digital asset management, but digital twin is quickly winning out as the most popular name. Both NASA and the United States Air Force are planning on using digital twin technology to create future generations of lightweight vehicles that are sturdy and able to haul more than their current counterparts. Goldman Sachs recently examined digital twin technology in their series “The Outsiders,” which seeks to identify “emerging ecosystems on the edge of today’s investable universe.” 


10 Social Media Predictions for 2019

Storytelling emerged in 2018 as a core technique for engaging consumers. But up until now a lot of storytelling was stored on blogs and websites and then shared to social media. I see 2019 being the year when storytelling combined with augmented reality is hosted on the main social media platforms. I also see 2019 as the year when brands align their storytelling with enacting positive social change. Studies show that 92% of consumers have a more positive image of a company when it supports a social or environmental issue. And almost two-thirds of millennials and Gen Z express a preference for brands that stand for something. Nike nailed social media storytelling even before the emergence of sophisticated AR technologies. In its Equality campaign it focuses on social change and inspires people to act. The message: by wearing Nikes or even interacting with them on social media, you are supporting the movement.


China is racing ahead in 5G. Here’s what that means.

China sees 5G as its first chance to lead wireless technology development on a global scale. European countries adopted 2G before other regions, in the 1990s; Japan pioneered 3G in the early 2000s; and the US dominated the launch of 4G, in 2011. But this time China is leading in telecommunications rather than playing catch-up. In a TV interview, Jianzhou Wang, the former chairman of China Mobile, China’s largest mobile operator, described the development of China’s mobile communication industry from 1G to 5G as “a process of from nothing to something, from small to big, and from weak to strong.” Money is another good reason. The Chinese government views 5G as crucial to the country’s tech sector and economy. After years of making copycat products, Chinese tech companies want to become the next Apple or Microsoft—innovative global giants worth nearly a trillion dollars.



Quote for the day:


"What great leaders have in common is that each truly knows his or her strengths - and can call on the right strength at the right time." -- Tom Rath


Daily Tech Digest - March 10, 2018

Why You Should View Linux as a Core IT Skill

Twenty-five years ago, some fellow students and I were sitting in a computer lab at the University of Waterloo trying to compile a new open-source UNIX operating system called Linux on a PC. Back then, installing a Linux system was about as difficult as nailing Jell-O to a tree, but we managed to get a system installed after only four days of work. Linux has come a long way since then. Today, Linux is the most diverse and aggressively developed operating system in the world, primarily due to its open-source nature. And if you work in an IT field, you’ve probably been exposed to more Linux in the last few years than before. In fact, the Gartner research company identified Linux as the fastest-growing operating system segment in the computing industry in 2017. So, what does this mean for you as an IT professional? It means that you’ll likely be working with far more Linux systems and technologies in coming years, regardless of whether you currently work with them or not.



Cisco attacks SD-WAN with software from Viptela, Meraki acquisitions

An SD-WAN is typically composed of diverse networks and technologies, many of which lie outside the control of IT. Add the increased use of multi-cloud services and other advances, and the traditional complexity of the WAN has only grown, Cisco stated. Cisco cited a recent IDC study that found almost three out of 10 organizations consider network outages a top WAN concern, with the same number saying they need better visibility and analytics to manage application and WAN performance. IDC also estimates that worldwide SD-WAN infrastructure and services revenues will hit $8.05 billion by 2021. To address some of these challenges, Cisco rolled out SD-WAN vAnalytics, a cloud-based SaaS application that collects data from the SD-WAN and lets customers spot and fix communications problems more quickly, gauge application performance, oversee bandwidth planning, and predict how policy changes might impact the network.
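Cisco has not published vAnalytics internals, but the general pattern it describes, collecting telemetry, baselining it, and flagging deviations, can be sketched in a few lines of Python. Everything below (the function name, window size, threshold, and sample data) is a made-up illustration of that pattern, not Cisco's implementation.

```python
from statistics import mean, stdev

def flag_anomalies(latency_ms: list[float], window: int = 20, z: float = 3.0) -> list[int]:
    """Return indexes where latency deviates sharply from its rolling baseline."""
    flagged = []
    for i in range(window, len(latency_ms)):
        baseline = latency_ms[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        # Flag samples more than z standard deviations from the recent mean.
        if sigma > 0 and abs(latency_ms[i] - mu) > z * sigma:
            flagged.append(i)
    return flagged

# Simulated per-minute latency samples for one WAN link (hypothetical data).
samples = [20.0 + (i % 3) for i in range(40)] + [95.0]  # spike at the end
print(flag_anomalies(samples))  # -> [40]
```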


Big data analytics: The cloud-fueled shift now under way

Cloud-based convergence of big-data silos is speeding enterprise time-to-value. Users are beginning to step up the pace of consolidating their siloed big data assets into public clouds, and the growing dominance of public cloud providers is collapsing the cross-business silos that have heretofore afflicted enterprises' private big data architectures. Just as important, big data solutions, both cloud-based and on-premises, are converging into integrated offerings designed to reduce complexity and accelerate time to value. More solution providers are offering standardized APIs to simplify access, accelerate development, and enable more comprehensive administration throughout their big data solution stacks. Meanwhile, innovative startups are bringing increasingly sophisticated AI-infused applications to market and starting to disrupt the big data competitive landscape.


Why Startup CEOs Still Have to Make Sales Calls

For all the obvious reasons. (1) People don't really care how much you know until they know how much you care. Showing up shows them that you actually do care. (2) Startups are notoriously scattered and in a hurry. Focus and attention to detail are scarce commodities and the customers want to know that you personally are connected, paying attention, and directly engaged with their business, their concerns and their problems. And finally, (3) they want to hear it from the horse's mouth. Not second hand. They want commitments and assurances from you (since they know that the sales guys will tell them anything and promise them the world) that you will stand up for and stand behind your product or service and make good on whatever they've been promised. The buck always stops with you. None of this is very tough. You just have to say what you're going to do and do what you said you would and everything will be hunky-dory.


What is a virtual CISO? When and how to hire one

Why would you need a vCISO when you could simply hire a real one on a permanent contract? The answer varies from one organization to the next. For starters, well-regarded, full-time CISOs can be hard to come by, often stay in the job for two years or less, and, critically for smaller businesses, can command six-figure salaries. In contrast, vCISOs are estimated to cost between 30 and 40 percent of a full-time CISO and are available on demand. The benefits go well beyond cost: virtual CISOs usually require no training, can hit the ground running, and don't feel obliged to play nice with office politics. In this model it's purely about results, and vCISOs worth their salt will provide reasonable KPIs and reporting. While different vCISOs offer different skillsets, many can cover myriad tasks, from the tactical to the strategic. They could help pull together security policies, guidelines, and standards, covering anything from coming to grips with HIPAA or PCI compliance to staying on top of vendor risk assessment.


Josh Bersin on the Importance of Talent Management in the Modern Workplace

Bersin reminds us that, even though the hottest jobs of the moment may be technical, there are plenty of non-technical jobs growing in demand, too. “Soft skills are just as in demand as hard skills. There will be an increased need for social, integrative, and hybrid skills. Empathy, communication, speaking, judgement… these renaissance skills are the jobs of the future,” said Josh. “Even the job of data scientist now requires persuasion, interpretation, not just looking at data.” Although many worry that technology will render some workers obsolete, this appears to be far from the case. Many of these workers can transition easily into new roles that leverage their skills, and those new roles are good for the workers, too: in fact, 96% of all transitions have “good-fit” options, and 65% of transitions will increase wages.


Machine learning: What developers and business analysts need to know

In the case of supervised learning, you train a model to make predictions by passing it examples with known inputs and outputs. Once the model has seen enough examples, it can predict a probable output from similar inputs. ... The predictions can be no better than the quality of the data used for training. A data scientist will often withhold some of the data from training and use it to test the accuracy of the predictions. With unsupervised learning, you want an algorithm to find patterns in the data when you have no labeled examples to give it. In the case of clustering, the algorithm categorizes the data into groups; for example, if you are running a marketing campaign, a clustering algorithm could find groups of customers that need different marketing messages and discover specialized segments you may not have known about. In the case of association, you want the algorithm to find rules that describe the data.
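The supervised and unsupervised workflows described above map directly onto a few lines of scikit-learn. This is a minimal sketch with synthetic data (assuming scikit-learn is installed); a real project would add feature engineering and more careful evaluation.

```python
from sklearn.datasets import make_classification, make_blobs
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Supervised: train on known input/output pairs, withholding some data to test accuracy.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

# Unsupervised: no labels; let the algorithm group similar "customers" on its own.
customers, _ = make_blobs(n_samples=300, centers=4, random_state=0)
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(customers)
print(f"discovered segments: {sorted(set(segments))}")
```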


Software leaders pick these three technologies as top investments

Companies that have been slower to invest in technology solutions have either prioritized changing their business model or felt the negative, if not fatal, repercussions of not doing so. Regardless of industry, staying ahead of the technological curve in today's software-centric world is a must for business success. Yet it can be difficult for even the most experienced IT leaders to wade through the long list of technology buzzwords and solutions that promise to be the “next best thing.” So how can businesses cut through the noise and determine what will actually bring business value? They can start by looking at the technologies the experts are actually pursuing. To find out what those tech trends are, O'Reilly analyzed search data from more than two million users of its online learning platform, most of whom are trained software and technology professionals. By considering what these professionals are focusing on, others can begin to determine where their companies should be investing time and money.


RoboTiCan is building low-cost industrial robots for the masses

RoboTiCan products, with CEO Halgai Balshai
Balshai said, "We have motion, navigation, manipulation of an arm, computer vision, everything combined in one platform. To master all this knowledge and find the algorithms for making it work is really complex. With ROS, we have a lot of opportunity to combine algorithms from one point to another. For example, if something was developed at Carnegie Mellon University in the United States and we want to use this particular system, image work, or recognition of an object, we can extract this information and these ideas and implement them in our robot really easily. "By that, we don't need to have a really huge company to be able to do a lot of different tasks with one robot. This is basically the idea and the advantage of using ROS and open-source architecture in robotics. By doing something that is generic for everybody, you can use it all over the globe. Of course, there is stuff that we contribute back to others. ..."
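Balshai's point about mixing and matching components is exactly what ROS's publish/subscribe model enables. Below is a hypothetical sketch using the classic rospy API (assuming a ROS installation): a vision node developed elsewhere publishes detections on a topic, and arm-control code can subscribe without knowing anything about the publisher's internals. The node and topic names are invented for illustration, not RoboTiCan's actual code.

```python
import rospy
from std_msgs.msg import String

def on_detection(msg: String) -> None:
    # React to detections published by a vision component developed elsewhere.
    rospy.loginfo(f"arm controller received detection: {msg.data}")

if __name__ == "__main__":
    # Any ROS node can subscribe to this topic, regardless of who wrote the publisher.
    rospy.init_node("arm_controller")
    rospy.Subscriber("/vision/detections", String, on_detection)
    rospy.spin()
```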


Data Mining: What and Why

Data mining sits at the intersection of statistics (the analysis of numerical data), artificial intelligence / machine learning (software and systems that perceive and learn like humans, based on algorithms), and databases. Translating these into technical skills leads to requiring competency in Python, R, and SQL, among others. In my opinion, a successful data miner should also have business context and knowledge, plus the so-called soft skills (teamwork, business acumen, communication, etc.), in addition to the technical skills mentioned above. Why? Remember that data mining is a tool whose sole purpose is to achieve a business objective (increase revenues or reduce costs) by accelerating predictive capabilities; pure technical skill will not accomplish that objective without business context. An article from KDnuggets supports this point, finding that data mining job advertisements very frequently mention terms such as team skills, business acumen, and analytics.
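As a concrete illustration of the Python-plus-SQL competency described above, here is a small, hypothetical sketch that pulls transactions from a database with SQL and summarizes them with pandas, with the business question (which segments drive revenue?) driving the code. The table, columns, and figures are all invented for the example.

```python
import sqlite3
import pandas as pd

# Hypothetical in-memory database standing in for a production data warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (segment TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('enterprise', 120.0), ('enterprise', 95.0),
        ('smb', 40.0), ('smb', 35.0), ('consumer', 12.0);
""")

# SQL retrieves the data; pandas answers the business question.
df = pd.read_sql_query("SELECT segment, revenue FROM sales", conn)
print(df.groupby("segment")["revenue"]
        .agg(["count", "sum"])
        .sort_values("sum", ascending=False))
```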



Quote for the day:


"Vulnerability is the birthplace of innovation, creativity and change." -- Brené Brown