Daily Tech Digest - March 01, 2022

Using APIs with Low-Code Tools: 9 Best Practices

One of the best things about low- and no-code tools is their potential to get non-technical users involved in creating applications. But unless your non-technical colleagues understand what they can get out of using these tools — and unless they can use the tools without coding skills — it doesn’t matter which ones your organization adopts. “It’s all about users at the end of the day,” said Leonid Belkind, co-founder and chief technology officer at Torq, which provides a no-code security automation platform. “How many tools have you seen in your lifetime become shelfware? The organization bought it and nobody uses it. That’s the biggest risk. How do you avoid it? Find out the motivation and goals people have and match the tool to it,” he added. If you put user needs first, “the chances of it becoming shelfware are significantly lower.” It’s important not only to find out users’ needs but also to ask them to explain how they currently complete the tasks you’re trying to automate, Belkind said. “Why is it important to identify who is going to work with the tool?” he asked.


When NOT To Use Apache Kafka

If your application requires sub-millisecond latency, Kafka is not the right technology. For instance, high-frequency trading is usually implemented with purpose-built proprietary commercial solutions. Always keep in mind: the lowest latency would be to not use a messaging system at all and just use shared memory. In a race to the lowest latency, Kafka will lose every time. However, for the audit log, transaction log, or persistence-engine parts of an exchange, zero data loss matters more than latency, and there Kafka wins. Most real-time use cases "only" require data processing in the millisecond to the second range. In that case, Kafka is a perfect solution. ... Kafka is not a deterministic system. Safety-critical applications cannot use it for a car engine control system, a medical system such as a heart pacemaker, or an industrial process controller. ... Kafka requires good, stable network connectivity between the Kafka clients and the Kafka brokers. Hence, if the network is unstable and clients need to reconnect to the brokers all the time, operations become challenging and SLAs are hard to reach.
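
The latency-versus-durability trade-off and the reconnect behavior described above show up directly in client configuration. Below is a minimal sketch using the confluent-kafka Python client; the broker address, topic name, and specific values are illustrative assumptions, not recommendations.

# Minimal sketch: a producer tuned for durability (audit/transaction logs)
# rather than raw latency. Broker address, topic, and values are assumptions.
from confluent_kafka import Producer

conf = {
    "bootstrap.servers": "broker-1:9092",  # hypothetical broker
    "acks": "all",                 # wait for all in-sync replicas: no data loss, higher latency
    "enable.idempotence": True,    # avoid duplicates when retries kick in
    "linger.ms": 5,                # small batching delay; raise for throughput, lower for latency
    "reconnect.backoff.ms": 100,       # back off when the network flaps
    "reconnect.backoff.max.ms": 10000, # cap reconnect attempts under unstable connectivity
}

producer = Producer(conf)

def on_delivery(err, msg):
    # Report whether each record was durably written.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"written to {msg.topic()} [{msg.partition()}] at offset {msg.offset()}")

producer.produce("audit-log", value=b'{"event": "trade settled"}', callback=on_delivery)
producer.flush()  # block until outstanding records are acknowledged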


5 Deadly Sins of Software Development

Crafting software, translating ideas into code that a computer can execute and a human can understand, is not an easy task. Before jumping into the development tools, you must devote a fixed timeframe to understanding your client’s business. Dig deep enough to understand HOW exactly the software is going to impact the workflow of the organization and the end-users. By doing so, you’ll get more clarity on what to work on and, more importantly, what not to work on. Every software developer who has attained significant success will tell you to understand the resulting benefit of the software. This will allow you to focus only on the things that hold value, while preemptively eliminating the most obvious changes that the client’s review team would recommend. So the next time you sit in front of your computer for a new software project, go through the project’s brief to comprehend the WHY of the software before you begin coding. Making the software elegant and interactive for the user is what every developer strives for. But while doing so, you must take care that you don’t add too many features, which could eventually overwhelm the user. This is because a confused mind denies everything.


Here’s how algorithms are made

When an algorithm is implemented and verified against the ground truth, it becomes formulated into a mathematical object that can later be used in other algorithms. An algorithm must stand the test of time, prove its value in applications, and demonstrate its usefulness in other scientific and applied work. Once proven, these algorithms become abstracted, taken as proven claims that need no further investigation. They become the basis and components of other algorithms and contribute to further work in science. But an important point to underline here is that when the problem, ground truth, and implementation are formulated into an abstract entity, all the small details and facts that went into creating it become invisible and tend to be ignored. “If STS has long shown that scientific objects need to be manufactured in laboratories, the heavy apparatus of these locations as well as the practical work needed to make them operative tend to vanish as soon as written claims about scientific objects become certified facts,” Jaton writes in The Constitution of Algorithms.


How to empower IT Sec and Ops teams to anticipate and resolve IT problems

Runecast is a patented enterprise IT platform created for administrators, by administrators, and is tailored to the needs of those teams and enterprise leaders. Most importantly, though, it is a proactive platform aimed at helping IT admins anticipate potential problems before they become a headache and fix potential issues before they lead to service disruptions or exploitable vulnerabilities. The objective is reflected in the name of the company and the platform: casting (tossing) rune stones is how some cultures attempted to predict the future that would happen if no changes were made in the present. Runecast Analyzer does precisely this, and then provides actionable solutions to avoid damaging situations. Its power lies in Runecast AI Knowledge Automation (RAIKA), a technology that uses natural language processing (NLP) to crawl and analyze the previously mentioned mountain of available sources of unstructured knowledge to turn it all into machine-readable rules. RAIKA plugs into many different sources: knowledge base articles, online documentation, forums, blog posts, and even curated Twitter accounts of influencers.
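
As a rough illustration of the general idea of turning unstructured advisories into machine-readable rules (this is a deliberately naive sketch and not RAIKA's actual pipeline; the advisory text, field names, and settings are invented), consider:

# Illustrative sketch only, not Runecast's implementation: convert a
# hypothetical advisory sentence into a simple rule an analyzer could
# evaluate against a configuration dump.
import re

ADVISORY = "Hosts running build 12345 with TPS enabled may experience outages; disable TPS."

def extract_rule(text):
    # Very naive pattern matching standing in for real NLP.
    build = re.search(r"build (\d+)", text)
    setting = re.search(r"disable (\w+)", text)
    return {
        "applies_to_build": build.group(1) if build else None,
        "expected_settings": {setting.group(1): "disabled"} if setting else {},
        "source": "hypothetical KB article",
    }

def violates(host, rule):
    # True if the host matches the rule's scope but not its expected settings.
    if host.get("build") != rule["applies_to_build"]:
        return False
    return any(host.get(k) != v for k, v in rule["expected_settings"].items())

rule = extract_rule(ADVISORY)
print(violates({"build": "12345", "TPS": "enabled"}, rule))   # True: finding raised
print(violates({"build": "12345", "TPS": "disabled"}, rule))  # False: compliant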


How to Become a Data Scientist

Becoming a data scientist does not necessarily require a master’s degree. There is a significant shortage of data scientists, and some employers are comfortable hiring people who lack a degree but have the needed experience. The majority of employed data scientists have a master’s degree, but over 25% do not. If you have the experience, a degree is not an absolute necessity to become employed as a data scientist. (If you are genuinely good at statistics, this may be a job for you. If you are not, by nature, good at statistics, this is probably not a job for you.) Data scientists process large amounts of data, often with the goal of increasing a business’s profits. Ideally, a data scientist has a strong understanding of statistics and statistical reasoning, computer languages, and business. They process and analyze large amounts of data to provide useful, meaningful information to their employers. These interpretations are used for decision-making. To provide this information, data scientists often work with messy, unstructured data coming from emails, social media, and smart devices.
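
As a small taste of what working with messy data looks like in practice, here is a minimal pandas sketch that tidies an invented extract; the column names and values are made up purely for illustration.

# Minimal sketch: cleaning a messy, hypothetical extract with pandas.
import pandas as pd

raw = pd.DataFrame({
    "customer": ["  Alice ", "BOB", None, "carol"],
    "signup_date": ["2022-01-05", "not available", "2022-02-10", "2022-02-28"],
    "spend_usd": ["120.50", "", "89", "240.00"],
})

clean = raw.copy()
clean["customer"] = clean["customer"].str.strip().str.title()                 # normalize names
clean["signup_date"] = pd.to_datetime(clean["signup_date"], errors="coerce")  # bad dates become NaT
clean["spend_usd"] = pd.to_numeric(clean["spend_usd"], errors="coerce")       # bad numbers become NaN
clean = clean.dropna(subset=["customer"])                                     # drop rows with no customer

print(clean)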


Edge computing and 5G: What's next for enterprise IT?

When people talk about 5G, they’re usually referring to the major telco networks (and 5G-enabled devices that connect to those networks), which have begun rolling out and will expand considerably over time. Those networks have enterprise impacts, of course. But the “next big thing” for many businesses may be private 5G networks. It’s not a perfect comparison, but a private 5G network is kind of like a private cloud – an answer to the question (among others): “What happens if I want to leverage the technology while retaining as much control as possible?” “In addition to typical 5G, increasingly enterprises are evaluating private 5G models to transform specific parts of their business,” says Joshi, the Everest Group analyst. “This combined with edge devices can meaningfully change the way enterprises work.” Joshi points to use cases such as smart stadiums, connected fleets, autonomous vehicles, smart ports, and remote health as examples where interest is already abundant and the combination of private/public 5G networks and edge architecture could flourish.


A Quick Look at Advanced IoT Sensors for the Enterprise Going Digital

Machine vision is frequently used in EIoT solutions, especially to perform quality control of products. However, these vision systems are complex and rather expensive, which makes them much more difficult for smaller companies to implement. Today, they can be replaced with modern IoT sensors, as Denso has shown. Denso has developed the smallest stereo vision sensor for use in cars to help prevent collisions. These vision sensors are implemented in smart cameras and can also be used for object recognition, manufacturing process control, and product quality assurance. Small, practical equipment can be installed in a factory to monitor a large number of production points. A sensor called Visionary-T DT, developed by the company Sick, can detect objects at a distance of up to 160 ft. It is a 3D video sensor that uses Time-of-Flight (TOF) technology to detect the presence or absence of 3D objects. Solutions built on this technology are strong candidates for securing enterprise facilities and protecting objects or areas.
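
The Time-of-Flight principle behind such sensors is straightforward: measure how long an emitted light pulse takes to return and halve the round trip. The sketch below shows only that calculation plus a presence-detection threshold; the timing value and zone limit are illustrative, not device specifications.

# Time-of-Flight principle: distance = (speed of light * round-trip time) / 2.
# Values below are illustrative, not specs of the Visionary-T DT.
SPEED_OF_LIGHT_M_S = 299_792_458
MAX_RANGE_M = 48.8  # roughly the 160 ft quoted for the sensor

def distance_from_round_trip(seconds):
    # Convert a measured round-trip time of a light pulse into meters.
    return SPEED_OF_LIGHT_M_S * seconds / 2

def object_present(round_trip_s, zone_limit_m=10.0):
    # Flag presence if the reflection indicates an object inside the watched zone.
    d = distance_from_round_trip(round_trip_s)
    return 0 < d <= min(zone_limit_m, MAX_RANGE_M)

# A pulse returning after ~66.7 nanoseconds corresponds to ~10 m.
print(distance_from_round_trip(66.7e-9))  # ~10.0 meters
print(object_present(66.7e-9))            # True: inside the 10 m zone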


Anonymous Extends Its Russian Cyberwar to State-Run Media

Quantifying an uptick in cyber activity in Ukraine, Israeli firm Check Point said related attacks on Ukrainian government sites and its military increased by 196% in the first three days of the conflict. And as the situation on the ground has worsened, social media giants have considered or implemented stricter moderation policies over Russian disinformation efforts. Meta, the parent company of Facebook, says in a blog post that it has taken down a network run by users in Russia and Ukraine that targeted people in the latter. Meta Head of Security Policy Nathaniel Gleicher and Director of Threat Disruption David Agranovich say the network violated its policy against "coordinated inauthentic behavior." Meta's security team says users created fake personae and claimed to be based in Kyiv, posing as news editors, a former aviation engineer, and an author of a scientific publication on the science of mapping water. They claim there are similarities to a takedown in April 2020 that was connected to individuals in Russia, the disputed Donbas region in Ukraine, and two now-sanctioned media organizations in Crimea.


Injecting fairness into machine-learning models

The machine-learning technique the researchers studied is known as deep metric learning, which is a broad form of representation learning. In deep metric learning, a neural network learns the similarity between objects by mapping similar photos close together and dissimilar photos far apart. During training, this neural network maps images in an “embedding space” where a similarity metric between photos corresponds to the distance between them. For example, if a deep metric learning model is being used to classify bird species, it will map photos of golden finches together in one part of the embedding space and cardinals together in another part of the embedding space. Once trained, the model can effectively measure the similarity of new images it hasn’t seen before. It would learn to cluster images of an unseen bird species close together, but farther from cardinals or golden finches within the embedding space. The similarity metrics the model learns are very robust, which is why deep metric learning is so often employed for facial recognition, Dullerud says.
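
To make the idea of distance in an embedding space concrete, here is a minimal numpy sketch of how similarity is typically measured once a trained network has produced embeddings; the two-dimensional vectors and labels are made up, and a real model would output far higher-dimensional embeddings.

# Minimal sketch: after deep metric learning, similarity is just distance in
# the embedding space. Vectors and labels below are invented for illustration.
import numpy as np

def euclidean(a, b):
    # Distance between two embeddings; smaller means more similar.
    return float(np.linalg.norm(a - b))

# Hypothetical embeddings of labeled training images.
gallery = {
    "golden finch": np.array([0.9, 0.1]),
    "cardinal":     np.array([0.1, 0.9]),
}

# Embedding of a new, unseen photo produced by the (hypothetical) trained model.
query = np.array([0.85, 0.2])

# Classify by nearest labeled embedding.
distances = {label: euclidean(query, emb) for label, emb in gallery.items()}
print(distances)                          # golden finch ~0.11, cardinal ~1.03
print(min(distances, key=distances.get))  # 'golden finch'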



Quote for the day:

"In a time of universal deceit - telling the truth is a revolutionary act." -- George Orwell
