Daily Tech Digest - August 17, 2019

Security warning for software developers: You are now prime targets for phishing attacks


According to the Glasswall report, software developer is the role most targeted by hackers going after the technology sector. A key reason is that devs do the groundwork of building software and often hold administrator privileges across various systems. That's something attackers can exploit to move laterally around networks and reach their end goal. "As an attacker, if you can land on an administrator machine, they have privileged access and that's what the attackers are after. Software developers do have that privileged access to IP and that makes them interesting," Lewis Henderson, VP at Glasswall, told ZDNet. Because software developers are technically savvy, some might argue that they shouldn't easily fall victim to phishing campaigns. But attackers can use specially crafted messages to target one individual in the organisation they want to gain access to. Because software developers often stay in jobs for relatively short periods, it's common for those in the profession to build a profile on professional social networks such as LinkedIn. Attackers can exploit that to find out the specific skills and interests of a would-be victim and tailor a spear-phishing email towards them.



Deploying Natural Language Processing for Product Reviews

We have data all around us, and it comes in two forms: tabular and text. With good statistical tools, tabular data has a lot to convey, but it is much harder to get something out of text, especially natural language. So what is natural language? Human language is very complex, and natural language is its truest form: written or spoken sincerely, often without regard for grammatical rules. One of the best places to find this language is in reviews. You write a review mainly for two reasons: either you are very happy with the product or you are very disappointed with it. With your reviews and a machine learning algorithm, companies like Amazon can figure out whether a product they are selling is good or bad, and depending on the results of that analysis they can make further decisions to improve the product.
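As a minimal sketch of the idea, the snippet below scores reviews with a tiny hand-made sentiment lexicon. The word lists and reviews are invented for illustration; a real pipeline like the one described would use a trained machine learning model instead.

```python
# A toy lexicon-based sentiment scorer for product reviews.
# POSITIVE/NEGATIVE word lists are illustrative, not a real lexicon.

POSITIVE = {"great", "love", "excellent", "happy", "perfect", "good"}
NEGATIVE = {"bad", "broken", "disappointed", "terrible", "refund", "poor"}

def tokenize(text):
    # Lowercase and strip punctuation so "Great!" matches "great".
    return [w.strip(".,!?").lower() for w in text.split()]

def sentiment(review):
    # Score = positive hits minus negative hits; the sign gives the label.
    words = tokenize(review)
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

reviews = [
    "I love this product, it works great!",
    "Terrible quality, arrived broken. Very disappointed.",
]
labels = [sentiment(r) for r in reviews]
```

Aggregating such labels over thousands of reviews is what lets a retailer judge whether a product is performing well or poorly.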


Scrum is not magic and will not solve this problem. If you do not have enough skills to do the work, or to do a great job of the work, Scrum will not magically create those skills. What it will do is make that problem very evident in the Increment (the stuff that is delivered), the Sprint Review, the Retrospective, Sprint Planning and the Daily Scrum. In fact, it will be evident in all of the Scrum events. Scrum might not be magic, but it does make problems very evident, encouraging the team to solve them. Skills are one set of challenges that teams face, and Scrum will make them, or the lack of them, very apparent to everyone. This will, however, mean that choices need to be made by the team and by the management of the environment the team works within. There is no blaming the system with Scrum. Many teams doing Scrum describe the sensation of being on a Scrum Team as like being in a startup. It is rare that a startup has all the right skills to deliver the best product, but they have enough to do something and will beg, borrow and steal the knowledge and experience to fill in the gaps.



Fintech - Regtech - How About Sales?

The good news is that compelling events such as growing demand for regulatory compliance and digitalization are triggering and driving many new procurement initiatives within financial institutions. The bad news is that purchasers, influencers and decision makers are overloaded with requests for meetings and presentations from numerous candidate suppliers. The apparent conflict between the interests of young technology companies and those of overloaded, stressed end-user prospects and clients has resulted in the emergence of a new type of business: the technology brokerage, in other words companies providing shared expert sales and account management services on an international scale. With this new model, working with the rare breed of expert financial technology salespeople becomes affordable for the technology company. At the same time, end-users can interact with a trusted but independent account manager who interfaces with different technology providers.


The history of AR and VR: from gimmick to business problem solver

The history of AR and VR goes back further than most would expect. When Charles Wheatstone invented the stereoscope in 1838, he didn't know that his 3D image creation would spark the augmented reality and virtual reality boom that is predicted to infiltrate business and society in the next 10-15 years. While the first VR head-mounted display (HMD) was created in 1968 by computer scientist Ivan Sutherland, "there was no name for AR when we started in 2011," says Beck Besecker, CEO, Marxent. "We called it hologram technology at the time." ... Both technologies were viewed as gimmicky add-ons until opportunities emerged to apply them to tangible use cases, such as in the home vertical. But what changed? Did the technologies advance enough to add value? Or did awareness of their benefits improve? There are several reasons, and one of the main ones is getting past the hype, the stumbling block for many emerging technologies.


Get ready for the convergence of IT and OT networking and security

Traditionally, IT and OT have had very separate roles in an organization. IT is typically tasked with moving data between computers and humans, whereas OT is tasked with moving data between “things,” such as sensors, actuators, smart machines, and other devices to enhance manufacturing and industrial processes. Not only were the roles for IT and OT completely separate, but their technologies and networks were, too. That’s changing, however, as companies want to collect telemetry data from the OT side to drive analytics and business processes on the IT side. The lines between the two sides are blurring, and this has big implications for IT networking and security teams. “This convergence of IT and OT systems is absolutely on the increase, and it's especially affecting the industries that are in the business of producing things, whatever those things happen to be,” according to Jeff Hussey, CEO of Tempered Networks, which is working to help bridge the gap between the two. “There are devices on the OT side that are increasingly networked but without any security to those networks. Their operators historically relied on an air gap between the networks of devices, but those gaps no longer exist. ..."



The true value of diversity in risk management


Looking beyond gender diversity, Molyneux, Omero, Reis, A. Merzouk, and Lani Bannach, Director of Essenta and Well U Trading, advocate for diverse teams in a multidisciplinary way. Molyneux believes that "diversity, in all forms, is incredibly important for every business or sector. When I say 'all forms', I would even include things like cultural diversity, diversity in the level of experience, and even diversity in operating styles." "There are several studies where a diverse workforce is proven to enrich the working environment by providing different solutions to the same problem and by opening up constructive debate, ultimately resulting in a better outcome. Companies that do not diversify lose out on competitiveness and talent," Omero explained. "If the sector doesn't value and embrace diversity appropriately it will lose a powerful taskforce and source of knowledge and creativity," Reis added. "The sector is always open to new ideas and innovative solutions for old and new issues. The more diverse an environment is, the more creative and revolutionary the business solutions will be."


Testing Microservices: Overview of 12 Useful Techniques - Part 1

Many IT departments work with or maintain systems developed and deployed in a monolithic architecture. Choose your testing techniques with a perspective on time to market, cost, and risk. When testing monoliths with techniques like service virtualization, you do not have to test everything together. You can instead divide and conquer, testing individual modules or coherent groups of components, and create safe, isolated environments for developers to test their work. ... When working with microservices, you have more options, because microservices are typically deployed in environments that use containers such as Docker. In microservice architectures, your teams are likely to use a wider variety of testing techniques. Also, since microservices communicate more over the wire, you need to test the impact of network connections more thoroughly. Using tools and techniques that better fit the new architecture allows for faster time to market, lower cost, and less risk.


Flip the ratio: Taking IT from bottleneck to battle ready


One of the main reasons back-end systems demand so many resources is that they do not take advantage of the agile ways of working that have become second nature to most software developers. Either back-end teams confuse "doing" agile with actually "being" agile (running waterfall projects under the scrum method rather than working in small teams that rapidly iterate on small chunks of code), or agile doesn't make it to the back-end teams at all. Even application maintenance and IT infrastructure can benefit from agile principles, which is significant, since these areas often make up 40 to 60 percent of the IT organization. By introducing true agile methods—small, cross-functional teams or squads working in rapid iterations—to relevant enterprise IT work, companies can radically reduce the resources needed to support those systems while substantially improving service quality and the potential for automation. ... By better understanding business needs, teams eliminated some demand by providing self-service options. Cross-functional teams had the people needed not only to identify the root cause of incidents but to correct them immediately.


IoT Devices — Why Risk Assessment is Critical to Cybersecurity

Managing risk of any kind, and IoT risk in particular, is never a one-and-done exercise. After first determining the risk category for new IoT devices or services, it is crucial to revisit this exercise on a regular basis. Changes to the IoT devices, the local area networks and the applications with which the devices interact create an ever-changing attack surface that requires constant monitoring to help maintain a strong, forward-leaning security posture. Organizations should take a disciplined approach to risk categorization and mitigation across the entire IoT ecosystem. Tripwire can help you identify IoT risks by providing rigorous security assessments. Tripwire's device testing approach includes identifying security risks and vulnerabilities that may exist in the physical construction of the device and its network interfaces. Our goal is to identify potential control exposures through security configuration analysis and vulnerability testing of the platform and the operating environment.
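The risk-categorization step can be sketched as a simple likelihood-times-impact matrix. The scale, thresholds, and device names below are purely illustrative assumptions, not drawn from Tripwire's methodology.

```python
# A toy risk matrix: score each IoT device by likelihood and impact,
# then bucket it for mitigation priority.

def risk_category(likelihood, impact):
    """likelihood and impact on a 1-5 scale; returns a category string."""
    score = likelihood * impact
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

devices = {
    "internet-facing camera": (5, 4),  # exposed, sensitive video feed
    "air-gapped sensor": (1, 2),       # isolated, low-value telemetry
}
categories = {name: risk_category(*scores) for name, scores in devices.items()}
```

Re-running such a scoring pass whenever a device, network, or application changes is what turns categorization from a one-off audit into continuous monitoring.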



Quote for the day:


"There is no "one" way to be a perfect leader, but there are a million ways to be a good one." -- Mark W. Boyer


Daily Tech Digest - August 13, 2019

What is instant recovery? A way to quickly restore lost files and test backup systems

The first challenge is that the hypervisor is not really reading a VMDK image; it is reading a virtual image presented to it by the backup product. Depending on which product you're using and which version of the backup you chose, the backup system may have to do quite a bit of work to present this virtual image. This is why most backup systems recommend limiting the number of instant-booted images at a time if performance is important. The second reason instant recovery does not typically deliver high performance is that the VMDK is on secondary storage. In a world where many primary systems have gone to all-flash arrays, today's backup systems still use SATA, which is much slower. The final enemy of high performance in an instant-recovery system is that many backups are stored in a deduplicated format. Presenting the deduplicated files as a full image takes quite a bit of processing power and again takes away from the performance of the system. Some deduplication systems can store the most recent copy in an un-deduplicated fashion, making them much faster for an instant-recovery setup.
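The cost of rehydrating a deduplicated backup is easier to see with a toy model: the "image" exists only as a recipe of chunk hashes, so every full read must look chunks up and reassemble them. Chunk size and names below are illustrative assumptions; real systems use kilobyte-to-megabyte chunks and far more elaborate indexing.

```python
# A toy chunk-level deduplication store and the rehydration step that
# instant recovery must perform to present a full image.
import hashlib

CHUNK = 4    # tiny chunk size for illustration only

store = {}   # hash -> chunk bytes (the deduplicated pool)

def backup(data):
    """Split data into chunks, store unique ones, return a recipe."""
    recipe = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        h = hashlib.sha256(chunk).hexdigest()
        store.setdefault(h, chunk)   # duplicate chunks stored only once
        recipe.append(h)
    return recipe

def rehydrate(recipe):
    """Reassemble the full image from the chunk pool (the costly step)."""
    return b"".join(store[h] for h in recipe)

image = b"ABCDABCDEFGH"          # "ABCD" appears twice -> deduplicated
recipe = backup(image)
assert rehydrate(recipe) == image
assert len(store) == 2           # only two unique chunks kept
```

Every virtual-image read a hypervisor issues triggers the `rehydrate` path, which is why deduplicated storage drags down instant-recovery performance.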



Pair Programming (PP) is an extreme programming practice for producing better software in which two people work together at one computer and work is reviewed as it is done. The driver operates the keyboard while the navigator watches, asking questions, guiding, reviewing, learning and making suggestions. Find more about PP on Wikipedia. We often hear that Pair Programming is a "waste of time", "doesn't really work", "suppresses creativity", "kills privacy", is "stressful", etc. These are all genuine concerns any team may have based on their circumstances and experience. ... PP helps in transitioning knowledge and works great when you have new members on the team. The navigator plays the contributor role while the driver is the receiver. This approach indirectly reduces the training cost of new members. Team members with deep knowledge of the project tend to attract more dependency, as they become knowledge towers. It is always a good idea to spread that knowledge to others to reduce the dependency on those people. When these heavy-lifters pair with others, it helps to spread the knowledge easily.


8 features all enterprise SaaS applications must have


Reliability and security are two of the most important qualities for SaaS tools. Companies that run their software on premises are able to store corporate information in their own infrastructures, which helps them keep that sensitive data secure. However, when it comes to SaaS, the software providers are responsible for keeping user data safe. Consequently, it makes sense that security and data privacy are key capabilities in enterprise SaaS applications. Providers should also include features in their enterprise SaaS offerings that solve business issues and provide the availability and efficiency that are necessary in an increasingly challenging enterprise environment. There is little doubt that companies are looking into SaaS -- usually, in a multi-tenant model in which users from different organizations share the same instance of an application. SaaS is arguably the purest form of the cloud and the largest segment of the cloud market, with revenue expected to grow 22.2% to reach $73.6 billion this year, according to Gartner. In addition, SaaS is expected to reach 45% of total application software spending by 2021.
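The multi-tenant model mentioned above can be sketched in a few lines: one shared application instance, with every data access scoped by a tenant identifier so one organization can never see another's records. All names and records below are illustrative.

```python
# A toy multi-tenant data store: the tenant filter applied on every
# query is the core isolation control in shared-instance SaaS.

RECORDS = [
    {"tenant": "acme", "doc": "Q3 forecast"},
    {"tenant": "acme", "doc": "price list"},
    {"tenant": "globex", "doc": "payroll"},
]

def fetch_docs(tenant_id):
    # Never return a row without matching the caller's tenant id.
    return [r["doc"] for r in RECORDS if r["tenant"] == tenant_id]

assert fetch_docs("acme") == ["Q3 forecast", "price list"]
assert fetch_docs("globex") == ["payroll"]
```

In production the same scoping is enforced at the database layer (per-tenant schemas or row-level filters), not just in application code, precisely because the provider, not the customer, is responsible for keeping user data safe.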


What Microsoft's upcoming 'outsourcing' licensing changes could mean

Microsoft's upcoming licensing change is going to be "massive" for customers who've been using AWS and Google Cloud dedicated hosts to run Windows Server and Windows client, says Directions on Microsoft's Miller. "Why? Those products never offered -- and still don't offer -- License Mobility through Software Assurance," he said.  Microsoft officials note that beginning October 1 "on-premises licenses purchased without Software Assurance and mobility rights cannot be deployed with dedicated hosted cloud services offered by the following public cloud providers: Microsoft, Alibaba, Amazon, and Google. They will be referred to as 'Listed Providers.'" On October 1, customers who already are running Microsoft on-premises software offerings from these listed providers will be able to continue to deploy and use Microsoft enterprise software under their existing licenses. But they won't be able to add workloads or upgrade to a new product version released after October 1 under their existing licenses.


How to implement edge computing

"Networking skills are important at the edge because you need highly skilled people who can make the decisions, such as whether they want to deploy one large network or a series of smaller, specialized networks," said Coufal. "These same network architects need to make decisions about which of their different networks under management should be federated with each other for information exchange and which they want to keep separate. In many cases, business security and information exchange requirements will dictate this." Coufal recommends that organizations take a measured approach when it comes to deploying computing at the edges of their enterprises. "This means pushing out portions of applications to the edges of your company, but not necessarily everything," he said. "You can always plan to scale out later." It's also important to place an emphasis on the security that will be needed at the edge, given that end user personnel, not necessarily IT, will be running and maintaining much of this edge computing. Finally, bandwidth is an issue. If you can place subsets of your data and your applications at the edge, the processing of data, as well as the data that is transmitted from point to point, will be faster.


A New Credential for Healthcare Security Leaders

The Certified Healthcare Information Security Leader - or CHISL - credential was created by the Association of Executives in Healthcare Information Security, a subgroup of the College of Healthcare Information Management Executives. "There are a number of security certification programs, but they are not tailored to the healthcare environment," Marsh says in an interview with Information Security Media Group. The new certification is "sculpted" for healthcare security leaders, he says. In its statement about the new credential, CHIME notes that it's modeled after the organization's Certified Healthcare CIO, or CHCIO, certification program, which is exclusively for healthcare CIOs. To earn the CHISL designation, a security executive will need to pass an exam that tests knowledge of seven domains: organizational vision and strategy; technology proficiency; change management; value assessment and management; service management; talent management; and management of security relationships.


7 trends impacting commercial and industrial IoT data

According to Gartner, within the next four years, 75% of enterprise-generated data will be processed at the edge (versus the cloud), up from <10% today. The move to the edge will be driven not only by the vast increase in data, but also the need for higher fidelity analysis, lower latency requirements, security issues and huge cost advantages. While the cloud is a good place to store data and train machine learning models, it cannot deliver high fidelity real-time streaming data analysis. In contrast, edge technology can analyze all raw data and deliver the highest-fidelity analytics, and increase the likelihood of detecting anomalies, enabling immediate reaction. A test of success will be the amount of “power” or compute capability that can be achieved in the smallest footprint possible. ... The CEP function should enable real-time, actionable analytics onsite at the industrial edge, with a user experience optimized for fast remediation by operational technology (OT) personnel. It also prepares the data for optimal ML/AI performance, generating the highest quality predictive insights to drive asset performance and process improvements.
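A minimal sketch of the edge-analytics idea: flag sensor readings that deviate sharply from a rolling baseline, so anomalies can be acted on locally without a round trip to the cloud. The window size, threshold, and class name are illustrative assumptions, not from any particular CEP product.

```python
# A toy streaming anomaly detector using a rolling z-score.
from collections import deque
from statistics import mean, stdev

class EdgeAnomalyDetector:
    def __init__(self, window=10, threshold=3.0):
        self.window = deque(maxlen=window)
        self.threshold = threshold

    def observe(self, value):
        """Return True if value is anomalous vs. the rolling window."""
        anomaly = False
        if len(self.window) >= 3:  # need a few points for a baseline
            mu, sigma = mean(self.window), stdev(self.window)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                anomaly = True
        self.window.append(value)
        return anomaly

detector = EdgeAnomalyDetector()
readings = [20.0, 20.1, 19.9, 20.2, 20.0, 45.0]   # last reading is a spike
flags = [detector.observe(r) for r in readings]
```

Because the window and arithmetic fit in a few kilobytes, this kind of logic can run directly on a gateway or controller, which is exactly the "power in the smallest footprint" test of success the article describes.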


"Think of 5G and network slicing. That's a can of worms!" remarked Dr. Gerhard P. Fettweis, coordinator of Germany's 5G Lab and a professor at Technische Universität Dresden. "How are you going to handle all this from an integrity, privacy, security [standpoint], knowing that your hardware is not going to be fail-proof -- because two years from now, we're going to have four major updates of the system, because we found out somebody could've been malfunctioning the system?" It isn't that AT&T, Verizon, and the successor company to the T-Mobile and Sprint merger have some suppressed, nascent desire to go into competition against Amazon, Microsoft Azure, and Google Cloud. But they may be reselling cloud capacity to companies large and small that could certainly disrupt the cloud providers' best-laid plans. These would include many of the cloud providers' largest enterprise customers, who may be willing to spend premiums on operating their own global, fiber optic cable-linked networks as though they were their own data centers.


Psychometric tests are a key weapon in battle against cyber security breaches

Phishing attacks are less likely to be effective if they are targeted at people with a preference for Sensing. On the other hand, people with these personalities are more likely to take cyber security risks. There is a nuance here: it turns out that the risk takers are more likely to be the people in this group who have a preference for Perceiving and/or Extraversion. As for people who have a preference for Feeling or Judging, they are more likely to fall victim to social engineering attacks than those with a preference for Thinking, but they also tend to be more cautious and therefore more rigorous when following cyber security policies. However, the Thinking group can over-estimate their own competence, leading to mistakes. The ESET and The Myers-Briggs Company Cyberchology report suggests that psychometric tests can be used to build self-awareness, thereby reducing vulnerability to potential cyber security breaches.


Empathy is a Technical Skill

Archeology and anthropology can give us good metaphors for what it’s like to work with software that we didn’t write ourselves. If you’re attempting to reconstruct someone else’s viewpoint, but you don’t have direct access to them, you’ll need to rely on two critical components: artifacts and context. The same applies to software. In a legacy system, we often don’t have access to the developers who initially wrote the code. So instead, we need to look at what they’ve left behind — their artifacts. Just like how pottery, skeletons, coins, foundations of buildings, and writing can help us figure out what someone’s life was like in the distant past, we can use those principles in software, too. The question to ask as you’re going about your daily work is, "Am I leaving durable evidence of my thinking that will help someone in the future?" That might be someone else after you’ve left for another role, or it could be your future self six months from now after you’ve forgotten the details of what you were working on.



Quote for the day:


"A simple but powerful rule: always give people more than what they expect to get." -- Nelson Boswel


Daily Tech Digest - August 12, 2019

Can an AI system invent? Does the tech have the intellectual right?

There is presently a consensus inherent in patent law globally that the owner of a patent is the inventor unless the rights have been assigned to another person, entity, or their employer. However, the law also requires that the inventor must be a person who has contributed in some material way to the invention's conception. Therefore, under current law, only a human is capable of being named as inventor, and an AI system is a tool they have utilised to facilitate their innovation. The academics and inventors involved in the Artificial Inventor Project believe that this stance is outdated, and that such AI systems should be named as inventors, with the owner of the machine being named as the owner of the patent. If AI systems such as The Creativity Machine are indeed capable of 'inventing' without any form of human intervention, this could lead to patents without 'inventors'. Some innovators may be concerned that the current lack of clarity regarding the patentability of AI-based inventions could become a barrier to progress.


For Invisible Border Control, Start with Old-School Security Protocols

To minimize the risk of data breaches, the application layer is the only layer of technology within a computer that should be permitted to encrypt and decrypt sensitive information. A second main point for implementers of border control security, then, is that they should encrypt sensitive data within the application to ensure confidentiality. The encryption should be supplemented by secure key-management techniques using dedicated cryptographic hardware such as the Trusted Platform Module, a low-cost, high-security chip designed over a decade ago. Lack of such basic security controls led to breaches at thousands of companies over the last 15 years, including the U.S. Office of Personnel Management, Uber and Marriott. It would also be wise to add integrity controls to transactions through the use of digital signatures, given that completely new systems are being created to support invisible boundaries. Not only are such transactions independently verifiable without the use of blockchain, but subtle yet sophisticated attacks are possible when such security is not in place.
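The integrity-control half of this advice can be sketched with Python's standard library. A caveat: the stdlib offers no asymmetric signatures or TPM access, so this sketch uses an HMAC as a stand-in for a true digital signature; a real deployment would use asymmetric keys sealed by the TPM. The transaction format is an invented example.

```python
# Application-layer integrity control: tag each transaction and verify
# the tag before trusting the data. HMAC stands in for a digital
# signature here purely for illustration.
import hashlib
import hmac
import os

key = os.urandom(32)  # in practice: protected by cryptographic hardware

def sign(transaction: bytes) -> bytes:
    return hmac.new(key, transaction, hashlib.sha256).digest()

def verify(transaction: bytes, tag: bytes) -> bool:
    # compare_digest avoids timing side channels
    return hmac.compare_digest(sign(transaction), tag)

tx = b'{"crossing_id": 42, "traveller": "X123"}'
tag = sign(tx)
assert verify(tx, tag)                            # untampered: accepted
assert not verify(tx.replace(b"42", b"43"), tag)  # tampered: rejected
```

With asymmetric signatures, the verification key can be published, which is what makes such transactions independently verifiable without a blockchain.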


Democratic Presidential nominees are ignoring the issue of our cybersecurity infrastructure

What is, in effect, another sort of breach is the collection, aggregation and manipulation of our private data by digital aggregators such as Google and Facebook, which is then further manipulated and stolen by criminals. How do we solve these problems? Blatantly dictating solutions would inevitably fail. What we can do successfully is set standards of performance and responsibility, coupled with timelines and severe penalties for failure to perform. There must be accountability – something that sometimes exists in industry (albeit at inadequate levels), but that is wholly missing in government at all levels. While I care deeply about cybersecurity, I am not naïve about the extreme pressure confronting politicians to score well in polls – a requirement to have a shot at winning their party's presidential nomination. Arguably, cybersecurity awareness may not fit this bill. If enhanced cybersecurity is to be injected into the Democratic election agenda, the public must actively promulgate such a step. Supporting an outcry is the irrefutable fact that the signs of risk are flagrant.


Modern-Day SOCs: People, Process & Technology

Part of building a SOC also requires organizations to decide whether it will be internal, external, or hybrid. Each has its pros and cons. The upsides of an internal SOC include the assurance that comes with it being staffed by employees who are familiar with the organization's infrastructure and understand its security posture. That said, making an internal SOC successful comes at a cost. A more cost-friendly route could be contracting an external party to deliver SOC services, according to Durbin. "An external SOC has the advantage of minimal initial outlay costs and reduced running costs due to the economies of scale associated with outsourcing," he says. "However, it is also important for organizations to recognize that they retain responsibility for the SOC and therefore need to keep SOC governance in-house." Members of ISF have told Durbin that a hybrid SOC offers "the best of both worlds" by addressing some of the limitations that can encumber the performance of an internal or external SOC, he says.


Ransomware attacks are getting more ambitious as crooks target shared files


Despite a rise in ransomware attacks against cloud and network services – which in some cases see attackers make off with hundreds of thousands of dollars – organizations can avoid becoming the next victim. "It is hard to stop, but it can be defeated. There are many precursor signs to a ransomware attack that can be detected and responded to, before a ransomware attack succeeds," said Morales. "Continuous monitoring for network behaviors to proactively detect and respond to attacks does give an organization an opportunity to save themselves from the loss of data," he added. Organizations can also go a long way toward avoiding a ransomware attack by ensuring that systems that don't need to face the open internet aren't remotely accessible, and by applying security updates to prevent malware taking advantage of vulnerabilities. Businesses should also keep regularly updated offline backups of their data, so that if the worst does happen, systems can be restored without giving in to the demands of cyber criminals.


The Intel Assembly Manual

Reading this through will enable you to understand how operating systems work, how memory is allocated and addressed and, perhaps, how to make your own OS-level drivers and applications. To help you understand what's happening, the GitHub project includes many aspects of the article (and I'm still adding stuff). It is a ready-to-run tool which includes a Bochs binary, VMWare and VirtualBox configurations and a Visual Studio solution. The entire project is built in assembly using Flat Assembler. Assemblers like TASM or MASM will not work, for they only support specific architectures. Bochs is the best environment to experiment in, because it includes a hardware GUI debugger which can help you understand the internals. Debugging without Bochs is impossible, because other debuggers are either real-mode only (like MSDOS Debug) and assume you will always have some sort of control, or are able to run only in an existing environment.


Researchers find security flaws in 40 kernel drivers from 20 vendors

The common design flaw is that low-privileged applications can use legitimate driver functions to execute malicious actions in the most sensitive areas of the Windows operating system, such as the Windows kernel. "There are a number of hardware resources that are normally only accessible by privileged software such as the Windows kernel and need to be protected from malicious read/write from userspace applications," Mickey Shkatov, Principal Researcher at Eclypsium, told ZDNet in an email earlier this week. "The design flaw surfaces when signed drivers provide functionality which can be misused by userspace applications to perform arbitrary read/write of these sensitive resources without any restriction or checks from Microsoft," he added. Shkatov blames the issues he discovered on bad coding practices, which don't take security into account. "This is a common software design anti-pattern where, rather than making the driver only perform specific tasks, it's written in a flexible way to just perform arbitrary actions on behalf of userspace," he told ZDNet.


A billionaire software mogul doesn't want his company to grow up

While SAP may be Plattner’s primary obsession, the software mogul has used his considerable wealth (he is the fifth-richest German with a net worth of about $15 billion) to finance his educational, philanthropic and sporting ventures. Plattner built a museum in Potsdam on the outskirts of Berlin to house his art collection, and financed the Hasso Plattner Institute in the same city, a vast IT campus that churns out software engineers. Investors have criticized SAP for being too slow to rejuvenate its executive suite, and for relying too heavily on Plattner to drive innovation. (Plattner, because he’s limited in what he’s allowed to do as chairman, also advises SAP on technology issues). In response, the company can point to some recent high-profile promotions of younger talent. One is Plattner’s protege Juergen Mueller, SAP’s 37-year-old chief technology officer. Mueller, a graduate of Plattner’s HPI, has been pushing artificial intelligence at SAP.


At A Glance – Doxxing


Doxxing is one of many threats businesses face; however, it isn't always carried out with malicious intent. Doxxers can aid the police and emergency services by uncovering the identity of criminals, reveal the true personas behind abusive or harmful content, and discourage people from engaging in illegal or socially taboo online forums. In one well-known example, a Reddit user called ‘violentacrez’ fell foul of doxxing carried out by an American journalist. Worried that their true identity would be revealed, violentacrez deleted their account. It was too late. Violentacrez, the online identity used by Michael Brutsch, has been at the centre of a controversial debate over misogyny and unsavoury internet use for over 10 years. Organisations may even use doxxing for business research and analysis, but this is not generally seen as an advisable or legitimate use. Doxxing does have serious implications for business as part of an ever-growing cyber threat. Organisations should make it a priority to educate stakeholders and safeguard against such attacks.


6 Security Considerations for Wrangling IoT

The sheer increase in the volume of consumer IoT fostered by retail and tech giants has created a massive attack surface. Consumers may have dozens of IoT devices in their homes. And with all of their variations in software, suppliers, and connection points, the possibilities for things to go wrong seem endless. For instance, the simple tasks of turning on your home security system (an IoT device that communicates with a server), driving your car (your phone or car could also be an IoT device), and using a streaming camera at home seem innocuous on their own, but the data may be tracked by various parties, and combining them creates alarming possibilities for malicious activity. To better ensure safety and security, education is needed across the entire IoT ecosystem — from consumers to device manufacturers, service providers, third parties, and developers. Findings show the top reasons for IoT security vulnerabilities include weak passwords, insecure web APIs, cloud and mobile interfaces, insecure third parties, network services, and data transfer, to name a few.



Quote for the day:


"Remember: Rewards come in action, not in discussion." -- Tony Robbins


Daily Tech Digest - August 10, 2019

Blockchain’s real promise: Automating trust

The opportunity for transformation is significant because the cost of establishing trust in a supply chain is incredibly high. Consider the problem of counterfeit goods: The Organisation for Economic Cooperation and Development estimates that $461 billion worth of fake goods are sold annually, amounting to 2.5 percent of global trade. According to the Global Brand Counterfeiting Report 2018, total global counterfeiting is expected to surge to $1.82 trillion by 2020, exposing businesses to revenue loss, quality issues, and potential reputational damage. As companies grapple with how to build trust among their suppliers, they are doling out big money on activities such as duplicative testing, manual auditing, and reconciliation, while investing in extra insurance and legal assistance to backstop any failure to meet contractual obligations. In the airline industry, for example, carriers are grounding planes longer, hoarding an excess of spare parts, and avoiding the use of less-expensive used parts and planes because they don’t fully trust their provenance.



The brain inspires a new type of artificial intelligence

Brain dynamics do not comply with a well-defined clock synchronized for all nerve cells, since the biological scheme has to cope with asynchronous inputs, as physical reality develops. "When looking ahead one immediately observes a frame with multiple objects. For instance, while driving one observes cars, pedestrian crossings, and road signs, and can easily identify their temporal ordering and relative positions," said Prof. Kanter. "Biological hardware (learning rules) is designed to deal with asynchronous inputs and refine their relative information." In contrast, traditional artificial intelligence algorithms are based on synchronous inputs, hence the relative timing of different inputs constituting the same frame is typically ignored. The new study demonstrates that ultrafast learning rates are surprisingly identical for small and large networks. Hence, say the researchers, "the disadvantage of the complicated brain's learning scheme is actually an advantage". Another important finding is that learning can occur without learning steps through self-adaptation according to asynchronous inputs.


Are you blinded by the promise of AI?

Data is being collected by technologies across your business faster than any human can assess, analyze and leverage it. The big leap forward in data analytics has been the machine learning capabilities that result in algorithms that can forecast behavior and offer recommendations and/or potential pathways. From the recommendations that come from content streaming services to shopping options that pop up in online advertising, many of us see these algorithms at work every day. Customer data—when collected with permission, security and high integrity—offers businesses potent customer personalization opportunities, from sending discounts or information around important life events (anniversaries, holidays, etc.) to creating personalized communications. Because AI can collect and analyze large data sets at remarkably fast rates, businesses can use it to predict potential issues, favorable market opportunities or customer needs. To identify these opportunities, organizations need to work across all business groups to assess the numerous places data is collected, and how and when that data can be used.


Software quality issues: not just for Boeing CIOs

It is clearly no exaggeration to describe software as becoming increasingly life-critical. Software keeps planes in the sky, tests the cars we drive, keeps the health systems running our hospitals, and ensures 1.5 million smart meters keep houses warm across the UK. However, with software enabling some of the most important and life-critical functions, we need to know it can execute, flawlessly, again and again. In Boeing’s case, a functional issue prevented the software from performing as it was expected to. However, other issues, which are coding related rather than functional, are arguably harder to detect. Coding issues in software can result in poor quality and IT outages similar to Boeing’s. CISQ’s recent research on software quality estimated the cost of poor software quality at $2.8 trillion for the US alone. CIOs, therefore, need the ability to oversee the current state of an organisation’s software. Yet, the 2018 Software Intelligence Report found only half (51%) of CIOs claim to have “some” knowledge of their current applications’ software quality. Even worse, less than 50% of CIOs believe their organisations have enough insight into the software to make the best decisions.


Cyberattack Warning As Dangerous Issues Found On Popular Office Printers

In both those disclosures, the primary risk exposed was unpatched devices providing a soft entry point into a would-be secure network. In essence, attackers don't need to try too hard to develop sophisticated TTPs when there are vulnerable IoT devices that, in many cases, are not even on the radar of corporate technology security teams. "IoT devices," Microsoft pointed out, "are purposefully designed to connect to a network and many are simply connected to the internet with little management or oversight. In most cases however, the customers’ IT operation center don’t know they exist on the network." How true is that of network printers, connecting to the open internet to download printer drivers while also appearing on internal networks? In the cyberattacks identified by Microsoft, those devices—including the office printer—became "points of ingress from which the actor established a presence on the network and continued looking for further access."


The Pros and Cons of Emotional AI

Some analysts also worry that if emotional AI gauges how people feel often enough or provides responses with simulated feelings, it could give humanity an excuse not to stay connected. For example, if an AI tool could check on a loved one and send a report that says everything’s fine, a user could decide that’s enough information and not bother confirming it’s true. What if a person has a disability that causes them to have trouble controlling their facial expressions, or perpetually grimace because they’re in pain? Those things don’t have anything to do with the kind of service received. If emotional AI makes the wrong judgment, it could bring unwanted attention to the individual and cause embarrassment. It’s also possible that AI could pick up on a person’s emotions and do something that worsens how they feel. Many people have had at least a few instances where Facebook’s “On This Day” feature showed something they’d rather not recall. Some companies are developing AI that could respond to people’s angry or sad emotions to cheer them up or calm them down.


New wave of smart cities has arrived -- and they’re nothing like science fiction

Successful smart city projects blend disciplines, bringing together experts in behavioral change alongside specialists in artificial intelligence and information technologies. Interdisciplinary work can be messy and difficult, it can take longer and may not always work -- but when it does, it can bring real benefits to cities. For instance, Nottingham City Council and Nottingham Trent University have been part of the Remourban regeneration program, working across sectors with cities around Europe. Homes in the Nottingham suburb of Sneinton have been upgraded with new outside walls and windows, a solar roof and a state-of-the-art heating system -- a process that takes just a few days. The result is improved insulation and reduced energy bills for residents, but also better public health: calculations suggest that bad housing costs the UK’s National Health Service £1.4 billion a year, and improving the quality of homes can cut visits to local doctors almost by half. The German city of Darmstadt has worked with citizens, universities, museums and businesses to plan for the future.


White House Staffer Seeking Threesomes, According to Data Leak

An explanation of how the data was obtained was published in a detailed blog post Thursday. Writer Alex Lomas acknowledged it's possible tech-savvy web users could have manipulated the data to make it appear their locations were close to seats of power. The focus of the post wasn’t on the technology-enhanced sex habits of Supreme Court law clerks, but on the continued issues with leaked data and hookup apps. That includes the ready release of personal information allowing users of Grindr and Romeo to be tracked down from great distances. “We think it is utterly unacceptable for app makers to leak the precise location of their customers in this fashion,” Lomas wrote. “It leaves their users at risk from stalkers, exes, criminals, and nation states.” While the differentiating features for such apps involve the ability to locate other nearby users, developers have faced calls to address flaws in the technology that allows people to access private data and to find the precise location of users from significant distances and then target them.


State Farm Investigates Credential-Stuffing Attack

"State Farm discovered a bad actor or actors attempting to gain access to customers' online accounts using a list of user IDs and passwords from other sources," the company spokesperson tells ISMG. "To defend against the attack, we reset passwords for these online accounts in an effort to prevent additional attempts by the bad actor. We have implemented additional controls and continue to evaluate our information security efforts to mitigate future attacks." It's not clear how many customers were affected by the incident, and the State Farm spokesperson did not specify how many notification letters went out. "We encourage customers to regularly change their passwords to a new and unique password, use multifactor authentication whenever possible and review all personal accounts for signs of unusual activity," the spokesperson says. Credential stuffing has emerged as one of the biggest threats to enterprises across the world. A 2018 report by security vendor Akamai found that companies were reporting nearly 13 credential stuffing incidents each month in which the attacker successfully identified valid credentials.
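The signature of credential stuffing, as opposed to an ordinary brute-force attempt, is one source trying breached credentials across many *distinct* accounts. A minimal detection heuristic can be sketched as follows (the function name, event format, and threshold are illustrative assumptions, not any vendor's actual implementation; production systems also weight time windows, geolocation, and known-breach password lists):

```python
from collections import defaultdict

def flag_credential_stuffing(login_events, max_distinct_users=5):
    """Flag source IPs whose failed logins span many distinct usernames.

    login_events: iterable of (ip, username, success) tuples.
    A single user mistyping their password fails repeatedly on ONE
    account; a stuffing attack fails across MANY accounts.
    """
    failures = defaultdict(set)
    for ip, username, success in login_events:
        if not success:
            failures[ip].add(username)
    return {ip for ip, users in failures.items()
            if len(users) > max_distinct_users}
```

For example, an IP that fails logins against ten different usernames is flagged, while one that fails twice on the same account is not.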


Blockchain’s Role In Collaborative Competition

Blockchain’s immutable record avoids legal disputes by clearly attributing IP. Through self-regulating and decentralised smart contracts, organisations can collaborate with far more confidence. Rather than trusting a competing company, organisations place their trust in technology. Let’s say that one of the manufacturers in the IVC’s data sharing strategy opens up the information they have about the production and performance of a specific tool. If another company wants to access the information, they can, but may need to offer IP or capital in exchange. Once the exchange is made, each party gets something out of it which ultimately benefits the manufacturing industry as a whole. Blockchain is one of many disruptive technologies that has made it easier for companies to stay competitive through collaboration. The IVC’s encouragement of blockchain adoption demonstrates just how much capitalist companies have changed. Instead of locking IP away forever, organisations recognise that data exchange is advisable, if not necessary.



Quote for the day:


"Leaders are the ones who keep faith with the past, keep step with the present, and keep the promise to posterity." -- Harold J. Seymour


Daily Tech Digest - August 09, 2019

Supercomputer-Powered AI Tackles a Key Fusion Energy Challenge


The disruptions – which happen near-instantly – need to be detected as early as possible. So far, simulations have been unable to deliver fast enough predictions – so the researchers turned to machine learning, which has shown promising results for disruption prediction. The goal: to meet the 95 percent correct disruption prediction threshold required by the under-construction ITER Tokamak, which will be the largest fusion reactor in the world. Julian Kates-Harbeck (lead author on the paper published in Nature) answered this challenge by developing the Fusion Recurrent Neural Network (FRNN), an AI disruption prediction tool. FRNN learns from thousands of experimental runs – tracking plasma current, temperature, density and other variables – and attempts to learn which factors signal imminent disruptions. To meet the level of reliability that ITER will demand, the researchers ran FRNN on powerful machines. After initial runs on Tiger (a cluster at Princeton University), they turned to the (now-decommissioned) Titan supercomputer, where they ran FRNN on 6,000 Nvidia Tesla K20X GPUs.



This Bank Gave Bitcoin to Its Entire Staff. Now It’s Taking Crypto Clients

Quontic Bank opened a checking account for a bitcoin ATM company a few weeks ago and is in the process of completing a contract to deliver banking services to another crypto startup. The bank wouldn’t name either client. “We’re just taking steps so that when the regulatory environment becomes more crypto-friendly, we don’t have a lot of catching up to do,” said Quontic chief executive Steven Schnall, who acquired the bank in 2009. “We’re looking to diversify our product offering and our customer mix by entering into that field.” While Schnall wouldn’t say how big he wants Quontic’s crypto business to be, he claimed the pending contract “could impact millions of Americans.” Crypto-friendly banks are extremely rare, in part because of the extra work they have to do complying with know-your-customer (KYC) and anti-money laundering (AML) regulations. “Banks and other financial institutions have to look out for any suspicious activity,” said Joshua Klayman, head of the blockchain and digital assets practice at law firm Linklaters.


Seven very simple principles for designing more ethical AI


AI systems have already been designed to help or hurt humans. A group at UCSF recently built an algorithm to save lives through improved suicide prevention, while China has deployed facial recognition AI systems to subjugate ethnic minorities and political dissenters. Therefore, it’s impossible to assign valence to AI broadly. It depends entirely on how it’s designed. To date, that’s been careless. AI blossomed with companies like Google and Facebook, which, in order to give away free stuff, had to find other ways for their AI to make money. They did this by selling ads. Advertising has long been in the business of manipulating human emotions. Big data and AI merely allowed this to be done much more effectively and insidiously than before. AI disasters, such as Facebook’s algorithms being co-opted by foreign political actors to influence elections, could and should have been predicted from this careless use of AI. They have highlighted the need for more careful design, including by AI pioneers like Stuart Russell, who now advocates that “standard model AI” should be replaced with beneficial AI.


PCI SSC warns organisations about growing threat of online skimming

Online skimming is a variation of a criminal tactic used to gain access to payment card information. Until recently, it was more commonly associated with physical fraud, in which criminals use a device (‘skimmer’) that interacts with a victim’s payment card. One of the most common skimming methods is to place a duplicate card reader on top of an ATM’s payment card slot. Criminals can then siphon off card details as the card enters the machine. This reader will typically be paired with a pinhole camera or duplicate keypad placed over the machine so that the fraudsters can log the customer’s PIN. Online skimming works in much the same way, except the ATM is replaced by an online payment form and the physical skimming device is replaced by malicious code. Magecart is the umbrella term for criminal groups that exploit vulnerabilities in online stores, mostly those built on Magento-based platforms or other content management systems. A number of recent data breaches, such as those at Ticketmaster and British Airways, were believed to be part of such credit card skimming operations.
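One widely recommended defence against this kind of script tampering is Subresource Integrity (SRI): the payment page pins a cryptographic hash of each third-party script, and the browser refuses to run a script whose content no longer matches. A small sketch of computing such a value (the function name is ours; the `sha384-` plus base64 format is the standard SRI encoding):

```python
import base64
import hashlib

def sri_hash(script_bytes: bytes) -> str:
    """Compute a Subresource Integrity value for a script, for use as
    <script src="checkout.js" integrity="sha384-..." crossorigin="anonymous">.
    If an attacker modifies the hosted file -- the Magecart scenario --
    the hash no longer matches and the browser blocks the script."""
    digest = hashlib.sha384(script_bytes).digest()
    return "sha384-" + base64.b64encode(digest).decode("ascii")
```

Any single-byte change to the script produces a completely different hash, which is what makes the pin effective; the limitation is that SRI only protects scripts the site owner pins, not code injected directly into first-party pages.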


What is edge computing? Here's why the edge matters and where it's headed

Edge computing has been touted as one of the lucrative, new markets made feasible by 5G Wireless technology. For the global transition from 4G to 5G to be economically feasible for many telecommunications companies, the new generation must open up new, exploitable revenue channels. 5G requires a vast, new network of (ironically) wired, fiber optic connections to supply transmitters and base stations with instantaneous access to digital data (the backhaul). As a result, an opportunity arises for a new class of computing service providers to deploy multiple µDCs adjacent to radio access network (RAN) towers, perhaps next to, or sharing the same building with, telco base stations. These data centers could collectively offer cloud computing services to select customers at rates competitive with, and features comparable to, hyperscale cloud providers such as Amazon, Microsoft Azure, and Google Cloud Platform. Ideally, perhaps after a decade or so of evolution, edge computing would bring fast services to customers as close as their nearest wireless base stations.


Get creative with feature flags in IT operations


DevOps shops can use feature flags in conjunction with other application deployment methods, including what Condo referred to as a progressive release methodology. Rather than the typical pattern of releasing approved code and then deploying that release to production, a progressive release instead deploys latent code to production, where it is then tested or held until a designated time and switched on. A feature flag, in this instance, makes that final changeover as simple as pressing a button. But feature flags also create significant technical debt if left unchecked -- technical debt that IT admins must clean up or otherwise manage during troubleshooting, new releases and other ops tasks. Organizations that rely on feature flags have two options, Condo said: "They could leave the feature flag there, because they have some greater strategy about how they manage things ... or [flag code removal] becomes part of the normal routine, and [admins] remove that code so that it never gets turned off accidentally and [there are] fewer flags to manage."
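The progressive-release pattern described above, where latent code ships to production dark and is switched on later, can be sketched with a minimal flag store (the class, flag name, and pricing logic are all illustrative, not a specific vendor's API):

```python
class FeatureFlags:
    """Minimal in-memory flag store: latent code ships dark,
    then the final changeover is a single toggle."""

    def __init__(self):
        self._flags = {}

    def enable(self, name):
        self._flags[name] = True

    def disable(self, name):
        self._flags[name] = False

    def is_enabled(self, name):
        # Unknown flags default to off, so new code stays dark.
        return self._flags.get(name, False)


flags = FeatureFlags()

def new_pricing_total(cart):
    # Hypothetical new behaviour: a 10% discount.
    return round(sum(cart) * 0.9, 2)

def checkout(cart):
    if flags.is_enabled("new_pricing"):
        return new_pricing_total(cart)  # latent path, deployed but dark
    return sum(cart)                    # stable path
```

This also shows where the technical debt Condo warns about comes from: once "new_pricing" is fully rolled out, both the flag check and the old path should be deleted, or the dead branch lingers and can be toggled back on by accident.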


The emergence of London’s Silicon Roundabout: why? And lessons for future tech hubs!

Technology shifts such as the iPhone, the Android with its emphasis on open source, WordPress slashing the cost of building a website, the cloud reducing the need for startups to invest in expensive hardware — helped create a space for startups. As for London — immigration may have been another factor: As Espinal pointed out, he originally hails from Honduras, worked in the US, came to the UK from the US, as the UK visa system allowed this. “At the time, immigration regulation favoured highly skilled migrants.“ Espinal says you can unpick the tech story in terms of befores and afters — pre- and post-SEIS, pre- and post-entrepreneur relief — EIS, the emergence of accepted ways of providing shareholder agreements, convertibles, cross border investing. Vidra reckons that the 2012 Olympics was a factor — the second wave, anyway. What with that and the Royal Wedding, the London brand name was strong. The Olympics illustrated another point; it was said that every nation competing in the games had 10,000 supporters from the local population.



DDoS attacks in Q2 2019

In early June, a powerful DDoS attack hit Telegram. The attack was carried out primarily from Chinese IP addresses, which gave founder Pavel Durov reason to link it to the demonstrations in Hong Kong; in his words, the political opposition there uses Telegram to organize protests, which Beijing takes a very dim view of. The only headline attack this quarter seemingly driven by commercial considerations targeted video game developer Ubisoft on June 18 — just before the release of its new Operation Phantom Sight expansion for the game Rainbow Six Siege. It caused connection problems for many players, and even provoked calls on Reddit for better DDoS protection. The largest would-be DDoS attack in Q2 turned out to be a false alarm. In late June, some segments of the Internet experienced operational issues worthy of a major DDoS offensive, but the actual cause lay elsewhere. As it turned out, a small ISP in Pennsylvania had made a configuration error, turning itself into a priority route for some Cloudflare traffic. The provider could not handle the load, and thousands of websites serviced by Cloudflare went down as a result.


Integrating Technology into Your Supply Chain: Five Questions You Need to Ask

Unfortunately, many companies learn the hard way that when it comes to supply chain technology, it is not “one size fits all” or “technology for technology’s sake.” Stalled projects, unrealized benefits, disrupted operations, and customer and employee frustration point to the importance of selecting the right kind of emerging technology based on your operating profile and future outlook of your business. By far, the biggest driver of disruption for companies is e-commerce and the extraordinarily high service expectations it is creating. In fact, a recent report from DHL Supply Chain, “The E-Commerce Supply Chain: Overcoming Growing Pains,” found that pressure to fulfill customer expectations continues to challenge businesses building out e-commerce offerings and the new supply chains they need. Customers expect a great, painless e-commerce purchase experience with an ever-shortening delivery time. We are noticing profile changes in other market verticals as well, as order sizes decrease and service expectations increase.


Celebrating the Indian CEO

British consumer goods and healthcare giant Reckitt Benckiser Group Plc recently made some waves when it named PepsiCo’s Laxman Narasimhan as its next chief executive officer, looking outside its own ranks for a new leader after a difficult few years. Interestingly, he replaces Rakesh Kapoor, another Indian who had been at the helm for eight years. Narasimhan’s appointment is the latest in a series of appointments of Indian-origin CEOs at top global firms in the last decade or so. Think – Vasant Narasimhan (Novartis), Sundar Pichai (Google), Satya Nadella (Microsoft), Shantanu Narayen (Adobe), Ajay Banga (Master Card), Ivan Menezes (Diageo), Sonia Syngal (Old Navy), Rajeev Suri (Nokia) and more recently Vivek Sankaran, President & CEO of Albertsons Companies and Nitin Paranjpe, global COO at Unilever. There’s no ignoring the trend of Indians making it to global leadership roles at multinational firms. And while this trend has largely been noticed in the tech firmament and Silicon Valley, where the geek background helped in no small measure, the other industries surprisingly are not far behind.



Quote for the day:


"Limitations are what someone else tries to impose on you. Don't accept it. Question it!" -- Elizabeth McCormick


Daily Tech Digest - August 08, 2019

VR is ahead for now, but AR will be a larger market in the long run

Completing almost any process, whether in the personal or business sphere, usually involves some sort of digital interface. One example would be navigating using Google Maps to an unfamiliar location; a restaurant or a bar. A business equivalent of this could be in a warehouse, where an employee needs to navigate to the right shelf to pick up or deposit an item. On the consumer side, another example includes performing an oil change on a vehicle and on the business side, an equivalent could be maintaining an elevator system. Fixing the water pressure on your boiler… the list of applications is really endless. All of these processes require a lot of knowledge, sometimes specialist and sometimes not. But, the point is that all of these tasks can be performed with the aid of augmented reality — eventually. Currently, the use of AR in the above scenarios is being held back by the price, the design of the headsets, their ease of use and various cultural hurdles; a lack of understanding, for one. With AR, everything in both a business and a social sense can become a lot more efficient and accurate.



Data and AI Power the Future of Customer Engagement in Financial Services

The beauty of digital engagement driven by data, advanced analytics and AI is that a dialogue can be created across a broad range of topics. Instead of trying to sell a limited range of products in a program-based environment, all products and services can be included in the conversation, based on identified need. Each interaction is based on the individual profile, preferences and behavior of the consumer. With the integration of chatbots, voice and live agents, consumers can provide feedback on each communication and interaction, allowing models to improve and become even more personalized over time. The learnings during the process not only make the engagement more personal, they make it more powerful, because recommendations will be more accurate. While what’s learned will potentially increase the amount of dialogue over time, personalization and contextualization improve, as do results and revenues. More importantly, the learnings will be shared across the organization, making every contact point more intelligent and consistent.


How to Fit Smart Home Technology into Your Business


Smart home technology comes in many forms, and some of the most popular applications include automated lights, locks, and thermostats. Now more consumers are using virtual assistants to tie all of their connected smart devices together into one cohesive smart home. Today, 27% of people in the U.S. currently own and use a virtual assistant, such as the Amazon Echo or Google Home. Consumers primarily use them to connect with and command apps without picking up their phones. However, companies can also utilize virtual assistants to schedule meetings, scan through emails, and (like consumers) get important data and information on the fly. For example, rather than ask your assistant about the weather, you could ask what last quarter’s revenue looked like compared to the quarter before it. At a 2018 data and analytics summit, Gartner analyst Svetlana Sicular pointed out that AI still needs further development before it can emulate human-to-human conversation.


Cloud adoption questions every IT admin should ask


Changing tools or platforms is akin to moving out of a house. You don't recall what's stored in the basement, nor are you prepared for how long it will take to go through every forgotten box. Cloud migration won't be a quick project, and it starts with an inventory of what's in place, if those items are needed and how -- or even if -- they will work on the new platform. While these legacy on-prem deployments are complex and often underdocumented, the IT admin can migrate or even alter components of the system as needed -- not so when a cloud provider controls the physical data center, security and other aspects of IT infrastructure. Cloud service adoption is a struggle as the business is accustomed to IT staff using their collective knowledge and authority to solve any problem. When something breaks at the cloud provider's end, IT must log the issue with the vendor and wait for resolution.


Survival skills for the digital age


People no longer need to be working from a single desk for the sake of presenteeism. Instead, they are tethered to a suite of devices and platforms that were designed by humans but are not human themselves: Intelligent technology is more like a third person in the room with a person and her device, but it should not be confused with another human. My research showed that it is common for people to spend more than 75 percent of the workday on at least three devices, often more. Financial Times journalist Hannah Kuchler, who covers Silicon Valley, told me that “overall, [people] do things dictated by tools they have, and none of the productivity tools in this new collaborative era really encourage proper thinking.” There’s the example of Slack. Answering Slack messages is no different from answering emails, and neither involves deep thinking, which is a prerequisite for coming up with innovative, value-added solutions. The office-less world requires a new set of digital survival skills, which are vital to preventing burnout.


Digital Transformation And The Cognitive Enterprise

The impact of digital transformation on talent strategies brings opportunities, as well as challenges. The digitisation of work and life is driving a need for new types of talent with new skills. Yet, despite the pressing need for digitally fluent employees to deliver innovation at an extraordinary rate and pace, talent with in-demand skills is limited. To ensure we have digitally skilled talent in place to drive organisations forward, there is a critical need for exponential learning. Leaders must also actively create an internal culture that supports the need for agility and constant change management amid ongoing disruption. HR has a vital role to play in supporting the attraction and delivery of digitally fluent talent. An urgent requirement is to offer personalised learning to employees – AI-enabled platforms can really elevate this experience. HR must prioritise current skills development; for enterprises, the skills gap is very real. Of the global executives we spoke to in our recent CHRO research, 60% say they are struggling to keep their workforce current and relevant.


Leadership In A Digital Age Is Fundamentally Different

Our research showed two powerful forming elements for great leaders in corporations thriving with digital transformation. Firstly, the winners (capturing 72% of the available returns) see everybody in the organization as being mutually responsible to each other all the time, every moment of every day. Secondly, thriving leaders recognize that scripted customer journeys don't work in a world where customers or consumers can choose which of a thousand 'moments' they want to act in. This means we need to be continuously connected and aware of how each of the moments that might matter is being handled. These organizations have worked out how to do both of these things, and how to deliver power to the people and processes that have to work in each moment. That is super scary for command-and-control cultures, for obvious reasons. The organizations that get this right drive 40% better OPEX performance and see 25%+ improvements in total revenue compared with industry peers.


The Challenges in Integrating Cross-Boundary Teams

When it comes to building a team, two prominent approaches exist: one is to map its sequential growth, and the other is to let the team confront issues in a non-sequential way. In the sequential growth approach, the team first interacts slowly, learns the tasks and goals, and then progresses competitively towards the goal. It involves the surfacing of conflicts, their resolution, acceptance, and then moving towards performance. By contrast, a non-sequential team moves in a flexible way; it doesn't follow the slow, progressive pattern we see in sequential teams. Such a team is hence usually an association of people who already know each other: those who have worked together previously or who understand each other deeply. When assembling cross-functional teams, it is essential to understand the role of team mental models and transactive memory in enhancing team effectiveness. Team mental models give members a shared, organized understanding of how key elements interact within the team environment, shaping how each individual member perceives these factors.


R. Michael Anderson reveals the key to (tech) leadership

Before a person can become a successful leader, in technology or elsewhere, Anderson advocates improving the “relationship with ourselves. You can’t be a strong leader until you’re really strong with a relationship with yourself.” “People want to be led, and they want to be led by somebody that they respect. If there’s somebody that’s insecure and doesn’t respect themselves, nobody’s going to respect or have loyalty to them. “The type of people who have a leadership presence are assertive, respect themselves and have confidence in themselves; those are the ones who inspire loyalty and confidence. “Leaders need bold goals, a purpose or mission, and the guts to be able to stick with it, even when things are getting some resistance — inner resilience is key.” ... Good programmers are analytical, “sometimes introverted” and hard-working. But to be a successful leader, these traits (bar being introverted) alone are not enough.


Amazon's PartiQL query language eyes all data sources


Amazon created PartiQL to meet internal demand for multi-source data queries across structured, semi-structured and unstructured data. Those needs came from various corners of Amazon's business, including its retail arm. Amazon released PartiQL's tutorial materials, specification and reference implementation under the Apache 2.0 license. AWS has used PartiQL internally for services such as S3 Select, Glacier Select and Redshift Spectrum, and it was adopted as the query language for Quantum Ledger Database, which AWS launched last year. PartiQL is compatible with standard SQL, which means enterprises can use existing queries with PartiQL in conjunction with SQL query processors. It also treats nested data as a first-class citizen, and doesn't require predefined schemas to be placed on a data set. PartiQL does contain SQL extensions, but they are minimal and simple for DBAs and developers to understand, AWS claims. Finally, PartiQL is data format-independent, which means one query applies to JSON, ORC, CSV and other data types.
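The nested-data handling described above can be illustrated with a short query sketch, loosely modeled on the examples in the public PartiQL tutorial (the table and attribute names here are hypothetical): a nested array inside each record is unnested directly in the FROM clause, with no predefined schema required.

```sql
-- Sketch of a PartiQL query over schema-less, nested records.
-- Assume each record in the hypothetical table hr.employees looks like:
--   {'id': 3, 'name': 'Bob Smith',
--    'projects': [{'name': 'AWS Redshift security'}, {'name': 'AWS Aurora security'}]}
SELECT e.id,
       e.name AS employeeName,
       p.name AS projectName
FROM hr.employees AS e,
     e.projects AS p          -- correlated FROM item unnests the 'projects' array
WHERE p.name LIKE '%security%'
```

Because the FROM clause can range over a nested collection of another FROM item, the same query shape works whether the underlying records are stored as JSON, ORC or another supported format.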




Quote for the day:

"A leader takes people where they would never go on their own." -- Hans Finzel