For decades, AGI has been the main goal driving AI forward. The world will change in unimaginable ways when we create AGI. Or should I say if? How close are we to creating human-level intelligent machines? Some argue that it'll happen within decades, and many expect to see AGI within our lifetimes. And then there are the skeptics. Hubert Dreyfus, one of the leading critics, argued that "computers, who have no body, no childhood and no cultural practice, could not acquire intelligence at all." For now, it seems that AI research isn't even heading in the right direction to achieve AGI. Yann LeCun, Geoffrey Hinton, and Yoshua Bengio, winners of the 2018 Turing Award (the Nobel Prize of AI), say we need to imbue these systems with common sense, and we're not close to that yet. They say machines need to learn without labels, as kids do, using self-supervised learning (also called unsupervised learning). That would be the first step. However, there's still too much we don't understand about the brain to attempt building AGI. Some say we don't need to create conscious machines to equal human intelligence.
In the nebula of Chinese-speaking threat actors, it is quite common to see tools and methodologies being shared. One such example is the infamous "DLL side-loading triad": a legitimate executable, a malicious DLL to be side-loaded by it, and an encoded payload, generally dropped from a self-extracting archive. Initially considered the signature of LuckyMouse, we have since observed other groups, such as HoneyMyte, using similar "triads". While this implies that attacks cannot be attributed on the basis of this technique alone, it also follows that efficient detection of such triads reveals more and more malicious activity. ... Taking a step back from the FoundCore malware family, we looked into the various victims we were able to identify to try to gather information about the infection process. In the vast majority of the incidents we discovered, FoundCore executions were preceded by the opening of malicious RTF documents downloaded from static.phongay[.]com. All of them were generated using RoyalRoad and attempt to exploit CVE-2018-0802.
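The triad pattern lends itself to simple file-system heuristics. As a toy sketch (not taken from the report; the file names, extension checks, and entropy threshold are all assumptions), one could flag directories that contain an executable, a DLL, and a high-entropy blob that may be an encoded payload:

```python
import math
import os


def shannon_entropy(data: bytes) -> float:
    """Bits per byte; encoded/encrypted payloads tend toward 8.0."""
    if not data:
        return 0.0
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    total = len(data)
    return -sum(c / total * math.log2(c / total) for c in counts if c)


def find_triads(directory: str, entropy_threshold: float = 7.0):
    """Return (exe, dll, blob) combinations found in one directory.

    Heuristic only: a real detection would also check signatures,
    import tables, and known side-loadable binaries.
    """
    names = os.listdir(directory)
    exes = [n for n in names if n.lower().endswith(".exe")]
    dlls = [n for n in names if n.lower().endswith(".dll")]
    blobs = []
    for n in names:
        if n.lower().endswith((".exe", ".dll")):
            continue
        with open(os.path.join(directory, n), "rb") as f:
            # Sample the first 4 KiB; enough to spot opaque data.
            if shannon_entropy(f.read(4096)) >= entropy_threshold:
                blobs.append(n)
    if exes and dlls and blobs:
        return [(e, d, b) for e in exes for d in dlls for b in blobs]
    return []
```

In practice such a scan only surfaces candidates for analyst review; the point of the excerpt is that the co-occurrence of the three components is itself a useful signal.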
Autonomy for teams to work with their microservices is a crucial benefit of architecting cloud-native apps. Using independent database instances gives teams the flexibility to roll out updates, security patches, and bug fixes in production without breaking other microservices. Cloud-native app architecture takes inspiration from the well-known twelve-factor app methodology. One factor, "Backing Services," states that ancillary resources like data stores, caches, and message brokers should be exposed via an addressable URL. Cloud providers offer a rich assortment of managed backing services. Instead of owning and maintaining the database yourself, we recommend checking out the available database options in the cloud. ... Monolithic apps can talk to microservices if their endpoints are reachable within the infrastructure, or securely via a public endpoint. Microservices and their data can be consumed either synchronously via their endpoints or asynchronously through messaging such as an event bus. As part of modernizing techniques, we recommend the strangler pattern, which helps in incrementally migrating a legacy system.
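To make the "Backing Services" factor concrete, here is a minimal sketch (the `DATABASE_URL` variable name and the helper function are illustrative assumptions, not part of the twelve-factor text) of a service resolved purely from configuration, so that swapping a self-hosted database for a managed cloud one becomes a config change rather than a code change:

```python
import os
from urllib.parse import urlsplit


def backing_service(env_var: str, default_url: str) -> dict:
    """Resolve a backing service strictly from configuration.

    The service is addressed by URL, so local and managed cloud
    instances are interchangeable attached resources.
    """
    url = urlsplit(os.environ.get(env_var, default_url))
    return {
        "scheme": url.scheme,
        "host": url.hostname,
        "port": url.port,
        "path": url.path.lstrip("/"),
    }


# The same code attaches to a local or a managed cloud database;
# only the environment variable changes between deployments.
db = backing_service("DATABASE_URL", "postgres://localhost:5432/orders")
```

Pointing `DATABASE_URL` at a managed instance attaches the app to the cloud service with no code modification, which is exactly the flexibility the factor is after.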
Failure is part of every entrepreneur's journey. When you care deeply about an idea, it can be discouraging to encounter people who don't share or see your vision. Here are a few tips for staying the course when things aren't going your way at first. ... Alba recruited friends at every step of the way who served as her sounding board. These people didn't baby her and give her false hope; they asked the hard questions that exposed each and every possible weakness. Rely on trusted friends and confidantes to give you tough love, and your pitch will come off stronger to those who will have the final say. ... At first, everyone told Alba she should start with one product, then expand once that was successful. But this didn't gel with Alba's vision of a complete line of baby-safe products; the founder knew that parents who wanted clean products wanted a brand that could provide multiple solutions. Ultimately, Alba ignored the conventional advice and launched with 17 products, which many people believed was too many. But because she refused to compromise, either for venture capitalists or for herself, the launch was a total success.
The sudden shift to remote working was unexpected, but in most cases it was surprisingly well implemented. After months of remote working, let's look at the progress being made by remote development teams. A recently published report on 50 remote agile development teams showed mixed results: 92% of teams are writing more code, up by an average of 10%, which sounds good. Unfortunately, 63% of teams are releasing less frequently, with the total number of releases down by a worrying 21%. On top of this, the average release size is up by 64%, increasing risk and time to value. So before the COVID-19 pandemic, we had frequent, small releases and were very agile. Now we have infrequent, high-risk, large releases. This is not the ideal situation for agile, newly remote teams. ... First, review your remote team situation. Because we have lost the benefits of colocation, where constant interaction, easy pairing and water-cooler conversations aid teamwork, we need to address collaboration in other ways. ... Remote working is a skill that requires time and effort to develop. Video conferencing is a great way to engage with your team.
Microsoft investigated liquid immersion as a cooling solution for high-performance computing applications such as AI. Among other things, the investigation revealed that two-phase immersion cooling reduced any given server's power consumption by 5% to 15%. The findings motivated the Microsoft team to work with Wiwynn, a datacenter IT system manufacturer and designer, to develop a two-phase immersion cooling solution. The first such solution is now running at Microsoft's datacenter in Quincy, Washington. The couch-shaped tank is filled with an engineered fluid from 3M. 3M's liquid cooling fluids have dielectric properties that make them effective insulators, allowing the servers to operate normally while fully immersed. This shift to two-phase liquid immersion cooling enables increased flexibility for the efficient management of cloud resources, according to Marcus Fontoura, a technical fellow and corporate vice president at Microsoft who is the chief architect of Azure compute. For example, software that manages cloud resources can allocate sudden spikes in datacenter compute demand to the servers in the liquid-cooled tanks.
When asked to identify intended users for their AI tools and technologies, over half of respondents named clinicians as target users, with healthcare providers a close second. This is a big leap from AI being used primarily by data scientists and IT professionals, as was common in years past. This trickle-down effect extends even further when you consider the customers of mature organizations' AI tools. ... As advances and applications of AI technologies grow, so do their intended user bases, so it's important for all organizations to consider who they're tailoring usability to. A patient interacting with a chatbot to schedule an appointment is very different from a radiologist using NLP to analyze the results of an X-ray, and those differences need to be evaluated when imagining the user experience. All organizations should take this into account, whether they've been deploying solutions for years or are just getting started. As AI becomes more commercialized, newer players will take the lead from more mature companies that have had to evolve their customer base over the years.
Employees in Ireland are already protected by a number of labor laws. For example, they are not allowed to work more than 48 hours per week on average, except in very limited circumstances. The right to disconnect established in the new code, however, does not constitute a legal obligation: although the code's recommendations will be admissible as evidence in court proceedings, failure to abide by the rules will not constitute an offence. Rather, the code of practice should be seen as a guide that helps employers and employees come up with appropriate working arrangements together. This does not mean that all employees should start inflexibly working a nine-to-five schedule. The code of practice encourages employers to develop a "Right to Disconnect Policy" that informs workers of the normal working hours they will reasonably be expected to keep, but also makes room for the occasional emergency that requires contacting staff outside of their workday, for example to fill in at short notice for a sick colleague. Any new policy should also acknowledge that some roles come with unconventional hours, such as those spanning different time zones or requiring international travel.
The move towards greater use of the cloud has been accompanied by growing concerns about the management and protection of data. Cyber threats continue to evolve and accelerate, and the skills required to defend against them are becoming more complex. Regulations such as the GDPR bring additional rights and safeguards for individuals, but the move towards cloud IT could expose a compliance gap, especially for organisations that handle personal data. Organisations that host their data on-premise in local storage systems should be able to identify the location of most, hopefully all, of their data quite quickly, whereas those that host data elsewhere may not know where it is stored. However, one of the challenges with public cloud adoption is the skills required to build and maintain it. Do organisations have the skills to ensure that data stored in the cloud is secure and compliant? For many organisations, meeting compliance and regulatory requirements can be easier to achieve using private clouds. However, just because organisations have outsourced their data storage, it doesn't mean they can outsource responsibility for compliance.
In cities like Chicago, the citizens of crime-ravaged communities fear the criminals more than they trust the police. The relationships between these communities and law enforcement are so strained that citizens do not provide the evidence or testimony needed to successfully prosecute criminals and guarantee deterrence. The same outcome, born of a different history, exists in the lack of coordination between law enforcement and the private organizations being targeted by cybercriminals. The logs and data in systems owned and maintained by these organizations contain critical information that would make successful prosecution of cybercrime the norm, which would deliver deterrence. Building SecOps around the incorrect outcomes of service and data availability has left the craft unprepared to align with law enforcement outcomes. The tools, workflows, and data provide little value to investigators and prosecutors. When an organization does report a crime to law enforcement, the responding agency must comb through a mess of disparate data locations and formats that is more complicated to process than a murder crime scene.
Quote for the day:
"Even the most honest human in authority often does not have the power to undo the damages that bad people do" -- Auliq Ice