The expression of the BIAN model in ArchiMate has been a joint effort by BIAN and The Open Group, the stewards of the ArchiMate standard. The full details of this mapping can be found in the document “ArchiMate® Modeling Notation for the Financial Industry Reference Model: Banking Industry Architecture Network (BIAN)”, published by The Open Group. To explain the use of BIAN in the ArchiMate language, The Open Group has published a case study whitepaper co-authored by one of us (Patrick), which uses the fictitious but realistic Archi Banking Group as an example. In this blog, we want to give you an impression of what this is about, picking and choosing some of the juiciest bits; for the full case study, please refer to the whitepaper. Like most international banks nowadays, Archi Banking Group is the result of the acquisition of several banks in different countries. This has come with the typical challenges of integration and cost control. In particular, its fragmented information is becoming a compliance risk, and the challenges of ‘open banking’ (e.g. PSD2) are difficult to meet.
The reason why minimizing blame is the number one priority for QA engineers is that in the QA realm, there is a general acceptance that bugs are always going to make it to production, no matter what. This is expected because a 100% guaranteed bug-free product would take years to ship rather than weeks, and would therefore be economically unviable. Since they know there will be problems to deal with no matter what they do, QA engineers want to show that they did everything in their power to prevent those problems. Naturally, they want to write as many tests as possible to minimize the risk of bugs they should have caught. But since it’s impossible to write an infinite number of tests, they have to prioritize what to test for. A QA team is given no data by which to prioritize what to test, so this prioritization is essentially a guessing game. It may be an educated guessing game based on experience and expertise, but it is still predicting what users are most likely to do in an application without objective data on what they really care about and how they will really use the application.
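The "educated guessing game" described above is often formalized as risk-based test prioritization: score each area by estimated likelihood of a defect times the impact if it breaks, and test the riskiest areas first. A minimal sketch — the feature names and scores below are hypothetical, standing in for a team's judgment calls:

```python
# Risk-based test prioritization: rank features by estimated
# likelihood of a bug times the impact of a failure.
# All names and scores here are hypothetical examples.

features = [
    # (name, likelihood of a bug 0-1, impact if it breaks 0-1)
    ("checkout flow", 0.3, 1.0),
    ("user settings", 0.5, 0.4),
    ("search",        0.4, 0.7),
    ("legacy export", 0.8, 0.2),
]

def risk(feature):
    """Risk score = likelihood x impact."""
    _, likelihood, impact = feature
    return likelihood * impact

# Spend the testing budget on the riskiest features first.
for name, likelihood, impact in sorted(features, key=risk, reverse=True):
    print(f"{name}: risk={likelihood * impact:.2f}")
```

Without usage data, both numbers in the score are still guesses — which is exactly the author's point — but making the guesses explicit at least lets a team debate and revise them.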
As reported by VentureBeat, Microsoft has promised AI-enhanced innovations which will be able to suppress background noise – in real time – so your call can continue smoothly. Instead of merely reducing the impact that an air conditioning unit has on the call, Teams will aim to suppress other noises not normally covered, such as doors slamming, over-excited typing on a computer keyboard, or my beloved pooch having an inconvenient moment. The keyboard is a case in point: if you’re taking notes during an interview, you ideally don’t want that clickety-clack noise to intrude on the conversation. It’s those noises which aren’t “stationary”, as Microsoft says, that are hard to suppress without AI. It takes hundreds of hours of data to work out what’s desirable and what’s not, using audio books to represent voices and other sources to create those pesky noises. All of which leads to the creation of a neural network that works on the data to sort out what should be heard and what shouldn’t. The power of the cloud can be leveraged to help, providing fast, real-time analysis of what’s going on and deciding what should be heard by the person at the other end of the call and what shouldn’t.
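The stationary/non-stationary distinction is the crux. A constant hum has a fixed spectral fingerprint, so classical spectral gating can remove it with no machine learning at all; a keyboard click or a bark has no fixed profile, which is why a trained model is needed. A toy illustration of the classical case — this is not Microsoft's method, just a synthetic sketch of why stationary noise is the easy part:

```python
import numpy as np

# Toy spectral gating: a stationary 60 Hz hum has a fixed spectrum,
# so we can estimate it once and gate those frequency bins out.
# Signals are synthetic; real speech enhancement works frame by frame.

sr = 8000                                   # sample rate (Hz)
t = np.arange(sr) / sr                      # one second of audio
hum = 0.5 * np.sin(2 * np.pi * 60 * t)      # stationary noise
voice = np.sin(2 * np.pi * 440 * t)         # stand-in for speech
noisy = voice + hum

# 1. Estimate the noise spectrum from a segment containing only noise.
noise_profile = np.abs(np.fft.rfft(hum))

# 2. Gate: keep only bins whose magnitude rises well above the noise floor.
spectrum = np.fft.rfft(noisy)
gate = np.abs(spectrum) > 2 * noise_profile
cleaned = np.fft.irfft(spectrum * gate, n=len(noisy))

# The 60 Hz hum is gone; the 440 Hz "voice" survives.
residual_hum = np.abs(np.fft.rfft(cleaned))[60]
kept_voice = np.abs(np.fft.rfft(cleaned))[440]
```

A door slam spreads energy across all frequencies for a brief moment, so a static profile like `noise_profile` cannot describe it — hence the hundreds of hours of training data.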
The system was not perfect. Among its mistakes, “Those musicians harmonise marvellously” was decoded as “The spinach was a famous singer”, and “A roll of wire lay near the wall” became “Will robin wear a yellow lily”. However, the team found the accuracy of the new system was far higher than previous approaches. While accuracy varied from person to person, for one participant just 3% of each sentence on average needed correcting – lower than the word error rate of 5% for professional human transcribers. But, the team stress, unlike the latter, the algorithm only handles a small number of sentences. “If you try to go outside the [50 sentences used] the decoding gets much worse,” said Makin, adding that the system is likely relying on a combination of learning particular sentences, identifying words from brain activity, and recognising general patterns in English. The team also found that training the algorithm on one participant’s data meant less training data was needed from the final user – something that could make training less onerous for patients.
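The word error rate figures being compared here are conventionally computed as the word-level Levenshtein edit distance (substitutions, insertions, and deletions) divided by the number of words in the reference. A minimal sketch:

```python
# Word error rate (WER): word-level edit distance divided by
# the number of words in the reference transcript.

def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edits to turn the first i reference words
    #            into the first j hypothesis words
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution/match
    return dp[-1][-1] / len(ref)

# One of the article's failure cases: no word survives in order,
# so every reference word counts as an error (WER = 1.0).
print(wer("a roll of wire lay near the wall",
          "will robin wear a yellow lily"))
```

By this metric, the best participant's result corresponds to a WER of roughly 0.03 against the reference sentences, versus about 0.05 for professional transcribers.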
Despite its age, COBOL is reliable and still widely used -- there are an estimated 220 billion lines of COBOL in use today. IBM, one of the founding organizations behind COBOL, continues to offer mainframes compatible with the language. The issue with COBOL now is that there are few programmers left with the skills to maintain legacy COBOL applications. Specifically, state agencies are struggling to find actively working COBOL engineers who can update their unemployment benefit systems to factor in new parameters for unemployment eligibility. To address this skills gap, IBM and the Linux Foundation's Open Mainframe Project have launched a new program to help connect states with programmers who have COBOL language skills that are proving key in the push to manage the surging number of unemployment claims nationwide. ... "We've seen customers need to scale their systems to handle the increase in demand and IBM has been actively working with clients to manage those applications," said Meredith Stowell, VP of IBM Z Ecosystem. "There are also some states that are in need of additional programming skills to make changes to COBOL."
Blockchain interoperability is often viewed as a technical challenge, but there’s a lot more to it than that. The WEF divides it into three aspects: the business, the platform, and the infrastructure. The business aspect encompasses the governance of the blockchain and trust between the two networks, as well as data standardization: to share data, it has to be standardized, but often this homogeneity is focused within a single network as opposed to across networks. Other business aspects include incentives and the legal framework, which can be a bigger challenge across jurisdictions. The platform refers to the blockchain protocol, consensus mechanism, smart contract languages, and how users are authorized and permissioned. The infrastructure looks at the hosting of servers in hybrid clouds, managed blockchains, and whether there are potentially proprietary components that might hinder interoperability. The report also explores projects that implement interoperability, mostly for public blockchains, including the well-known Cosmos and Polkadot. For enterprise blockchain, the WEF referred to Hyperledger Quilt, the open source implementation of Ripple’s Interledger, as well as the Corda Settler.
“Bad actors are using these difficult times to exploit and take advantage of the public and business,” Bryan Ware, CISA’s assistant director for cybersecurity, said in a statement. The agencies warned that hackers were also exploiting growing demand for work-from-home solutions by passing off their malicious tools as remote collaboration software produced by Zoom and Microsoft. Hackers are also targeting the virtual private networks that are allowing an increasing number of employees to connect to their offices, the agencies said. ... “Crowdsourced security platforms are built to simultaneously enable a remote workforce and help organizations maximize their security resources while benefiting from the intelligence and insights of a ‘crowd’ of security researchers,” Bugcrowd CEO Ashish Gupta told VentureBeat. “In the current environment, a lot of companies don’t have the required resources to secure and test remote environments where the majority of business is now taking place.”
Edge intelligence allows a high volume of data to be processed and analyzed, and decisions to be made, locally, without being sent to the cloud. Take, for example, a self-navigating drone: instead of relying on a service hosted in the cloud to tell the drone where to go next, the drone itself is now able to decide its own path in the field, even when connections to cloud-hosted services are not reliable. ... For architects and program leads working on such initiatives within the company, it’s mainly a mindset change regarding how the solution is designed, including the capabilities of the devices on the edge and where the decision-making step in a process happens. Feasibility for scenarios such as the drone automatically calculating its own path instead of relying on a cloud-hosted service is now better than before, and a few demos or proof-of-concept attempts could move many of these stories off the backlog and bring implementation dates forward. While AIoT in its re-imagined, converged form may be new, the two original fields (AI and IoT) that merged to create AIoT are both mature and well into mainstream adoption.
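The drone scenario boils down to a fallback pattern: prefer the richer cloud planner when the uplink is available, and let the on-device model decide when it is not. A minimal sketch of that pattern — all function names here are hypothetical illustrations, and the cloud call is simulated as failing:

```python
# Edge-intelligence fallback: prefer the cloud planner, but decide
# locally when the connection is unreliable. Names are hypothetical.

def cloud_plan_path(position, goal):
    # Stand-in for a call to a cloud-hosted planning service.
    raise ConnectionError("no uplink")  # simulate an unreliable link

def on_device_plan_path(position, goal):
    # Simple local fallback: step one unit straight toward the goal.
    step = [min(max(g - p, -1), 1) for p, g in zip(position, goal)]
    return [p + s for p, s in zip(position, step)]

def next_waypoint(position, goal):
    try:
        return cloud_plan_path(position, goal)
    except (ConnectionError, TimeoutError):
        # Edge intelligence: the drone decides for itself
        # instead of hovering until the link comes back.
        return on_device_plan_path(position, goal)

print(next_waypoint([0, 0], [3, 4]))  # → [1, 1]
```

The mindset change the paragraph describes is exactly choosing where `next_waypoint` runs and how capable the local branch needs to be.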
To companies providing cybersecurity solutions, the polled executives’ advice is to avoid sales pitches that involve fear-mongering, to dial down cold calls and emails, and to concentrate on nurturing existing relationships. “Messaging ought to be geared towards impacting an enterprise’s bottom line or community, rather than attempting to fearmonger or stoke panic over a situation already causing CISOs enough anxiety,” YL Ventures explained. “Cybersecurity executives feel quite unanimously about the marketing frenzy and, according to our sources, are compiling a ‘black list’ of vendors guilty of using this tactic.” Companies should concentrate on discovering what they can do to help their existing customers and discussing their customers’ experiences. Not only will this improve customer relations, but it will also provide helpful information that can inform the vendor’s future plans. Last but not least, vendors should consider making goodwill gestures. “Profiteering off of a world-wide tragedy will do vendors little service in the eyes of prospective customers. 41% of the CISOs we consulted with praised technology companies using their services to help other businesses and advised entrepreneurs to follow in their lead instead,” YL Ventures noted.
The first and most important reason that architecture should not be IT-centric is the same reason why more and more IT functions are merged with ‘business functions’. A popular metaphor was (is?) that information should be like water coming out of a faucet. In that metaphor, the IT department is responsible for developing IT to deliver the information needed by the ‘business’. The business asks for ‘information provisioning’; the IT department delivers. This ‘what — how’ division has been the reason for non-functioning business/IT cooperation in many organisations over the past decades. An enterprise in general does not need ‘information’ as such; it needs resources and technology to execute business processes. The type of technology is not very important from a business perspective: it could be humans doing the job, mechanical or digital technology, and mostly it will be a mesh of all these types. As a side remark: yes, data as a source for data intelligence could be seen as a product delivered by an organisational department, but that is only a small part of the totality of digital technology.
Quote for the day:
"Conviction is worthless unless it is converted into conduct." -- Thomas Carlyle