Strategic Steps for Implementing Generative AI in Your Enterprise

Generative AI (GenAI) has rapidly become a focal point of technological innovation, capturing the attention of enterprises across the globe. While the majority of organisations are still exploring the potential of AI, a select few have already mastered its deployment across various business units, achieving remarkable success. According to Gartner, these AI-savvy organisations represent just 10% of those currently experimenting with AI. However, their experiences provide invaluable insights for those looking to harness GenAI’s power effectively. This blog post outlines a strategic four-step approach to help enterprises implement GenAI in a manner that is both valuable and feasible.

1. Establish Your Vision for GenAI

The foundation of any successful GenAI implementation is a clear and strategic vision. Begin by defining how GenAI will contribute to your enterprise’s overarching goals. Consider the specific benefits you expect GenAI to deliver and how these will be measured. A well-articulated vision aligns your GenAI initiatives with your enterprise’s mission, ensuring that AI efforts are purposeful and integrated into broader business strategies.

For example, if your enterprise aims to enhance customer satisfaction, GenAI can play a crucial role by enabling advanced customer behaviour analytics or deploying virtual customer assistants. By linking GenAI objectives directly to enterprise goals, you foster organisation-wide fluency and pave the way for innovation that yields measurable returns.

2. Remove Barriers to Capturing Value

Once the vision is established, it’s essential to identify and eliminate any organisational barriers that could impede the realisation of GenAI’s potential. These barriers may include regulatory challenges, reputational risks, or competency gaps. Addressing these issues early on is crucial to maximising the value of your GenAI initiatives.

Strategic concerns, such as aligning AI projects with corporate goals, should be documented and addressed through a portfolio approach to AI opportunities. Metrics that serve as proxies for financial and risk outcomes should be selected to provide credibility and guide project maturity. Establishing formal accountability structures, such as a RACI (Responsible, Accountable, Consulted, and Informed) matrix, can further bolster AI results by clarifying roles and responsibilities throughout the AI strategy development and execution process.

By proactively addressing these barriers, you not only mitigate potential risks but also ensure that your GenAI initiatives are aligned with your organisation’s broader goals, increasing the likelihood of success.

3. Assess and Mitigate Risks

Implementing GenAI introduces a unique set of risks that need to be carefully assessed and mitigated. These risks can be broadly categorised into regulatory, reputational, and competency-related concerns. Each of these carries its own set of challenges:

  • Regulatory Risks: As AI technologies evolve, so too does the regulatory landscape. It is critical to stay informed about relevant regulations and ensure that your GenAI projects comply with these requirements. Establishing a collaborative framework between AI practitioners and legal, risk, and security teams can help evaluate the feasibility of AI use cases while maintaining compliance.
  • Reputational Risks: AI systems can be vulnerable to both intentional and unintentional misuse, potentially harming your organisation’s reputation. Implementing robust security measures across your enterprise, ensuring data integrity, and continuously monitoring AI models can help safeguard against these risks.
  • Competency Risks: The rapid pace of AI innovation can create a gap between your organisation’s current technical capabilities and what is required to effectively deploy GenAI. To bridge this gap, align your AI strategy with your cloud strategy, modernise data and analytics infrastructures, and consider creating programmes that foster incremental innovation and reduce technical debt.

By systematically identifying and addressing these risks, you can protect your organisation from potential setbacks and ensure that your GenAI initiatives are both safe and effective.

4. Prioritise Adoption Based on Value and Feasibility

Not all GenAI initiatives are created equal. To maximise the impact of your AI strategy, it is crucial to prioritise projects that offer the greatest value and are most feasible to implement. Begin by evaluating each potential project against a set of criteria, such as technical feasibility, alignment with your organisation’s mission, and the availability of necessary skills and resources.

Rate each project on its feasibility and value, and use these scores to rank initiatives. Projects that score high on both scales are ideal candidates for immediate pursuit, as they are likely to deliver significant business value with a reasonable chance of success. Conversely, projects with low feasibility, despite their potential value, may need to be reconsidered or postponed until the necessary conditions are in place.
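
To make this scoring exercise concrete, here is a minimal Python sketch that ranks a handful of hypothetical GenAI initiatives by a weighted combination of value and feasibility. The initiative names, weights and the "pursue" threshold are illustrative assumptions, not prescribed values.

```python
# Minimal sketch of value/feasibility prioritisation.
# Initiative names, scores, weights and thresholds are illustrative assumptions.

initiatives = [
    {"name": "Virtual customer assistant", "value": 8, "feasibility": 7},
    {"name": "Customer behaviour analytics", "value": 9, "feasibility": 5},
    {"name": "Automated contract drafting", "value": 6, "feasibility": 3},
]

VALUE_WEIGHT = 0.6        # assumed weighting: value counts slightly more
FEASIBILITY_WEIGHT = 0.4  # than feasibility in this example
PURSUE_THRESHOLD = 6.0    # assumed cut-off for "pursue now"

def priority_score(item):
    """Weighted score combining value and feasibility (both rated 1-10)."""
    return VALUE_WEIGHT * item["value"] + FEASIBILITY_WEIGHT * item["feasibility"]

for item in sorted(initiatives, key=priority_score, reverse=True):
    score = priority_score(item)
    decision = "pursue now" if score >= PURSUE_THRESHOLD else "reconsider or postpone"
    print(f'{item["name"]}: score {score:.1f} -> {decision}')
```

In practice the weights and threshold would be agreed with stakeholders, but even a simple ranking like this makes the trade-off between value and feasibility explicit and auditable.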

By taking a methodical approach to prioritisation, you can ensure that your resources are directed towards the most promising GenAI initiatives, leading to more effective and impactful AI adoption.

Conclusion: A Strategic Approach to GenAI Implementation

Successfully implementing Generative AI in your enterprise requires more than just technical expertise—it demands a strategic approach that aligns AI initiatives with your business goals, removes barriers to value capture, mitigates risks, and prioritises projects based on their potential impact. By following the four steps outlined in this guide—establishing a clear vision, removing obstacles, assessing risks, and prioritising initiatives—you can set the stage for a GenAI strategy that drives real, measurable benefits for your organisation.

As with any transformative technology, the key to success lies in careful planning and execution. By learning from the experiences of AI pioneers and applying these best practices, your enterprise can navigate the complexities of GenAI adoption and unlock its full potential to drive innovation and growth.

Navigating the Trough of Disillusionment

A Guide to Sustained Success in Business Vision, Strategy, and Technology Delivery

The Trough of Disillusionment in Business Vision, Strategy, and Technology Delivery

In the dynamic, innovative and interwoven landscape of business and technology, the concept of the “trough of disillusionment” stands as a critical phase that organisations must navigate to achieve long-term success. Coined by the research and advisory firm Gartner, this term is part of the “Hype Cycle,” which describes the typical progression of new technologies from innovation to mainstream adoption. The trough of disillusionment specifically represents a period where inflated expectations give way to a more sober, realistic assessment of a technology’s capabilities and limitations. Understanding this phase is crucial for shaping effective business vision, strategy, and technology delivery.

The Hype Cycle and the Trough of Disillusionment

The Hype Cycle is divided into five key stages:

  1. Innovation Trigger: A breakthrough, product launch, or other event generates significant press and interest.
  2. Peak of Inflated Expectations: Early publicity produces a number of success stories—often accompanied by scores of failures.
  3. Trough of Disillusionment: Interest wanes as experiments and implementations fail to deliver. Producers of the technology shake out or fail. Investments continue only if the surviving providers improve their products to the satisfaction of early adopters.
  4. Slope of Enlightenment: More instances of how the technology can benefit the enterprise start to crystallise and become more widely understood.
  5. Plateau of Productivity: Mainstream adoption starts to take off. Criteria for assessing provider viability are more clearly defined. The technology’s broad market applicability and relevance are clearly paying off.

The Trough of Disillusionment in Business Vision

In the context of business vision, the trough of disillusionment is a reality check that tests the resilience and adaptability of organisational goals. Visionary leaders often set ambitious targets based on the initial promise of new technologies. However, as these technologies face real-world challenges and fail to meet sky-high expectations, the resultant disillusionment can lead to strategic pivoting.

Leaders must anticipate this phase and prepare to manage the potential decline in enthusiasm and support. This involves:

  • Realistic Goal Setting: Establishing achievable milestones and preparing for potential setbacks.
  • Stakeholder Communication: Maintaining transparent communication with stakeholders to manage expectations and reinforce long-term vision despite short-term disappointments.
  • Flexibility and Adaptability: Being ready to pivot strategies based on new insights and developments during the disillusionment phase.

The Trough of Disillusionment in Business Strategy

Strategically, the trough of disillusionment necessitates a recalibration of efforts and resources. Businesses must:

  • Evaluate and Learn: Critically analyse why initial implementations fell short. Was it due to technology immaturity, unrealistic expectations, or lack of necessary infrastructure?
  • Refine Use Cases: Focus on identifying practical, high-value use cases where the technology can realistically deliver benefits.
  • Resource Management: Reallocate resources to areas with a higher likelihood of successful outcomes, potentially slowing down investments in more speculative projects.

Strategists must balance the initial enthusiasm with a grounded approach that incorporates lessons learned during the disillusionment phase. This balanced approach ensures that when the technology matures, the organisation is well-positioned to capitalise on its potential.

The Trough of Disillusionment in Technology Delivery

For technology delivery teams, the trough of disillusionment is a period of introspection and iterative improvement. During this phase, the emphasis shifts from innovation to execution:

  • Improving Product Quality: Focus on addressing the shortcomings of the technology, such as stability, scalability, and usability.
  • Enhanced Training and Support: Providing better training and support for users to maximise the technology’s current capabilities.
  • Incremental Development: Adopting an incremental approach to development, where continuous feedback and iterations help refine the technology and its applications.

Delivery teams must maintain a commitment to excellence and incremental improvement, recognising that sustained effort and adaptation are key to moving through the trough of disillusionment towards the slope of enlightenment.

Conclusion

The trough of disillusionment, while challenging, is a natural and necessary phase in the adoption of new technologies. For businesses, it offers a reality check that can lead to more sustainable, long-term success. By setting realistic expectations, maintaining transparent communication, and being willing to adapt and learn, organisations can navigate this phase effectively. In technology delivery, a focus on incremental improvements and user support ensures that when the technology matures, it can deliver on its early promise. Ultimately, understanding and managing the trough of disillusionment is essential for leveraging new technologies to achieve lasting business success.

Beyond Timelines and Budgets: The Vital Quest for Purpose in Innovation

Building for Impact: The Essential Lesson from Eric Ries

We live in a fast-evolving world. Within this world of innovation and entrepreneurship, Eric Ries’ poignant question resonates deeply: “What if we found ourselves building something that nobody wanted? In that case, what did it matter if we did it on time and on budget?” This question, at the heart of Ries’ philosophy in the Lean Startup methodology, serves as a critical reminder of the importance of not just building, but building something that matters.
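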

The Pitfall of Misplaced Priorities

In the pursuit of success, it’s easy to get caught up in the metrics that traditionally define progress: adherence to timelines, staying within budget, and completing tasks with precision. Whilst these aspects are undoubtedly important, they risk becoming the sole focus, overshadowing the fundamental question of whether the project or product in development truly meets a need or solves a real problem. Ries challenges us to shift our focus from simply completing tasks to ensuring that what we are building has inherent value and demand.

The Lean Startup Approach

At the core of the Lean Startup methodology is the concept of building, measuring, and learning in rapid, iterative cycles. This approach encourages entrepreneurs and innovators to validate their ideas and assumptions through continuous feedback from their target audience. The goal is to learn what customers really want and need before investing too much time, energy, and resources into a product or service that may not find its market. This philosophy not only saves valuable resources but also steers projects in a direction more likely to achieve meaningful impact.

Illustrating the Impact with Case Studies

  • Dropbox: Before Dropbox became a household name, its founder Drew Houston realised the importance of validating the market need for a cloud storage solution. Initially, instead of fully developing the product, he created a simple video demonstrating how Dropbox would work. The overwhelming positive response to this video was a clear indication of market demand, guiding the team to proceed with confidence. This early validation saved significant resources and positioned Dropbox to meet its users’ real needs effectively.
  • Zappos: Zappos, now a leading online shoe and clothing retailer, began with a simple experiment to test market demand. Founder Nick Swinmurn initially posted pictures of shoes from local stores online without actually holding any inventory. When a pair was ordered, he would purchase it from the store and ship it to the customer. This lean approach to validating customer interest in buying shoes online allowed Zappos to scale confidently, knowing there was a genuine demand for their business model.
  • Pebble Technology: Pebble Technology’s approach to validating the demand for their smartwatch is a modern example of leveraging community support through crowdfunding. Before mass-producing their product, Pebble launched a Kickstarter campaign to gauge interest. The campaign not only surpassed its funding goal but also became one of the most successful Kickstarter campaigns at the time. This validation through crowdfunding underscored the market’s desire for their product, enabling Pebble to proceed with a clear indication of customer demand.

The Importance of Building Something Wanted

The essence of Ries’ question underscores a fundamental truth in both business and personal endeavours: the importance of purpose and relevance. Building something that nobody wants is akin to solving a problem that doesn’t exist—it may be an impressive feat of engineering, creativity, or organisation, but it misses the mark on making a difference in the world. The measure of success, therefore, should not only be in the completion of the project itself but in its ability to address real needs and improve lives.

Embracing Flexibility and Adaptation

Adopting a mindset that prioritises impact over mere completion requires a willingness to be flexible and adapt to feedback. It means being prepared to pivot when data shows that the original plan isn’t meeting the needs of the market. This adaptability is crucial in navigating the unpredictable waters of innovation, where the true north is the value created for others.

Measuring and Evaluating the Relevance

Measuring and evaluating the relevance of a product or service is crucial for ensuring that it meets the actual needs of its intended users and can achieve success in the marketplace. This process involves several strategies and tools designed to gather feedback, analyze market trends, and adjust to user expectations. Below are additional insights on how to effectively carry out this evaluation.

  • Customer Feedback and Engagement
    • Surveys and Questionnaires: Regularly conduct surveys to gather insights directly from your users about their experiences, preferences, and any unmet needs. Tailor these tools to collect specific information that can guide product development and improvement.
    • User Interviews: Conduct in-depth interviews with users to understand their pain points, the context in which they use your product, and their satisfaction levels. These interviews can uncover detailed insights not evident through surveys or data analysis alone.
    • Social Media and Online Forums: Monitor social media platforms and online forums related to your industry. These channels are rich sources of unsolicited feedback and can reveal how users perceive your product and what they wish it could do.
  • Usability Testing
    • Prototype Testing: Before full-scale production, use prototypes to test how potential users interact with your product. Observing users as they navigate a prototype can highlight usability issues and areas for improvement.
    • A/B Testing: Implement A/B testing to compare different versions of your product or its features. This method can help identify which variations perform better in terms of user engagement, satisfaction, and conversion rates.
  • Analyzing Market Trends
    • Competitor Analysis: Keep a close watch on your competitors and their offerings. Understanding their strengths and weaknesses can help you identify gaps in the market and opportunities for differentiation.
    • Market Research Reports: Leverage industry reports and market research to stay informed about broader trends that could impact the relevance of your product. This includes shifts in consumer behavior, technological advancements, and regulatory changes.
  • Metrics and Analytics
    • Usage Metrics: Track how users are interacting with your product through metrics such as daily active users (DAUs), session length, and feature usage rates. These indicators can help you understand which aspects of your product are most valuable to users.
    • Churn Rate: Monitor your churn rate closely to understand how many users stop using your product over a given period. A high churn rate can signal issues with product relevance or user satisfaction.
    • Customer Lifetime Value (CLV): Calculating the CLV provides insights into the long-term value of maintaining a relationship with your customers. This metric helps assess whether your product continues to meet users’ needs over time (a minimal sketch of the churn and CLV calculations follows this list).
  • Feedback Loops and Continuous Improvement
    • Implement Continuous Feedback Loops: Establish mechanisms to continuously gather and act on feedback. This could involve regular updates based on user input, as well as ongoing testing and iteration of your product.
    • Pivot When Necessary: Be prepared to pivot your product strategy if significant feedback indicates that your product does not meet market needs as expected. Pivoting can involve changing your target audience, adjusting key features, or even redefining your value proposition.
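
To make the churn-rate and CLV metrics above concrete, here is a minimal Python sketch using common textbook formulas. The input figures are invented for illustration, and real products typically use more sophisticated, cohort-based definitions.

```python
# Minimal sketch of two common relevance metrics.
# The input figures below are invented for illustration only.

def churn_rate(customers_at_start, customers_lost):
    """Share of customers lost over the period."""
    return customers_lost / customers_at_start

def customer_lifetime_value(avg_monthly_revenue, gross_margin, monthly_churn):
    """Simple CLV approximation: margin per month divided by churn rate."""
    return (avg_monthly_revenue * gross_margin) / monthly_churn

monthly_churn = churn_rate(customers_at_start=2_000, customers_lost=100)  # 5%
clv = customer_lifetime_value(avg_monthly_revenue=30.0,
                              gross_margin=0.7,
                              monthly_churn=monthly_churn)

print(f"Monthly churn rate: {monthly_churn:.1%}")
print(f"Approximate customer lifetime value: £{clv:.2f}")
```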

The Ultimate Goal: Making a Difference

Ultimately, the question posed by Eric Ries invites us to reflect on why we embark on the projects we choose. Are we building to simply see our plans materialise, or are we driven by a desire to make a tangible difference in the world? The true reward lies not in the accolades for completing a project on time and within budget but in the knowledge that what we have built serves a greater purpose.

As we navigate the complexities of bringing new ideas to life, let us keep this lesson at the forefront of our minds. By ensuring that what we build is truly wanted and needed, we not only enhance our chances of success but also contribute to a world where innovation and impact go hand in hand.

In conclusion, effectively measuring and evaluating the relevance of a product or service is an ongoing process that requires a combination of direct user engagement, market analysis, and data-driven insights. By staying attuned to the needs and feedback of your users and being willing to adapt based on what you learn, you can ensure that your product remains relevant and successful in meeting the evolving demands of the market.

Embracing the Bimodal Model: A Data-Driven Journey for Modern Organisations

With data being the lifeblood of organisations, the emphasis on data management keeps organisations on a continuous search for innovative approaches to harness and optimise the power of their data assets. In this pursuit, the bimodal model is a well-established strategy that can be successfully employed by data-driven enterprises. This approach combines the stability of traditional data management with the agility of modern data practices, providing a delivery methodology that facilitates rapid innovation and resilient technology service provision.

Understanding the Bimodal Model

Gartner states: “Bimodal IT is the practice of managing two separate, coherent modes of IT delivery, one focused on stability and the other on agility. Mode 1 is traditional and sequential, emphasising safety and accuracy. Mode 2 is exploratory and nonlinear, emphasising agility and speed.”

At its core, the bimodal model advocates for a dual approach to data management. Mode 1 focuses on the stable, predictable aspects of data, ensuring the integrity, security, and reliability of core business processes. This mode aligns with traditional data management practices, where accuracy and consistency are paramount. On the other hand, Mode 2 emphasizes agility, innovation, and responsiveness to change. It enables organizations to explore emerging technologies, experiment with new data sources, and adapt swiftly to evolving business needs.

Benefits of Bimodal Data Management

1. Optimised Performance and Stability: Mode 1 ensures that essential business functions operate smoothly, providing a stable foundation for the organization.

Mode 1 of the bimodal model is dedicated to maintaining the stability and reliability of core business processes. This is achieved through robust data governance, stringent quality controls, and established best practices in data management. By ensuring the integrity of data and the reliability of systems, organizations can optimise the performance of critical operations. This stability is especially crucial for industries where downtime or errors can have significant financial or operational consequences, such as finance, healthcare, and manufacturing.

Example: In the financial sector, a major bank implemented the bimodal model to enhance its core banking operations. Through Mode 1, the bank ensured the stability of its transaction processing systems, reducing system downtime by 20% and minimizing errors in financial transactions. This stability not only improved customer satisfaction but also resulted in a 15% increase in operational efficiency, as reported in the bank’s annual report.

2. Innovation and Agility: Mode 2 allows businesses to experiment with cutting-edge technologies like AI, machine learning, and big data analytics, fostering innovation and agility in decision-making processes.

Mode 2 is the engine of innovation within the bimodal model. It provides the space for experimentation with emerging technologies and methodologies. Businesses can leverage AI, machine learning, and big data analytics to uncover new insights, identify patterns, and make informed decisions. This mode fosters agility by encouraging a culture of continuous improvement and adaptation to technological advancements. It enables organizations to respond quickly to market trends, customer preferences, and competitive challenges, giving them a competitive edge in dynamic industries.

Example: A leading e-commerce giant adopted the bimodal model to balance stability and innovation in its operations. Through Mode 2, the company integrated machine learning algorithms into its recommendation engine. As a result, the accuracy of personalized product recommendations increased by 25%, leading to a 10% rise in customer engagement and a subsequent 12% growth in overall sales. This successful integration of Mode 2 practices directly contributed to the company’s market leadership in the highly competitive online retail space.

3. Enhanced Scalability: The bimodal approach accommodates the scalable growth of data-driven initiatives, ensuring that the organization can handle increased data volumes efficiently.

In the modern data landscape, the volume of data generated is growing exponentially. Mode 1 ensures that foundational systems are equipped to handle increasing data loads without compromising performance or stability. Meanwhile, Mode 2 facilitates the implementation of scalable technologies and architectures, such as cloud computing and distributed databases. This combination allows organizations to seamlessly scale their data infrastructure, supporting the growth of data-driven initiatives without experiencing bottlenecks or diminishing performance.

Example: A global technology firm leveraged the bimodal model to address the challenges of data scalability in its cloud-based services. In Mode 1, the company optimized its foundational cloud infrastructure, ensuring uninterrupted service during periods of increased data traffic. Simultaneously, through Mode 2 practices, the firm adopted containerization and microservices architecture, resulting in a 30% improvement in scalability. This enhanced scalability enabled the company to handle a 50% surge in user data without compromising performance, leading to increased customer satisfaction and retention.

4. Faster Time-to-Insights: By leveraging Mode 2 practices, organizations can swiftly analyze new data sources, enabling faster extraction of valuable insights for strategic decision-making.

Mode 2 excels in rapidly exploring and analyzing new and diverse data sources. This capability significantly reduces the time it takes to transform raw data into actionable insights. Whether it’s customer feedback, market trends, or operational metrics, Mode 2 practices facilitate agile and quick analysis. This speed in obtaining insights is crucial in fast-paced industries where timely decision-making is a competitive advantage.

Example: A healthcare organization implemented the bimodal model to expedite the analysis of patient data for clinical decision-making. Through Mode 2, the organization utilized advanced analytics and machine learning algorithms to process diagnostic data. The implementation led to a 40% reduction in the time required for diagnosis, enabling medical professionals to make quicker and more accurate decisions. This accelerated time-to-insights not only improved patient outcomes but also contributed to the organization’s reputation as a leader in adopting innovative healthcare technologies.

5. Adaptability in a Dynamic Environment: Bimodal data management equips organizations to adapt to market changes, regulatory requirements, and emerging technologies effectively.

In an era of constant change, adaptability is a key determinant of organizational success. Mode 2’s emphasis on experimentation and innovation ensures that organizations can swiftly adopt and integrate new technologies as they emerge. Additionally, the bimodal model allows organizations to navigate changing regulatory landscapes by ensuring that core business processes (Mode 1) comply with existing regulations while simultaneously exploring new approaches to meet evolving requirements. This adaptability is particularly valuable in industries facing rapid technological advancements or regulatory shifts, such as fintech, healthcare, and telecommunications.

Example: A telecommunications company embraced the bimodal model to navigate the dynamic landscape of regulatory changes and emerging technologies. In Mode 1, the company ensured compliance with existing telecommunications regulations. Meanwhile, through Mode 2, the organization invested in exploring and adopting 5G technologies. This strategic approach allowed the company to maintain regulatory compliance while positioning itself as an early adopter of 5G, resulting in a 25% increase in market share and a 15% growth in revenue within the first year of implementation.

Implementation Challenges and Solutions

Implementing a bimodal model in data management is not without its challenges. Legacy systems, resistance to change, and ensuring a seamless integration between modes can pose significant hurdles. However, these challenges can be overcome through a strategic approach that involves comprehensive training, fostering a culture of innovation, and investing in robust data integration tools.

1. Legacy Systems: Overcoming the Weight of Tradition

Challenge: Many organizations operate on legacy systems that are deeply ingrained in their processes. These systems, often built on older technologies, can be resistant to change, making it challenging to introduce the agility required by Mode 2.

Solution: A phased approach is crucial when dealing with legacy systems. Organizations can gradually modernize their infrastructure, introducing new technologies and methodologies incrementally. This could involve the development of APIs to bridge old and new systems, adopting microservices architectures, or even considering a hybrid cloud approach. Legacy system integration specialists can play a key role in ensuring a smooth transition and minimizing disruptions.

2. Resistance to Change: Shifting Organizational Mindsets

Challenge: Resistance to change is a common challenge when implementing a bimodal model. Employees accustomed to traditional modes of operation may be skeptical or uncomfortable with the introduction of new, innovative practices.

Solution: Fostering a culture of change is essential. This involves comprehensive training programs to upskill employees on new technologies and methodologies. Additionally, leadership plays a pivotal role in communicating the benefits of the bimodal model, emphasizing how it contributes to both stability and innovation. Creating cross-functional teams that include members from different departments and levels of expertise can also promote collaboration and facilitate a smoother transition.

3. Seamless Integration Between Modes: Ensuring Cohesion

Challenge: Integrating Mode 1 (stability-focused) and Mode 2 (innovation-focused) operations seamlessly can be complex. Ensuring that both modes work cohesively without compromising the integrity of data or system reliability is a critical challenge.

Solution: Implementing robust data governance frameworks is essential for maintaining cohesion between modes. This involves establishing clear protocols for data quality, security, and compliance. Organizations should invest in integration tools that facilitate communication and data flow between different modes. Collaboration platforms and project management tools that promote transparency and communication can bridge the gap between teams operating in different modes, fostering a shared understanding of goals and processes.

4. Lack of Skillset: Nurturing Expertise for Innovation

Challenge: Mode 2 often requires skills in emerging technologies such as artificial intelligence, machine learning, and big data analytics. Organizations may face challenges in recruiting or upskilling their workforce to meet the demands of this innovative mode.

Solution: Investing in training programs, workshops, and certifications can help bridge the skills gap. Collaboration with educational institutions or partnerships with specialized training providers can ensure that employees have access to the latest knowledge and skills. Creating a learning culture within the organization, where employees are encouraged to explore and acquire new skills, is vital for the success of Mode 2.

5. Overcoming Silos: Encouraging Cross-Functional Collaboration

Challenge: Siloed departments and teams can hinder the flow of information and collaboration between Mode 1 and Mode 2 operations. Communication breakdowns can lead to inefficiencies and conflicts.

Solution: Breaking down silos requires a cultural shift and the implementation of cross-functional teams. Encouraging open communication channels, regular meetings between teams from different modes, and fostering a shared sense of purpose can facilitate collaboration. Leadership should promote a collaborative mindset, emphasizing that both stability and innovation are integral to the organization’s success.

By addressing these challenges strategically, organizations can create a harmonious bimodal environment that combines the best of both worlds—ensuring stability in core operations while fostering innovation to stay ahead in the dynamic landscape of data-driven decision-making.

Case Studies: Bimodal Success Stories

Several forward-thinking organisations have successfully implemented the bimodal model to enhance their data management capabilities. Companies like Netflix, Amazon, and Airbnb have embraced this approach, allowing them to balance stability with innovation, leading to improved customer experiences and increased operational efficiency.

Netflix: Balancing Stability and Innovation in Entertainment

Netflix, a pioneer in the streaming industry, has successfully implemented the bimodal model to revolutionize the way people consume entertainment. In Mode 1, Netflix ensures the stability of its streaming platform, focusing on delivering content reliably and securely. This includes optimizing server performance, ensuring data integrity, and maintaining a seamless user experience. Simultaneously, in Mode 2, Netflix harnesses the power of data analytics and machine learning to personalize content recommendations, optimize streaming quality, and forecast viewer preferences. This innovative approach has not only enhanced customer experiences but also allowed Netflix to stay ahead in a highly competitive and rapidly evolving industry.

Amazon: Transforming Retail with Data-Driven Agility

Amazon, a global e-commerce giant, employs the bimodal model to maintain the stability of its core retail operations while continually innovating to meet customer expectations. In Mode 1, Amazon focuses on the stability and efficiency of its e-commerce platform, ensuring seamless transactions and reliable order fulfillment. Meanwhile, in Mode 2, Amazon leverages advanced analytics and artificial intelligence to enhance the customer shopping experience. This includes personalized product recommendations, dynamic pricing strategies, and the use of machine learning algorithms to optimize supply chain logistics. The bimodal model has allowed Amazon to adapt to changing market dynamics swiftly, shaping the future of e-commerce through a combination of stability and innovation.

Airbnb: Personalizing Experiences through Data Agility

Airbnb, a disruptor in the hospitality industry, has embraced the bimodal model to balance the stability of its booking platform with continuous innovation in user experiences. In Mode 1, Airbnb ensures the stability and security of its platform, facilitating millions of transactions globally. In Mode 2, the company leverages data analytics and machine learning to personalize user experiences, providing tailored recommendations for accommodations, activities, and travel destinations. This approach not only enhances customer satisfaction but also allows Airbnb to adapt to evolving travel trends and preferences. The bimodal model has played a pivotal role in Airbnb’s ability to remain agile in a dynamic market while maintaining the reliability essential for its users.

Key Takeaways from Case Studies:

  1. Strategic Balance: Each of these case studies highlights the strategic balance achieved by these organizations through the bimodal model. They effectively manage the stability of core operations while innovating to meet evolving customer demands.
  2. Customer-Centric Innovation: The bimodal model enables organizations to innovate in ways that directly benefit customers. Whether through personalized content recommendations (Netflix), dynamic pricing strategies (Amazon), or tailored travel experiences (Airbnb), these companies use Mode 2 to create value for their users.
  3. Agile Response to Change: The case studies demonstrate how the bimodal model allows organizations to respond rapidly to market changes. Whether it’s shifts in consumer behavior, emerging technologies, or regulatory requirements, the dual approach ensures adaptability without compromising operational stability.
  4. Competitive Edge: By leveraging the bimodal model, these organizations gain a competitive edge in their respective industries. They can navigate challenges, seize opportunities, and continually evolve their offerings to stay ahead in a fast-paced and competitive landscape.

Conclusion

In the contemporary business landscape, characterised by the pivotal role of data as the foundation of organizational vitality, the bimodal model emerges as a strategic cornerstone for enterprises grappling with the intricacies of modern data management. Through the harmonious integration of stability and agility, organizations can unveil the full potential inherent in their data resources. This synergy propels innovation, enhances decision-making processes, and, fundamentally, positions businesses to achieve a competitive advantage within the dynamic and data-centric business environment. Embracing the bimodal model transcends mere preference; it represents a strategic imperative for businesses aspiring to not only survive but thrive in the digital epoch.

Also read – “How to Innovate to Stay Relevant”

Innovation Case Study: Test Automation & Ambit Enterprise Upgrade

A business case showing how technology innovation was successfully integrated into business operations and improved ways of working, supporting business success.

  
Areas of Science and Technology: Data Engineering, Computer Science
R&D Start Date: December 2018
R&D End Date: September 2019
Competent Professional: Renier Botha

 

Overview and Available Baseline Technologies

Within the scope of the project, the competent professionals sought to develop a regression testing framework aimed at testing the work carried out to upgrade the Ambit application[1] from a client-server solution to a software-as-a-service (SaaS) solution operating in the Cloud. The test framework developed is now used to define and support testing initiatives across the Bank. The team also sought to automate the process; however, this failed due to a lack of existing infrastructure in the Bank.

Initial attempts to achieve this by way of third-party solution providers, such as Qualitest, were unsuccessful, as these providers were unable to develop a framework or methodology which could be documented and reused across different projects. For this reason, the team sought to develop the framework from the ground up. The project was successfully completed in September 2019.

Technological Advances

The upgrade would enable access to the system via the internet, meaning users would no longer need a Cisco connection onto the specific servers to engage with the application. The upgrade would also enable the system to be accessed from devices other than a PC or laptop. Business Finance at Shawbrook comprises 14 different business units, with each unit having a different product which is captured and processed through Ambit. All the existing functionality and business-specific configuration needed to be transferred to the new Enterprise platform, along with the migration of all the associated data. The competent professionals at Shawbrook sought to appreciably improve the current application through the following technological advances:

  • Development of an Automated Test Framework which could be used across different projects

Comprehensive, well-executed testing is essential for mitigating risks to deployment. Shawbrook did not have a documented, standardised, and proven methodology that could be adopted by different projects to ensure that proper testing practices are incorporated into project delivery. There was a requirement to develop a test framework to plan, manage, govern and support testing across the agreed phases, using tools and practices that help mitigate risks in a cost-effective and commensurate way.

The test team sought to develop a continuous delivery framework, which could be used across all units within Business Finance. The Ambit Enterprise Upgrade was the first project at Shawbrook to adopt this framework, which led to the development of a regression test pack and the subsequent successful delivery of the Ambit upgrade. The Ambit Enterprise project was the first project within the Bank to be delivered with no issues raised post release.

  • Development of a regression test pack which would enable automated testing of future changes or upgrades to the Ambit platform

Regression testing is a fundamental part of the software development lifecycle. With the increased popularity of the Agile development methodology, regression testing has taken on added importance. The team at Shawbrook sought to adopt an iterative, Agile approach to software development. 

A manual regression test pack was developed which could be used for future testing without the need for the involvement of business users. This was delivered over three test cycles with the team using the results of each cycle (bugs identified and resolved) to issue new releases. 

173 user paths were captured in the regression test pack, across 14 different divisions within Business Finance. 251 issues were found during testing, some of which were within the Ambit application. Identifying and resolving these issues resulted in the advancement of the Ambit Enterprise platform itself. This regression test pack can now be used for future changes to the Ambit Enterprise application, as well as future FIS[2] releases, change requests and enhancements, without being dependent on the business users to undertake UAT. The competent professionals at Shawbrook are currently using the regression test pack to test the integration functionality of the Ambit Enterprise platform.

  • Development of a costing tool to generate cost estimates for cloud test environment requirements

In order to resolve issues, solutions need to be tested within test environments. A lack of supply was identified within Shawbrook, and there was an initiative to increase supply using the Azure cloud environment. The objective was to increase the capability within Business Finance to manage an Azure flexible hosting environment where the necessary test environments could be set up on demand. There was also a requirement to plan and justify the expense of test environment management. The competent professionals sought to develop a costing tool, based on the Azure costing model, which could be used by project managers within Business Application Support (“BAS”) to quickly generate what the environment cost would be on a per-day or per-hour running basis. Costs were calculated based on the environment specification required and the number of running hours required. Environment specification was classified as either “high”, “medium” or “low”. For example, the test environment specification required for a web server is low, an application server is medium, while a database server is high. Shawbrook gained knowledge of and increased its capability in the use of the Azure cloud environment and, as a result, is actively using the platform to undertake cloud-based testing.
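
As an illustration of how such a costing calculation might work, the sketch below estimates a daily environment cost from a specification tier and running hours. The high/medium/low tiers mirror the classification described above, but the hourly rates and the helper function are invented placeholders, not Shawbrook’s tool or Azure’s actual prices.

```python
# Illustrative sketch of a test-environment costing calculation.
# Hourly rates are invented placeholders, not actual Azure or Shawbrook figures.

HOURLY_RATES = {          # assumed cost per running hour by specification tier
    "low": 0.10,          # e.g. web server
    "medium": 0.45,       # e.g. application server
    "high": 1.20,         # e.g. database server
}

def environment_cost(server_specs, running_hours):
    """Total cost for a set of servers, each classified low/medium/high."""
    hourly_total = sum(HOURLY_RATES[spec] for spec in server_specs)
    return hourly_total * running_hours

# Example: one web server, one application server and one database server
# running 10 hours in a day.
daily_cost = environment_cost(["low", "medium", "high"], running_hours=10)
print(f"Estimated daily environment cost: £{daily_cost:.2f}")
```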

The above constitutes an advance in knowledge and capability in the field of Data Engineering and Computer Science, as per sections 9 a) and c) of the BEIS Guidelines.

Technological Uncertainties and activities carried out to address them

The following technological uncertainties were encountered while developing the Ambit Enterprise upgrade, mainly pertaining to system uncertainty:

  • Implementation of the new Ambit Enterprise application could disrupt existing business processes

The biggest risk for the programme of change was the potential disruption of existing business processes due to the implementation of the change without validation of the upgraded application against the existing functionality. This was the primary focus of the risk mitigation process for the project. Following the test phases set out in the test framework would enable a clear understanding of all the residual risks encountered approaching implementation, providing stakeholders with the context required to make a calculated judgement on these risks.

When an issue was identified through testing, a triage process was undertaken to categorise the issues as either a technical issue, or a user issue. User issues were further classified as “training” or “change of business process”. Technical issues were classified as “showstoppers”, “high”, “medium” and “low”. These were further categorised by priority as “must haves” and “won’t haves” in order to get well-defined acceptance criteria for the substantial list of bugs that arose from the testing cycles. In total, 251 technical issues were identified.
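
A minimal sketch of this triage classification might look like the following. The category, severity and priority labels come from the text above, while the data structure, helper names and example issues are hypothetical.

```python
# Sketch of the issue triage classification described above.
# Labels follow the text; the example issues themselves are hypothetical.

from dataclasses import dataclass

TECHNICAL_SEVERITIES = ("showstopper", "high", "medium", "low")
USER_CLASSES = ("training", "change of business process")

@dataclass
class Issue:
    title: str
    category: str          # "technical" or "user"
    classification: str    # severity for technical issues, class for user issues
    must_have: bool        # priority: must have (True) vs won't have (False)

def validate(issue: Issue) -> None:
    """Check that an issue uses the agreed triage labels."""
    if issue.category == "technical":
        assert issue.classification in TECHNICAL_SEVERITIES
    elif issue.category == "user":
        assert issue.classification in USER_CLASSES
    else:
        raise ValueError(f"Unknown category: {issue.category}")

issues = [
    Issue("Payment schedule not migrated", "technical", "showstopper", True),
    Issue("New screen layout unfamiliar", "user", "training", False),
]
for issue in issues:
    validate(issue)
print(f"Triaged {len(issues)} issues")
```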

The acceptance criteria for the resolution of issues were:

  • A code fix was implemented
  • A business-approved workaround was implemented
  • The business accepted the risk

All showstoppers were resolved with either a code fix or an acceptable workaround. Configuration issues were within the remit of Shawbrook’s business application support (“BAS”) team to resolve, whilst other issues could only be resolved by the FIS development team. When the application went live, there were no issues raised post release, and all issues present were known and met the acceptance criteria of the business.

  • Business processes may no longer align with the new web-based application

Since the project was an upgrade, there was the potential for operational impact on existing functionality due to differences between the Ambit client-server solution and the upgraded Ambit Enterprise web-based solution. The BAS team at Shawbrook were required to make changes to the business processes in order to align with the way the Ambit Enterprise solution now operated. Where Shawbrook-specific issues could not be resolved through the configuration of the application with the business processes, changes were made to the functionality within Ambit; for example, additional plug-ins were developed for the Sales Portal platform to integrate with the Ambit Enterprise application.

Because Ambit Enterprise was a web-based application, application and security vulnerabilities needed to be identified so that the correct security level was achieved. Consequently, performance and security testing, which were not being executed at the time, needed to be introduced into the test framework. Performance testing also needed to be executed so that speed and stability requirements under the expected workloads were met.

Summary and Conclusions

The team at Shawbrook successfully developed a test framework which could be used across all projects within Business Finance. The development of the test framework led to the generation of a regression test pack for the Ambit Enterprise upgrade. By undertaking these R&D activities, Shawbrook gained knowledge in the use of the Azure Cloud Environment for testing and increased its automated testing capabilities, enabling the transition to a continuous delivery framework whereby the majority of testing is automated.


[1] Ambit is the asset finance application operating within the business unit; 70-80 percent of all lending transactions are captured and managed through Ambit

[2] FIS is the Ambit Enterprise vendor

Humans are smarter than any type of AI – for now…

Despite all the technological advancements, machines today can only achieve the first two of the three AI objectives. AI capabilities are at least equalling, and in most cases exceeding, humans in capturing information and determining what is happening. When it comes to real understanding, machines still fall short – but for how long?

In the blog post, “Artificial Intelligence Capabilities”, we explored the three objectives of AI and its capabilities – to recap:


  • Capturing Information
    • 1. Image Recognition
    • 2. Speech Recognition
    • 3. Data Search
    • 4. Data Patterns
  • Determine what is happening
    • 5. Language Understanding
    • 6. Thought/Decision Process
    • 7. Prediction
  • Understand why it is happening
    • 8. Understanding

To execute these capabilities, AI leans heavily on three technology areas (enablers):

  • Data-collecting devices, i.e. mobile phones and IoT
  • Processing Power
  • Storage

AI relies on large amounts of data that require storage, and on powerful processors to analyse the data and calculate results through complex algorithms – resources that were very expensive until recent years. Technology enhancements in machine computing power following Moore’s law, the now-mainstream availability of cloud computing and storage, and the fact that there are more mobile phones on the planet than humans have enabled AI to come to the forefront of innovation.


AI at the forefront of innovation – here are some interesting facts to demonstrate this point:

  • Amazon uses machine learning systems to recommend products to customers on its e-commerce platform. AI helps it determine which deals to offer and when, and influences many aspects of the business.
  • A PwC report estimates that AI will contribute $15.7 trillion to the global economy by 2030. AI will make products and services better, and it’s expected to boost GDPs globally.
  • The self-driving car market is expected to be worth $127 billion worldwide by 2027. AI is at the heart of the technology to make this happen. NVIDIA created its own computer — the Drive PX Pegasus — specifically for driverless cars and powered by the company’s AI and GPUs. It starts shipping this year, and 25 automakers and tech companies have already placed orders.
  • Scientists believed we were still years away from AI being able to win at the ancient game of Go, regarded as the most complex human game. Yet Google’s AI recently beat the world’s best Go player.

To date, computer hardware has followed a growth curve called Moore’s law, in which power and efficiency double every two years. Combine this with recent improvements in software algorithms and the growth is becoming more explosive. Some researchers expect artificial intelligence systems to be only one-tenth as smart as a human by 2035. Things may start to get a little awkward around 2060, when AI could start performing nearly all the tasks humans do — and doing them much better.
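
As a rough illustration of that compounding claim, the short sketch below computes the growth factor implied by a doubling every two years. The horizons chosen are arbitrary, and the model deliberately ignores the physical limits Moore’s law is now running into.

```python
# Growth factor implied by "power doubles every two years" (Moore's law).
# The horizons below are arbitrary illustrative choices.

DOUBLING_PERIOD_YEARS = 2

def growth_factor(years: float) -> float:
    """How many times more capable hardware becomes after `years`."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

for years in (2, 6, 10):
    print(f"After {years:>2} years: ~{growth_factor(years):.0f}x")
```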

Using AI in your business

Artificial intelligence has so much potential across so many different industries that it can be hard for businesses looking to profit from it to know where to start.

Understanding these AI capabilities makes the technology more accessible to businesses that want to benefit from it. With this knowledge you can now take the next step:

  1. Knowing your business, identify the right AI capabilities to enhance and/or transform your business operations, products and/or services.
  2. Look at AI vendors with a critical eye, understanding what AI capabilities are actually offered within their products.
  3. Understand the limitations of AI and be realistic about whether alternative solutions would be a better fit.

In a future post we’ll explore some real-life examples of the AI capabilities in action.

 


How to Innovate to stay Relevant

Staying relevant! The biggest challenge we all face – staying relevant within our market. Relevance to your customers is what keeps you in business.

With the world changing as rapidly as it does today, mainly due to the profound influence of technology on our lives, the expectations of the consumer are changing at pace. Consumers have access to an increasing array of choice, not just in how they spend their money but also in how they communicate and interact – change fuelled by a digital revolution. The last thing that anyone can afford, in this fast-paced race, is losing relevance – that will cost us customers, or worse…

Is what you are selling today adaptable to continuously changing ecosystems? Does your strategy reflect that agility? How can you ensure that your business stays relevant in the digital age? We have all heard about digital transformation as a necessity, but even then, how can you ensure that you are evolving as fast as your customers and staying relevant within your market?

A business with a culture of continuous evolution, aligning its products and services with the digitally driven customer, is the business that stays relevant. This is the kind of business that does not require a digital transformation to realign with customer demand to secure its future. A customer-centric focus and a culture of continuous evolution throughout the business value chain are what assure relevance. Looking at these businesses, their ability and agility to get innovation into production rapidly is a core success criterion.

Not having a strategy to stay relevant is a very high and real risk to business. Traditionally we deal with risk by asking “Why?”. For continuous improvement/evolution and agility, we should instead be asking “Why not?” and by that, introduce opportunities for pilots, prototypes, experimentation and proof of concepts. Use your people as an incubator for innovation.

Sure, you have an R&D team and you are continuously finding new ways to deliver your value proposition – but getting your innovative ideas into production is cumbersome, only to discover that they are already aged and possibly obsolete in a year or two. R&D is expensive and time-consuming, and there are no guarantees that your effort will result in a working product or desired service. Just because you have the ability to build something does not mean that you have to build something. Focusing scarce and expensive resources on the right initiatives makes sense, right? This is why many firms are shifting from a project-minded (short-term) approach to a longer-term product-minded investment and management approach.

So, how do you remain customer centric, use your staff as incubators of innovation, select the ideas that will improve your market relevance and then rapidly develop those ideas into revenue earners while shifting to a product-minded investment approach?

You could combine Design Thinking with Lean Startup and Agile Delivery…

In 2016, I attended the Gartner Symposium, where Gartner brought these concepts together very well in this illustration:

[Figure: Gartner’s illustration combining Design Thinking, Lean Startup and Agile]

Instead of selecting and religiously following one specific delivery methodology, use the best of multiple worlds to get the optimum output through the innovation lifecycle.


Using Design Thinking (Empathise >> Define >> Ideate >> Prototype) puts the customer at the core of customer-centric innovation and product/service development. Start by empathising with the customers and defining their most pressing issues and problems, before coming up with a variety of ideas to potentially solve those problems. Each idea is considered before developing a prototype. This dramatically reduces the risk of innovation initiatives by engaging with what people (the customer) really need and want before actually investing further in development.

Lean Startup focuses on achieving product-market fit by moving a Prototype or MVP (minimum viable product) through a cycle of Build >> Measure >> Learn. This ensures a thorough knowledge of the user of the product/service is gained through active and measurable engagement with the customer. Customer experience and feedback are captured and used to learn and adapt, resulting in an improved MVP, better aligned to the target market, after every cycle.

Finally, Agile Scrum, continuing the customer-centric theme, involves multiple stakeholders, especially users (customers), in every step of maturing the MVP into a product they will be happy to use. This engagement enhances transparency, which in turn grows the trust between the business (development team) and the customer (user), who are vested in the product’s/service’s success. Through an iterative approach, new features and changes can be delivered quickly, on an accurate and predictable timeline, and according to stakeholders’ priorities. This continuous product/service evolution, with full stakeholder engagement, builds brand loyalty and ensures market relevance.

Looking at a typical innovation lifecycle, you can identify three distinct stages: Idea, Prototype/MVP (Minimum Viable Product) and Product. Each of these innovation stages is complemented by key value gained from one of the three delivery methodologies:

[Illustration: Design-Lean-Agile (2)]

All of these methodologies engage the stakeholders (especially the customer and user) in continuous feedback loops, measuring progress and capturing feedback to adapt and continuously improve, so that maximum value creation is achieved.
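
To make the flow concrete, here is a minimal, illustrative Python sketch of the three-stage lifecycle described above. The stage names, methodologies and activities come from the mapping in this post; the class, function and variable names are hypothetical, chosen purely for illustration rather than taken from any formal framework.

```python
from dataclasses import dataclass

@dataclass
class Stage:
    """One innovation lifecycle stage and the methodology that complements it."""
    name: str
    methodology: str
    activities: list

# Lifecycle mapping as described in the post; names are illustrative only.
LIFECYCLE = [
    Stage("Idea", "Design Thinking", ["Empathise", "Define", "Ideate", "Prototype"]),
    Stage("Prototype/MVP", "Lean Startup", ["Build", "Measure", "Learn"]),
    Stage("Product", "Agile Scrum", ["Plan sprint", "Deliver increment", "Review with stakeholders"]),
]

def run_lifecycle(idea: str, cycles_per_stage: int = 3) -> None:
    """Walk an idea through the three stages, repeating the feedback loop in each."""
    print(f"Innovation candidate: {idea}")
    for stage in LIFECYCLE:
        print(f"\n{stage.name} ({stage.methodology})")
        for cycle in range(1, cycles_per_stage + 1):
            # Every stage has the same shape: run the activities, capture
            # customer/stakeholder feedback, then adapt before the next cycle.
            print(f"  cycle {cycle}: {' >> '.join(stage.activities)} -> capture feedback -> adapt")

if __name__ == "__main__":
    run_lifecycle("self-service customer feedback kiosk")
```

The point of the sketch is simply that every stage keeps the same shape – a short cycle, a measurement and an adaptation – which is why the three methodologies combine so naturally.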

No one wants to spend a lot of resource and time delivering something that adds little value and creates no impact. Using this innovation methodology and the associated tools, you will be building better products and services in the eyes of the user – and that’s what matters. You’ll also be actively unlocking the potential of your A-team, involving them in creating impact and value while cultivating a culture of continuous improvement.

The same methodology works very well for digital transformation programmes.

At the very least, you should be experimenting with these delivery approaches to find the sweet-spot methodology for you.

Experiment to stay relevant!

Let’s Talk – renierbotha.com – Are you looking to develop an innovation strategy to be more agile and stay relevant? Do you want to achieve your goals faster? Create better business value? Build strategies to improve growth?

We can help – make contact!

Read similar articles for further insight in our Blog.

Insightful Quotes on Artificial Intelligence

Artificial Intelligence (AI) today is a practical reality. It captivated the minds of geniuses and materialised through science fiction as I grew up. Over the past 70 years (post-WWII), AI has evolved from a philosophical theory into a game-changing emerging technology, transforming the way digital enhances value in every aspect of our daily lives.

Great minds have been challenged by the opportunities and possibilities that AI offers. Here are some of the things said on the subject to date. Within these quotes, the conundrum in people’s minds becomes clear – does AI open up endless possibilities or inevitable doom?

“I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.”; Alan Turing (1950)

“It seems probable that once the machine thinking method has started, it would not take long to outstrip our feeble powers… They would be able to converse with each other to sharpen their wits. At some stage therefore, we should have to expect the machines to take control.”; Alan Turing

“The study is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it. An attempt will be made to find how to make machines use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves.”; John McCarthy (1956)

“AI scientists tried to program computers to act like humans without first understanding what intelligence is and what it means to understand. They left out the most important part of building intelligent machines, the intelligence … before we attempt to build intelligent machines we have to first understand how the brain thinks, and there is nothing artificial about that.”; Jeff Hawkins

“The question of whether a computer can think is no more interesting than the question of whether a submarine can swim.”; Edsger Dijkstra

“Whether we are based on carbon or on silicon makes no fundamental difference; we should each be treated with appropriate respect.”; Arthur C. Clarke (2010)

“…everything that civilisation has to offer is a product of human intelligence. We cannot predict what we might achieve when this intelligence is magnified by the tools that AI may provide, but the eradication of war, disease, and poverty would be high on anyone’s list. Success in creating AI would be the biggest event in human history.”; Stephen Hawking and colleagues, in an article in The Independent

“Why give a robot an order to obey orders—why aren’t the original orders enough? Why command a robot not to do harm—wouldn’t it be easier never to command it to do harm in the first place? Does the universe contain a mysterious force pulling entities toward malevolence, so that a positronic brain must be programmed to withstand it? Do intelligent beings inevitably develop an attitude problem? …Now that computers really have become smarter and more powerful, the anxiety has waned. Today’s ubiquitous, networked computers have an unprecedented ability to do mischief should they ever go to the bad. But the only mayhem comes from unpredictable chaos or from human malice in the form of viruses. We no longer worry about electronic serial killers or subversive silicon cabals because we are beginning to appreciate that malevolence—like vision, motor coordination, and common sense—does not come free with computation but has to be programmed in. …Aggression, like every other part of human behavior we take for granted, is a challenging engineering problem!”; Steven Pinker – How the Mind Works

“Ask not what AI is changing, ask what AI is not changing.”; Warwick Oliver Co-Founder at hut3.ai (2018)

“Sometimes at night I worry about TAMMY. I worry that she might get tired of it all. Tired of running at sixty-six terahertz, tired of all those processing cycles, every second of every hour of every day. I worry that one of these cycles she might just halt her own subroutine and commit software suicide. And then I would have to do an error report, and I don’t know how I would even begin to explain that to Microsoft.”; Charles Yu

“As more and more artificial intelligence is entering into the world, more and more emotional intelligence must enter into leadership.”; Amit Ray

“We’ve been seeing specialized AI in every aspect of our lives, from medicine and transportation to how electricity is distributed, and it promises to create a vastly more productive and efficient economy …”; Barack Obama

“Artificial intelligence is the future, not only for Russia, but for all of humankind. It comes with colossal opportunities, but also threats that are difficult to predict. Whoever becomes the leader in this sphere will become the ruler of the world.”; Vladimir Putin

“I think we should be very careful about artificial intelligence. If I had to guess at what our biggest existential threat is, I’d probably say that. So we need to be very careful.”; Elon Musk

“Whenever I hear people saying AI is going to hurt people in the future I think, yeah, technology can generally always be used for good and bad and you need to be careful about how you build it … if you’re arguing against AI then you’re arguing against safer cars that aren’t going to have accidents, and you’re arguing against being able to better diagnose people when they’re sick.”; Mark Zuckerberg

“Most of human and animal learning is unsupervised learning. If intelligence was a cake, unsupervised learning would be the cake, supervised learning would be the icing on the cake, and reinforcement learning would be the cherry on the cake. We know how to make the icing and the cherry, but we don’t know how to make the cake. We need to solve the unsupervised learning problem before we can even think of getting to true AI.”; Yann LeCun

“Artificial intelligence would be the ultimate version of Google. The ultimate search engine that would understand everything on the web. It would understand exactly what you wanted and it would give you the right thing. We’re nowhere near doing that now. However, we can get incrementally closer to that, and that is basically what we’re working on.”; Larry Page, Co-Founder at Google (2000)

“If you had all of the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off.”; Sergey Brin, Co-Founder at Google (2004)


Cost reduction initiatives: Retain resources – incubate value innovation

Why is it that technology is always perceived as being too expensive? Do organisations really understand the underlying value technology brings to the business as a foundational enabler? If the answer is yes, then why the continued pressure on Technology Executives to reduce cost? It is interesting that, when it comes to cutting cost, business and financial leaders always look at cutting technology headcount instead of seriously evaluating opportunities to improve productivity and efficiency through value innovation.

In accounting terms there are only two main actions to improve the bottom line – increase revenue and/or reduce cost. In technology business operations these two factors can be influenced by several initiatives, of which reduction of staff is one option. In my view, it should be the last resort. Despite the known fact that cutting heads in IT is, in essence, cutting the intellectual property, knowledge and experience that reside within your team, it is still at the top of the list for CFOs, other executives and board members when the cost reduction discussion comes up.

Before we look at reducing the workforce that delivers the technology services and products forming the enabling foundation of any organisation, surely we should look at viable alternatives: value-innovation initiatives coming from our own staff. Empower your staff to be an incubator for innovation.

Technology operations are all about providing services at a specific level, as defined in SLAs (Service Level Agreements), for example:

  • IT infrastructure hosting email, websites, file repositories and intranets,
  • Software Development of products the organisation sells to clients and/or uses in-house,
  • Implementation, Integration and Customisation projects where software products are deployed,
  • Help/Service Desk supporting IT end-users, etc.

These services are all provided by technologists, by people, and People Come First (Read more…). Focusing on a professional, efficient and happy team, and understanding the needs of every individual, goes a long way towards ensuring the right initiatives come from your staff to make technology more proficient.

One of the key responsibilities of a technology executive is the efficient management of resources. This is especially important when technology companies or departments deliver services where resources are the biggest expense on the technology P&L (Profit & Loss account, or income statement – Read more…).

Resources, as a high expense, reinforce the importance of proper Resource Management in business governance. Resource Management is not only about ensuring the right staff numbers with the right skill sets are available to deliver to business expectation and demand; it is also about creating the right environment and support so that your staff flourish, grow and contribute freely. In my experience, Resource Managers are far too often undervalued by business leaders who do not understand the value of the role. Business leaders should work closely with Resource Managers to ensure staff are seen not as a major expense but as a key asset, contributing not only to current business operations but also to future business growth and bottom-line improvement initiatives.

Businesses invest a lot in building teams of highly skilled and motivated people who feel valued and part of something special. These people drive a clear vision, larger than themselves, that delivers results leading to recognition and self-fulfilment. They are full of innovative ideas on how to improve the business value proposition.

When it comes to resource management and incubating value-driven innovation:

  • Ensure you have the right staff. Optimise your recruitment process so that you have a robust framework for bringing the right people on board for your organisation.
  • Keep your staff happy, mentally stimulated and intellectually engaged in all business processes and services. Make sure they are informed and are actively participating in the decisions driving the business forward.
  • Give them opportunities to learn through their delivery. Good people have a natural urge for continuous improvement – facilitate it.
  • Create communities where staff can learn and share knowledge on a formal and informal basis.
  • Plan your resourcing levels better. Ensure you have the right staff capacity with the right skills to deliver the services to the business demand and expectation.
  • Use flexible resourcing models combining permanent, temporary contracted and outsourced resources.
  • Continuously capture task and productivity data.
  • Utilise analytics: mine the productivity data for insight into which areas and services cost the most, and why. Act on these insights (see the sketch after this list).
  • Build a framework you can use for resource capacity forecasting. Work closely with the business to understand the sales pipeline and product development strategy, so that you can match your resource capacity to demand. There is nothing more disruptive to an organisation than constant resource-level fluctuation (hiring and firing) caused by poor strategic and project planning.
  • Identify your key resources and nurture them, retain them at all cost – they are the knowledge keepers of your IP (intellectual property). It is cheaper to implement initiatives to retain staff than it is to replace them!
  • People want to feel part of something; if they are happy in their community, contributing to a future and improving themselves in the process, they are much more likely to stay. Recruitment fees, where staff retention is low, are a large contributor to cost.
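
As a concrete, minimal sketch of the “capture task data, then mine it” points above: the snippet below groups hypothetical task records by service and ranks them by cost. The file name and the columns (service, hours, cost) are assumptions for illustration – substitute whatever your time-tracking or service-desk tooling actually exports.

```python
import csv
from collections import defaultdict

def cost_by_service(path: str):
    """Aggregate hours and cost per service from a simple CSV of task records."""
    totals = defaultdict(lambda: {"hours": 0.0, "cost": 0.0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            service = row["service"]          # hypothetical column names
            totals[service]["hours"] += float(row["hours"])
            totals[service]["cost"] += float(row["cost"])
    # Rank services by total cost so the most expensive areas surface first.
    return sorted(totals.items(), key=lambda kv: kv[1]["cost"], reverse=True)

if __name__ == "__main__":
    for service, t in cost_by_service("task_log.csv"):  # hypothetical export file
        print(f"{service:<30} {t['hours']:>8.1f} h   {t['cost']:>12,.2f}")
```

Even a basic ranking like this tends to start a better conversation than a blanket headcount target, because it points at specific services where productivity or automation improvements would pay off.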

Any cost-saving initiative has one fundamental measure that must hold true: “What is the value to the business?” Revenue and cost do not always define the true value…

What is the true value your staff bring to the success of your business? Have you asked them, and really involved them in working with you on ideas to improve business value through innovation rather than cost cutting?

One last point – when you have done your value analysis and it does come to letting staff go, remember this: treat them fairly – you never know when you will need them again.


Are you under pressure to cut cost? renierbotha ltd specialises in fine-tuning IT operations for optimum business value – Make contact!

Innovation Rewarded

Engineering News – Innovation Reward

Innovation is crucial to industry and economic growth. That is why the South African Department of Trade and Industry (DTI) introduced the SPII (Support Programme for Industrial Innovation) initiative to support companies in the development of new products.

The winners of the 7th SPII Awards were announced in Kimberley in September 2004…

Customer Feedback Systems won the small enterprise category with its innovative CFS Version 2 Service Tracka project.

The CFS Version 2 “Service Tracka” product is an electronic tool designed to obtain useful and reliable service information with little inconvenience to the customer providing the information. Feedback Systems GM Renier Botha says direct-customer, real-time market intelligence and operational information is an indispensable tool for any business, particularly those involving a large number of individual customers. “Traditional methods include direct or telephone interviews and the completion of forms by customers – all methods that are expensive and time-consuming,” he notes.

With the CFS Version 2 Service Tracka, electronic input devices are located at tills or at strategic positions within a queuing system. The customer enters his responses to questions, without assistance, into the input device. The data is then hosted and managed by CFS and, upon input, is immediately written to a database and results delivered electronically to the client as often as daily. “This enables the client to focus on driving service improvement,” Botha explains.
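
As a purely illustrative aside (not the actual CFS implementation – the table, column and function names below are invented), the flow described above can be modelled in a few lines: capture a response at the device, write it straight to a database, and summarise the results daily.

```python
import sqlite3
from datetime import date

# Illustrative model of the described flow: a response captured at an input device
# is written straight to a database and can be summarised for the client daily.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE responses (captured_on TEXT, device_id TEXT, question_id TEXT, rating INTEGER)"
)

def capture(device_id: str, question_id: str, rating: int) -> None:
    """Store a single customer response the moment it is entered at the device."""
    conn.execute(
        "INSERT INTO responses VALUES (?, ?, ?, ?)",
        (date.today().isoformat(), device_id, question_id, rating),
    )

def daily_summary(day: str):
    """Average rating and response count per question for one day."""
    return conn.execute(
        "SELECT question_id, ROUND(AVG(rating), 2), COUNT(*) FROM responses "
        "WHERE captured_on = ? GROUP BY question_id",
        (day,),
    ).fetchall()

capture("till-03", "service-speed", 4)
capture("till-03", "staff-friendliness", 5)
print(daily_summary(date.today().isoformat()))
```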

The CFS system has a number of enhanced features, including a card reader for linking to loyalty programmes, the location of ‘intelligence’ in the central controller, which reduces input-device costs, and the activation of the input devices from the central controller, which allows for the reduction of fictitious entries. The provision of question template updates by means of programmable chips in the lid also allows updates to be made before and after deployment of the input devices, making question changes easy. Local customers include banks, hotel groups, healthcare providers, retailers and the South African Revenue Service.

Feedback Systems started developing the CFS Version 2 Service Tracka system at the end of 2001.

The product was subsequently released on the market at the end of January the next year. Today, the company has sold some 4 500 of these systems across South Africa.

Feedback Systems also has distributors located in Zimbabwe (servicing African countries, such as Tanzania, Kenya and Namibia), Australia, Singapore (servicing South East Asia), Dubai (servicing the Middle East) and London (servicing Europe). “We have a total of about 2 000 Service Tracka systems operating in these countries,” Botha says.

The company is also currently negotiating a distributorship deal for the US.

Read the full article here… Engineering News – Innovation Reward