#TechFieldDayPod

Stephen Foskett <Stephen@gestaltit.com>
2024-12-10

Company Acquisitions are a Necessary Evil in Enterprise Tech

Acquisitions are a necessary driver of innovation in the IT industry, though they often seem to get in the way of competition and progress. This episode of the Tech Field Day podcast, recorded during Cloud Field Day 21, features Ray Lucchesi, Jon Hildebrand, Ken Nalbone, and Stephen Foskett considering whether acquisitions in the IT industry are a necessary evil or a detriment to innovation. Acquisitions are often seen as a double-edged sword, with both positive and negative implications. On one hand, acquisitions can fuel innovation by providing smaller companies with the resources and market access they need to scale their ideas. On the other hand, they can stifle competition, lead to cultural clashes, and sometimes result in the disappearance of promising technologies or products.

https://youtu.be/Iy5z_Xbaib0

Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

See more from Cloud Field Day 21 on the Tech Field Day website or YouTube channel.

We May Not Like Acquisitions in Tech, But We Need Them!

The IT industry has long been shaped by the cycle of acquisitions, with large companies absorbing smaller, innovative startups to bolster their portfolios. This practice is often seen as a double-edged sword. On one hand, acquisitions can inject fresh ideas and technologies into established organizations, enabling them to stay competitive in a rapidly evolving market. On the other hand, the process can stifle innovation, as smaller companies with promising technologies are often absorbed and their products either languish or are subsumed into larger, less agile corporate structures. The debate over whether acquisitions are a necessary evil or simply detrimental to the industry remains a contentious topic.

One of the key arguments in favor of acquisitions is their role in fostering innovation. Startups often emerge with groundbreaking ideas but lack the resources or market reach to scale effectively. Being acquired by a larger company can provide the necessary capital, infrastructure, and customer base to bring these innovations to a broader audience. However, this process is not without its pitfalls. Many acquisitions result in a clash of corporate cultures, leading to inefficiencies and, in some cases, the eventual dissolution of the acquired entity’s unique value proposition. This raises questions about whether the industry might benefit more from encouraging organic growth rather than relying on acquisitions as a growth strategy.

Critics argue that acquisitions often prioritize short-term financial gains over long-term innovation. Large corporations may acquire smaller companies not to integrate their technologies but to eliminate potential competition. This practice can lead to market consolidation, reducing diversity and stifling the competitive landscape. Furthermore, the focus on financial returns, driven by venture capital and private equity investments, often pressures startups to position themselves as acquisition targets rather than sustainable, standalone businesses. This dynamic can skew the priorities of emerging companies, emphasizing exit strategies over product development and customer satisfaction.

The role of private equity in driving acquisitions adds another layer of complexity. Private equity firms often seek to maximize returns by cutting costs and streamlining operations, which can lead to a loss of innovation and employee morale within the acquired company. While some private equity firms take a more hands-on approach to foster growth and innovation, others focus solely on financial metrics, potentially undermining the long-term viability of the companies they acquire. This dichotomy highlights the need for a more balanced approach to investment, one that prioritizes sustainable growth and innovation over short-term financial gains.

In an ideal world, the IT industry would thrive on organic growth, with companies building sustainable business models and scaling through customer acquisition and market expansion. However, the reality is that acquisitions are deeply ingrained in the industry’s fabric, driven by the need for rapid growth and the financial incentives of venture capital and private equity. While acquisitions may be a necessary evil in the current landscape, the industry must strive to ensure that they are conducted in a way that fosters innovation, benefits customers, and supports the long-term health of the market. The challenge lies in finding a balance that allows both startups and established companies to thrive without compromising the industry’s overall dynamism.

Podcast Information:

Stephen Foskett is the President of the Tech Field Day Business Unit and Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

Ray Lucchesi is the president of Silverton Consulting and the host of Greybeards on Storage Podcast. You can connect with Ray on X/Twitter or on LinkedIn. Learn more about Ray on his website and listen to his podcast.

Jon Hildebrand is an automation and observability expert. You can connect with Jon on LinkedIn or on X/Twitter. Learn more about Jon by reading his personal blog.

Ken Nalbone is a Senior Solutions Architect at AHEAD. You can connect with Ken on X/Twitter, Bluesky, and on LinkedIn. Learn more about Ken on his personal website.

Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube or your favorite podcast application so you don’t miss an episode and do give us a rating and a review. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group.

#CFD21 #TFDPodcast #GestaltIT #KenNalbone #RayLucchesi #SFoskett #SnoopJ123 #TechFieldDay #TechFieldDayPod

wp.me/p4YpUP-mQu

Stephen Foskett <Stephen@gestaltit.com>
2024-12-03

There’s a Gulf Between Storage and AI

There is a significant gap between what storage companies offer today and what AI infrastructure demands. In this episode of the Tech Field Day podcast, recorded during AI Data Infrastructure Field Day 1 in Santa Clara, host Stephen Foskett and guests Kurtis Kemple, Brian Booden, and Rohan Puri explore the evolving relationship between storage and AI. The discussion highlights the distance between storage companies’ current capabilities and the demands of AI applications. While storage vendors are pivoting to support AI, many lack deep AI expertise, often focusing on cost and efficiency rather than offering integrated, AI-specific solutions. The panel emphasizes the need for storage companies to move beyond being mere data repositories and instead develop end-to-end solutions that address AI workflows, data preparation, and metadata management. They also stress the importance of education, partnerships, and hiring AI specialists to bridge the knowledge gap and drive innovation. The conversation underscores the early stage of this convergence, with a call for clearer strategies, open standards, and more cohesive integration between storage and AI to meet the growing demands of data-driven applications.

https://youtu.be/3sZoByYja8A

Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

Learn more about AI Data Infrastructure Field Day 1 and watch videos from these presentations on the Tech Field Day website.

How Can Storage Support AI?

The intersection of storage and AI infrastructure presents a complex and evolving challenge. While storage companies are increasingly pivoting toward AI solutions, there remains a significant gap in understanding and integration. Storage has traditionally been viewed as a low-level, technical domain focused on hardware like disks and file systems. On the other hand, AI, particularly in the context of large language models (LLMs) and data analytics, operates at a higher level, requiring nuanced data management and application-specific insights. This disconnect highlights the need for storage companies to move beyond simply offering cost-effective and high-performance infrastructure. Instead, they must develop a deeper understanding of AI workflows and provide solutions that address the specific needs of AI applications, such as data preparation, metadata management, and seamless integration with AI training pipelines.

One of the key challenges is the lack of “solutioning” in the storage industry. Many storage vendors focus on infrastructure performance and efficiency but fail to address how their products fit into the broader AI ecosystem. For instance, while some companies are integrating with GPU technologies to support AI workloads, this approach often stops at the infrastructure level. True integration requires a more comprehensive understanding of AI applications, extending beyond hardware to include data management, insights, and application-level affordances. Without this, storage solutions risk being perceived as generic and interchangeable, reducing their value proposition in the AI space.

Another critical issue is the fragmentation of data sources and the absence of standardized frameworks for integration. Data in AI workflows often comes from diverse sources, including databases, data warehouses, file systems, and cloud storage. These sources are frequently siloed, making it challenging to consolidate and analyze data effectively. While some progress has been made in the database world with open formats and decoupled layers, similar advancements are lacking in the storage domain. The industry needs open standards and protocols that enable seamless data integration across vendors and platforms, facilitating the development of unified AI solutions.

The role of storage companies in AI could evolve in two distinct directions: becoming specialized storage solutions for AI or serving as connectors that enable AI applications to access existing data seamlessly. Both approaches have merit, but they require a clear strategy and a deep understanding of AI workflows. Companies that choose to specialize in AI storage must offer features like automated data preparation, efficient data movement, and real-time insights. Conversely, those opting to act as connectors must focus on breaking down data silos and providing tools that simplify data access and integration for AI applications.

Education and leadership are crucial for bridging the gap between storage and AI. Storage companies need to hire AI specialists and empower them to influence product development and strategy. This requires a top-down approach, with leadership roles dedicated to understanding and addressing the unique challenges of AI. Without this internal expertise, companies risk creating a disconnect between their AI-focused messaging and the actual capabilities of their products. Moreover, fostering collaboration between storage and AI teams within organizations can lead to more innovative and effective solutions.

Finally, the industry is still in the early stages of addressing the intersection of storage and AI. While the rapid growth of data and the increasing complexity of AI workloads present significant challenges, they also offer opportunities for innovation. Storage companies that can adapt to these demands by developing specialized products, embracing open standards, and fostering cross-disciplinary expertise will be better positioned to succeed. As the market matures, we can expect to see a blending of technologies and a shift toward more integrated and user-friendly solutions that cater to the unique needs of AI applications.

Podcast Information:

Stephen Foskett is the President of the Tech Field Day Business Unit and Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

Brian Booden is the Managing Director at DataGlow IT. You can connect with Brian on X/Twitter and on LinkedIn. Learn more about DataGlow IT on their website.

Rohan Puri is a Storage Infrastructure Engineer. You can connect with Rohan on LinkedIn or on X/Twitter. Learn more about him on his personal website.

Kurtis Kemple is the Director of Developer Relations at Slack. You can connect with Kurtis on LinkedIn or on X/Twitter. Learn more about Kurtis on his website.

Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube or your favorite podcast application so you don’t miss an episode and do give us a rating and a review. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group.

#AIDIFD1 #TFDPodcast #BrianBooden #GestaltIT #RohanPuri #SFoskett #TechFieldDay #TechFieldDayPod #_DigitalVandal

wp.me/p4YpUP-mNU

Stephen Foskett <Stephen@gestaltit.com>
2024-09-10

AI Solves All Our Problems

Although AI can be quite useful, it seems that the promise of generative AI has led to irrational exuberance on the topic. This episode of the Tech Field Day podcast, recorded ahead of AI Field Day, features Justin Warren, Alastair Cooke, Frederic van Haren, and Stephen Foskett considering the promises made about AI. Generative AI was so impressive that it escaped from the lab, being pushed into production before it was ready for use. We are still living with the repercussions of this decision on a daily basis, with AI assistants appearing everywhere. Many customers are already frustrated by these systems, leading to a rapid push-back against the universal use of LLM chatbots. One problem the widespread misuse of AI has already solved is the search for a driver of computer hardware and software sales, though that effect seems to be wearing off. But once we take stock of the huge variety of tools being created, it is likely that we will have many useful new technologies to apply.

https://youtu.be/Ph6ipfZB7z0

Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

Which Problems Does AI Solve?

There is a dichotomy in artificial intelligence (AI) between the hype surrounding generative AI and the practical realities of its implementation. While AI has the potential to address various challenges across industries, the rush to deploy these technologies has often outpaced their readiness for real-world applications. This has led to a proliferation of AI systems that, while impressive in theory, frequently fall short in practice, resulting in frustration among users and stakeholders.

Generative AI, particularly large language models (LLMs), has captured the imagination of marketers and technologists alike. The excitement surrounding these tools has led to their rapid adoption in various sectors, from customer service to content creation. However, this enthusiasm has not been without consequences. Many organizations have integrated AI into their operations without fully understanding its limitations, leading to a backlash against systems that fail to deliver on their promises. The expectation that AI can solve all problems has proven to be overly optimistic, as many users encounter issues with accuracy, reliability, and relevance in AI-generated outputs.

The initial excitement surrounding AI technologies can be likened to previous hype cycles in the tech industry, where expectations often exceed the capabilities of the technology. The current wave of AI adoption is no different, with many organizations investing heavily in generative AI without a clear understanding of its practical applications. This has resulted in a scenario where AI is seen as a panacea for various business challenges, despite the fact that many tasks may be better suited for human intervention or simpler automation solutions.

One of the critical issues with the current AI landscape is the tendency to automate processes that may not need automation at all. This can lead to a situation where organizations become entrenched in inefficient practices, making it more challenging to identify and eliminate unnecessary tasks. The focus on deploying AI as a solution can obscure the need for organizations to critically assess their processes and determine whether they are truly adding value.

Moreover, the rapid pace of AI development raises concerns about the sustainability of these technologies. As companies race to innovate and bring new AI products to market, there is a risk that many of these solutions will not be adequately supported or maintained over time. This could lead to a situation where organizations are left with outdated or abandoned technologies, further complicating their efforts to leverage AI effectively.

Despite these challenges, there is a consensus that AI has the potential to drive significant advancements in various fields. The ability of AI to analyze vast amounts of data and identify patterns can lead to improved decision-making and efficiency in many areas. However, realizing this potential requires a more nuanced understanding of AI’s capabilities and limitations, as well as a commitment to responsible implementation.

The conversation around AI also highlights the importance of data as a critical component of successful AI applications. While the algorithms and models are essential, the quality and relevance of the data fed into these systems are equally crucial. Organizations must prioritize data governance and management to ensure that their AI initiatives yield meaningful results.

As the AI landscape continues to evolve, it is essential for stakeholders to remain vigilant and critical of the technologies they adopt. The promise of AI is significant, but it is vital to approach its implementation with a clear understanding of its limitations and the potential consequences of over-reliance on automated solutions. By fostering a culture of critical thinking and continuous improvement, organizations can better navigate the complexities of AI and harness its potential to drive meaningful change.

Podcast Information:

Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

Alastair Cooke is a CTO Advisor at The Futurum Group. You can connect with Alastair on LinkedIn or on X/Twitter and you can read more of his research notes and insights on The Futurum Group’s website.

Frederic Van Haren is the CTO and Founder at HighFens Inc., Consultancy & Services. Connect with Frederic on LinkedIn or on X/Twitter and check out the HighFens website.

Justin Warren is the Founder and Chief Analyst at PivotNine. You can connect with Justin on X/Twitter or on LinkedIn. Learn more on PivotNine’s website, or read more on Justin’s personal website.

Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube or your favorite podcast application so you don’t miss an episode and do give us a rating and a review. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group.

#AI #AIFD5 #TFDPodcast #DemitasseNZ #FredericVHaren #GestaltIT #JPWarren #SFoskett #TechFieldDay #TechFieldDayPod

wp.me/p4YpUP-mBm

Stephen Foskett <Stephen@gestaltit.com>
2024-08-27

AI is Not a Fad

The current hype about building massive generative AI models with massive hardware investment is just one aspect of AI. This episode of the Tech Field Day podcast features Frederic Van Haren, Karen Lopez, Marian Newsome, and host Stephen Foskett taking a different perspective on the larger world of AI. Our last episode suggested that AI as it is currently being hyped is a fad, but the bigger world of AI is absolutely real. Large language models are maturing rapidly and even generative AI is getting better by the month, but we are rapidly seeing the reality of the use cases for this technology. All neural networks use patterns in historical data to infer results, so any AI engine could hallucinate. But traditional AI is much less susceptible to errors than the much-hyped generative AI models that are capturing the headlines today. AI is a tool that augments our knowledge and decision making, but it doesn’t replace human intelligence. There is a whole world of AI applications that are productive, responsible, and practical, and these are most certainly not a fad.

https://www.youtube.com/watch?v=BOUTKD8itI4&list=PL4esUX7mpOVYdVlyHmbi5xVGdnRHUa6tH&index=1

Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

AI Field Day 5 is our next AI event happening September 11 through September 13. Check out the event page on the Tech Field Day website for more details.

Look Beyond the Hype: AI is Real!

The current hype surrounding massive generative AI models and the substantial hardware investments they require is just one facet of the broader AI landscape. While the media often focuses on these large language models and the billions of dollars spent on supercomputers to support them, AI encompasses much more than this. The reality is that AI is not a fad – it is a multifaceted tool that is rapidly evolving and finding practical applications across various industries.

AI can be divided into two main phases: training and inference. The training phase involves using extensive datasets and significant computational power, often requiring numerous GPUs, to build models. This phase is typically handled by a few large organizations with the resources to manage such complexity. On the other hand, the inference phase, where these models are applied in real-world scenarios, is less resource-intensive and more accessible to consumers and enterprises. This division highlights that while the development of AI models may be complex and resource-heavy, their application can be straightforward and widely beneficial.

The demand for AI is driven by consumers and enterprises seeking to simplify and enhance their operations. This demand ensures that AI is not a passing trend but a technology with staying power. However, the term “AI” is often used as a catch-all phrase, leading to confusion about its true capabilities and applications. For instance, generative AI, which includes models like ChatGPT, is just one type of AI. These models can produce impressive and convincing outputs but are also prone to errors and “hallucinations”—generating incorrect or nonsensical information based on the data they were trained on.

Traditional AI, which has been in use for years in various industries, is generally more reliable and less prone to such errors. Applications of traditional AI include anomaly detection in manufacturing, video analysis in retail, and security. These use cases demonstrate AI’s practical and responsible applications, which are far from being a fad. For example, AI is used in agriculture to monitor crop health and improve yields, a task that does not require the massive computational resources associated with generative AI.

The perception of AI as a fad is partly due to the overhyped and sometimes half-baked applications of generative AI that capture public attention. These applications often promise more than they can deliver, leading to skepticism. However, the underlying technology of AI is robust and continues to mature, offering valuable solutions in various fields. The speed of innovation in AI is accelerating, and while this can lead to unrealistic expectations, it also means that practical applications are continually emerging.

AI is a tool that augments human knowledge and decision-making rather than replacing it. This distinction is crucial for understanding AI’s role in our lives. For instance, AI can assist in generating documentation, analyzing code, or improving search capabilities within an organization. These applications enhance productivity and efficiency without replacing the need for human oversight and expertise.

The trust factor in AI is also significant. As AI becomes more integrated into everyday technologies, it is essential to market and implement it responsibly. This includes ensuring that AI systems are transparent, reliable, and used ethically. For example, non-generative AI systems, which do not generate new content but analyze existing data, are generally more trustworthy and less prone to errors.

AI is not a fad; it is a powerful tool with a wide range of applications that are already making a significant impact. While the hype around generative AI may lead to some disillusionment, the broader field of AI continues to offer practical, responsible, and valuable solutions. As AI technology evolves, it will become even more integrated into various aspects of our lives, enhancing our capabilities and helping us solve complex problems. The key is to approach AI with a clear understanding of its strengths and limitations, ensuring that it is used to augment human intelligence and decision-making responsibly.

Podcast Information

Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

Frederic Van Haren is the CTO and Founder at HighFens Inc., Consultancy & Services. Connect with Frederic on LinkedIn or on X/Twitter and check out the HighFens website.

Karen Lopez is a Senior Project Manager and Architect at InfoAdvisors. You can connect with Karen on X/Twitter or on LinkedIn.

Marian Newsome is the CEO and Founder of Ethical Tech Matters and a cohost of the Tech Aunties Podcast. You can connect with Marian on LinkedIn. Listen to the Tech Aunties Podcast.

Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube or your favorite podcast application so you don’t miss an episode and do give us a rating and a review. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group.

#AI #TFDPodcast #DataChick #FredericVHaren #GestaltIT #SFoskett #TechFieldDay #TechFieldDayPod #TheFuturumGroup

https://wp.me/p4YpUP-myN

Stephen Foskett <Stephen@gestaltit.com>
2024-08-20

AI as We Know It is Just a Fad

Although AI is certain to transform society, not to mention computing, what we know of it today is likely to change, even as the underlying technology endures. This episode of the Tech Field Day podcast brings together Glenn Dekhayser, Alastair Cooke, Allyson Klein, and Stephen Foskett to discuss the real and changing world of AI. Looking at AI infrastructure today, we see massive clusters of GPUs being deployed in the cloud and on-premises to train ever-larger language models, but how much business value do these clusters have long-term? It seems that the true transformation promised by LLM and GenAI will be realized once models are applied across industries with RAG or tuning rather than developing new models. Ultimately AI is a feature of a larger business process or application rather than being a product in itself. We can certainly see that AI-based applications will be transformative, but the vast investment required to build out AI infrastructure to date might never be recouped. Ultimately there is a future for AI, but not the way we have been doing it to date.

https://www.youtube.com/watch?v=4JqYukIg8Z0&list=PL4esUX7mpOVYdVlyHmbi5xVGdnRHUa6tH&index=1

Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

AI Field Day 5 is our next AI event happening September 11 through September 13. Check out the event page on the Tech Field Day website for more details.

We’re Talking About the Wrong Things When It Comes to AI

The current landscape of artificial intelligence (AI) is undergoing rapid transformation, and what we know of it today may soon be considered outdated. The conversation around AI has shifted significantly, especially with the rise of generative AI, which has captured the public’s imagination and driven massive investments in AI infrastructure. However, the sustainability and long-term business value of these investments are increasingly being questioned.

Initially, the excitement around AI was centered on its integration into various applications, promising to automate and enhance tasks such as log file analysis and predictive maintenance. AI’s ability to process large datasets quickly and identify patterns or anomalies offered clear business benefits, such as reducing unplanned downtime and improving service resolution times. This practical application of AI was seen as a valuable tool for enterprises.

However, the focus has shifted towards generative AI and the development of ever-larger language models. This shift has led to discussions about the power consumption, global trade in GPUs, and the phenomenon of AI “hallucinations”—where AI generates incorrect or nonsensical outputs. These issues pose significant challenges for enterprise IT as they attempt to integrate AI into business processes.

The current approach to AI, characterized by massive GPU clusters and high power consumption, is not seen as sustainable. The investments in AI infrastructure are substantial, with companies spending hundreds of millions of dollars to build single foundational models. This approach is not scalable and does not deliver significant business value to most organizations. The high costs and limited returns suggest that this model of AI development may not be viable in the long term.

There is a growing recognition that AI should be viewed as a feature of larger business processes or applications rather than a standalone product. The true transformation promised by AI will likely be realized when models are applied across industries with techniques such as retrieval-augmented generation (RAG) or fine-tuning existing models, rather than developing new ones from scratch. This approach can provide more immediate and practical business benefits without the need for massive infrastructure investments.

The rapid pace of AI development also means that the technology is constantly evolving. Enterprises are not yet ready for full-scale AI model training, as they often lack the necessary data preparation and infrastructure. Most enterprises are currently using existing models and focusing on RAG or fine-tuning, but even these approaches present challenges. The expectations for AI often exceed the current capabilities, leading to a mismatch between anticipated and actual outcomes.

The future of AI will likely involve more efficient and scalable solutions. Innovations such as on-device inferencing and smaller, more optimized models are already showing promise. These developments could reduce the need for large-scale GPU clusters and make AI more accessible and practical for a wider range of applications.

In conclusion, while AI is certain to transform society and computing, the current approach to AI infrastructure and development is not sustainable. The focus should shift towards integrating AI as a feature within larger business processes and finding more efficient ways to deploy AI technologies. The rapid pace of change in AI means that what we know of it today may soon be considered a fad, but the underlying potential of AI to drive business value and innovation remains strong.

Podcast Information

Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

Alastair Cooke is a CTO Advisor at The Futurum Group. You can connect with Alastair on LinkedIn or on X/Twitter and you can read more of his research notes and insights on The Futurum Group’s website.

Allyson Klein is a Global Marketing and Communications Leader and the Founder of The Tech Arena. You can connect with Allyson on X/Twitter or LinkedIn. Find out more information on the Tech Arena website.

Glenn Dekhayser is the Global Principal at Equinix. You can connect with Glenn on LinkedIn and learn more on his website. You can learn more about Equinix on their website.

Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube or your favorite podcast application so you don’t miss an episode and do give us a rating and a review. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group.

#AI #TFDPodcast #DemitasseNZ #GDekhayser #GestaltIT #SFoskett #TechAllyson #TechFieldDay #TechFieldDayPod #TheFuturumGroup

https://wp.me/p4YpUP-myf

Stephen Foskett (Stephen@gestaltit.com)
2024-07-30

The Mainframe is Still Going Strong

Despite the hype about modern applications, the mainframe remains central to enterprise IT and is rapidly adopting new technologies. This episode of the Tech Field Day podcast features Steven Dickens, Geoffrey Decker, and Jon Hildebrand talking to Stephen Foskett about the modern mainframe prior to the SHARE conference. The modern datacenter is rapidly adopting technologies like containerization, orchestration, and artificial intelligence, and these are coming to the mainframe world as well. And the continued importance of mainframe applications, especially in finance and transportation, makes the mainframe more important than ever. There is a tremendous career opportunity in mainframes as well, with recent grads commanding high salaries and working with exciting modern technologies. Modern mainframes run Linux natively, support OpenShift and containers, and support all of the latest languages and programming models in addition to PL/I, COBOL, Db2, and of course z/OS. We’re looking forward to bringing the latest in the mainframe space from SHARE to our audience.

https://www.youtube.com/watch?v=QFtWQzJmvaY&list=PL4esUX7mpOVYdVlyHmbi5xVGdnRHUa6tH&index=1

Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

For more information on SHARE Kansas City 2024, check out the event page. For more events, head to the Tech Field Day website for details.

The Mainframe Remains Vital in DevOps and AI

Despite the rapid evolution and adoption of modern applications in enterprise IT, the mainframe continues to play a pivotal role, especially in industries such as finance and transportation. The mainframe is not only enduring but also evolving by integrating new technologies like containerization, orchestration, and artificial intelligence. This integration is crucial for maintaining operational resilience, enhancing cybersecurity, and improving application development through DevOps practices.

Mainframes are the backbone of many critical systems, handling vast amounts of transactional data for credit card processing, airline operations, government departments, and tax offices. The reliability and robustness of mainframes in these high-stakes environments underscore their continued relevance. Recent high-profile incidents, such as the widespread disruption caused by a faulty CrowdStrike update to Microsoft Windows systems, highlight the importance of operational resilience, an area where mainframes excel.

The adoption of AI in mainframe environments is particularly noteworthy. AI is being infused into various tools to enhance coding and operational efficiencies. Major players like BMC, IBM, and Broadcom have made significant announcements regarding their AI initiatives, which are aimed at improving the mainframe’s capabilities. The integration of AI allows for real-time decision-making processes, such as fraud detection during credit card transactions, directly within the mainframe environment.
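The point of in-transaction fraud detection is that the check runs inline, before the transaction commits, rather than in a separate batch system. The sketch below uses a simple rule-based score as a stand-in for a trained model; the rules and thresholds are illustrative only, not any vendor's actual scoring logic.

```python
# Toy inline fraud check: score the transaction and decide before committing.
# A real deployment would call a trained model (possibly on the mainframe's
# AI accelerator); this rule-based score is just a hypothetical stand-in.

def fraud_score(amount, home_country, txn_country, txns_last_hour):
    score = 0.0
    if txn_country != home_country:
        score += 0.4          # cross-border transaction
    if amount > 5000:
        score += 0.3          # unusually large amount
    if txns_last_hour > 10:
        score += 0.3          # burst of activity
    return score

def authorize(amount, home_country, txn_country, txns_last_hour):
    # Decline inline when the score crosses an illustrative threshold.
    return fraud_score(amount, home_country, txn_country, txns_last_hour) < 0.5

# A routine purchase passes; a large foreign burst of activity is declined.
assert authorize(120, "US", "US", 2)
assert not authorize(9000, "US", "RO", 14)
```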

The educational landscape around mainframes is also evolving. Institutions like Northern Illinois University (NIU) are reviving their mainframe curricula to address the growing demand for skilled mainframe developers. Courses in assembler, COBOL, and other mainframe-related subjects are being reintroduced to prepare the next generation of mainframe professionals. Despite the historical decline in mainframe-focused education, the dire need for these skills in the industry is prompting universities to reconsider their course offerings.

The career prospects in the mainframe domain are promising. Recent graduates with mainframe skills, particularly in COBOL, are highly sought after by major corporations such as Citibank, Wells Fargo, and Walmart. The salaries for these positions are competitive, often approaching six figures right out of college. This demand is driven by the aging workforce of current mainframe professionals and the critical nature of mainframe applications in enterprise environments.

Technologically, modern mainframes are versatile. They can run multiple operating systems, including Linux distributions like SLES, Debian, RHEL, and Ubuntu, as well as traditional mainframe operating systems like z/OS. This versatility extends to the ability to run containerized applications using platforms like OpenShift directly on the mainframe. This reduces latency and enhances performance by bringing cloud workloads closer to the mainframe’s robust processing capabilities.

The mainframe’s ability to handle modern workloads is exemplified by its support for containerized Java applications and the integration of open-source packages like Podman. The hardware accelerators built into mainframes enable the efficient execution of AI workloads, further enhancing their capabilities for modern enterprise needs.

The mainframe ecosystem is also seeing innovative solutions aimed at simplifying development and operations. For instance, companies like Pop-Up Mainframe are making it easier for developers to create and test applications on mainframes without needing extensive mainframe-specific knowledge. This aligns with the broader DevOps movement and facilitates the integration of mainframe environments into modern development workflows.

In summary, the mainframe is far from obsolete. It is a dynamic and evolving platform that continues to be central to enterprise IT. With its adoption of new technologies, robust educational programs, and promising career opportunities, the mainframe is well-positioned to remain a cornerstone of enterprise computing for years to come.

Podcast Information:

Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

Steven Dickens is Chief Technology Officer at The Futurum Group. You can connect with Steven on X/Twitter or on LinkedIn and listen to his frequent appearances on Infrastructure Matters.

Geoffrey Decker is an instructor for mainframe curriculum at Northern Illinois University. You can connect with Geoffrey on X/Twitter or on LinkedIn. Learn more about him on his NIU Faculty Profile.

Jon Hildebrand is an automation and observability expert. You can connect with Jon on LinkedIn or on X/Twitter. Learn more about Jon by reading his personal blog.

Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube or your favorite podcast application so you don’t miss an episode and do give us a rating and a review. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group.

#SHAREkc2024 #TFDPodcast #TFDx #GeoffreyMusik #GestaltIT #SFoskett #SnoopJ123 #StevenDickens3 #TechFieldDay #TechFieldDayPod #TheFuturumGroup

https://wp.me/p4YpUP-mtt

Stephen Foskett (Stephen@gestaltit.com)
2024-07-16

Open Source Helps Small Businesses Modernize Applications

Open-source platforms and managed services are a huge help when it comes to modernizing applications, especially for smaller businesses. This episode of the Tech Field Day podcast, recorded at AppDev Field Day, includes Jack Poller, Stephen Foskett, and Paul Nashawaty discussing the challenges and solutions for small businesses in modernizing applications. Small businesses often face significant challenges when it comes to modernizing their applications, primarily due to limited resources and the complexity of cutting-edge technologies. While larger enterprises might have the capacity to adopt sophisticated technologies like microservices, AI, and advanced security systems, smaller companies struggle to keep pace. However, the availability of open-source technologies and managed services provides a viable pathway for these businesses to modernize incrementally. By leveraging open-source platforms and engaging with managed services, small businesses can modernize their applications without the need for extensive in-house expertise or substantial upfront investment. This approach allows them to progressively adopt new technologies and improve their competitive position in the market.

https://youtu.be/z9dkXp5dT3k

Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

Watch the presentations from AppDev Field Day Event and more on the Tech Field Day website.

Open-source platforms and managed services are increasingly becoming pivotal in aiding small businesses to modernize their applications. This trend is driven by the need for these businesses to update their systems without the heavy financial and resource burdens that typically accompany such transformations. Open-source solutions provide a cost-effective and flexible alternative to proprietary software, offering a wide range of tools and libraries that businesses can adapt to their specific needs. This is particularly beneficial for small businesses that may not have the extensive IT departments or budgets of larger corporations but still need to compete in a technology-driven market.

The use of managed services further complements the advantages offered by open-source technologies. Managed services allow businesses to outsource certain IT functions, such as application management, cloud services, and cybersecurity, to specialized providers. This not only helps small businesses manage costs more effectively by reducing the need for in-house IT staff but also ensures that they have access to the latest technologies and expertise. Managed service providers can offer scalable solutions that grow with the business, ensuring that IT capabilities align with business needs without upfront investments in hardware or software.

One significant challenge for small businesses looking to modernize their applications is the complexity of new technologies. Advanced solutions like microservices architectures, artificial intelligence (AI), and sophisticated security protocols can be daunting. However, open-source communities often provide extensive documentation, user forums, and support that can help small businesses navigate these complexities. By engaging with these communities, small businesses can access a wealth of knowledge and experience, reducing the learning curve associated with new technologies.

Moreover, open-source software often encourages innovation through community collaboration. Small businesses can benefit from the continuous improvements and innovations contributed by developers worldwide. This collaborative approach not only accelerates the development process but also introduces small businesses to best practices and emerging trends in software development.

However, adopting open-source software does come with challenges, such as the need for technical expertise to customize and maintain the software. This is where managed services play a crucial role. By partnering with providers that offer tailored support and services, small businesses can leverage the benefits of open-source software without needing to develop deep technical expertise internally. Managed service providers can handle the complex aspects of software integration, security, and compliance, allowing small businesses to focus on their core operations.

In conclusion, the combination of open-source platforms and managed services provides a powerful pathway for small businesses to modernize their applications. This approach not only helps manage costs and reduce complexity but also enables small businesses to tap into advanced technologies and innovate faster. As the digital landscape continues to evolve, small businesses that leverage these tools effectively will be better positioned to compete and succeed in the modern economy.

Podcast Information:

Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

Paul Nashawaty is a Practice Lead focused on Application Development Modernization at The Futurum Group. You can connect with Paul on LinkedIn and learn more about his research and analysis on The Futurum Group’s website.

Jack Poller is an industry-leading cybersecurity analyst and Founder of Paradigm Technica. You can connect with Jack on LinkedIn or on X/Twitter. Learn more on Paradigm Technica’s website.

Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube or your favorite podcast application so you don’t miss an episode and do give us a rating and a review. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group.

#ADFD1 #TFDPodcast #GestaltIT #PNashawaty #Poller #SFoskett #TechFieldDay #TechFieldDayPod

https://wp.me/p4YpUP-mqG

Stephen Foskett (Stephen@gestaltit.com)
2024-07-02

Everything is the Cloud and The Cloud is Everything

The cloud operating model is everywhere these days, and just about everything is now called cloud. This episode of the Tech Field Day podcast, recorded live at Cloud Field Day 20, includes Stephen Foskett, Jeffrey Powers, Alastair Cooke, and Steve Puluka discussing the true meaning of the term cloud computing. Cloud has evolved well beyond its initial definition by NIST in 2011. The cloud concept is ubiquitous, adopted everywhere from personal devices to industrial IoT and data centers. The cloud operating model abstracts the complexity of underlying infrastructure, allowing businesses to focus on their core differentiators. Ultimately the panelists concluded that while the cloud is everywhere, not everything is the cloud.

https://www.youtube.com/watch?v=qHRTUKgFfiQ&list=PL4esUX7mpOVYdVlyHmbi5xVGdnRHUa6tH&index=1

Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

Watch the presentations and more on the Cloud Field Day 20 Event Page here.

Everything in tech is called “cloud,” from personal devices to industrial IoT, data centers, and beyond. The evolution of cloud computing, which was formally defined by NIST in 2011, has seen the concept permeate various sectors, transforming how services are delivered and consumed. Initially, cloud services were the domain of large data centers operated by companies like AWS, Azure, and Alibaba. However, over the past decade, cloud principles have been adopted by smaller companies, the VMware community, and even personal users, making the cloud a universal operating model rather than just a location.

The core appeal of the cloud lies in its ability to abstract the complexities of underlying infrastructure, allowing businesses to focus on differentiating their services rather than managing hardware intricacies. This shift has led to a significant reduction in the need for detailed knowledge about hardware configurations, as cloud services handle these aspects seamlessly. The cloud operating model enables businesses to allocate resources more efficiently, focusing on application development and operational excellence rather than hardware maintenance.

The future of storage and computing is increasingly leaning towards a combination of cloud services and mobile devices. The younger generation, for instance, is more inclined to use mobile devices for tasks traditionally performed on laptops. This trend is supported by the cloud’s ability to provide a seamless experience across devices, ensuring that data and applications are accessible regardless of the hardware in use. This shift is evident in the rise of devices like Chromebooks, which rely heavily on cloud services for storage and application delivery.

In the enterprise realm, the cloud’s influence is equally profound. While data gravity and latency considerations still necessitate on-premises deployments for certain applications, the cloud operating model is becoming the standard. Modern applications are designed to accommodate the latency and caching mechanisms inherent in cloud environments, enabling seamless operation regardless of the physical location of the infrastructure. Legacy applications, while still present, are gradually being replaced or virtualized to fit into this new paradigm.

The edge, traditionally characterized by proprietary hardware, has also undergone a transformation. Today, edge locations utilize standard servers running virtual machines or containerized applications orchestrated by platforms like Kubernetes. This approach mirrors the cloud operating model, where local servers act as caches for cloud services, ensuring resilience and flexibility. The edge has, in many ways, become more cloud-like than traditional data centers, embracing the principles of abstraction and orchestration.

Despite these advancements, the cloud is not a one-size-fits-all solution. Certain applications, particularly those with stringent latency and data sovereignty requirements, may still necessitate on-premises deployments. However, the overarching trend is towards a cloud-centric model, where infrastructure is managed and consumed as a service, regardless of its physical location. This shift is driven by the need for agility, scalability, and cost-efficiency, which the cloud model inherently provides.

In conclusion, while not everything is the cloud, the cloud is indeed everywhere. It has become the default operating model for modern IT services, extending from personal devices to enterprise data centers and edge locations. The cloud’s principles of abstraction, orchestration, and service-based delivery have permeated all aspects of technology, making it an integral part of the digital landscape. As technology continues to evolve, the cloud will remain a central theme, shaping how services are delivered and consumed in an increasingly connected world.

Podcast Information:

Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

Steve Puluka is an IP architect and a retired network administrator. You can connect with Steve on X/Twitter or on LinkedIn and learn more about him on his website.

Alastair Cooke is a CTO Advisor at The Futurum Group. You can connect with Alastair on LinkedIn or on X/Twitter and you can read more of his research notes and insights on The Futurum Group’s website.

Jeffrey Powers is the Moderator and Lead Tech at Geekazine. You can connect with Jeffrey on X/Twitter or on LinkedIn and learn more about Geekazine on their website.

Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube or your favorite podcast application so you don’t miss an episode and do give us a rating and a review. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group.

#CFD20 #TFDPodcast #DemitasseNZ #Geekazine #SFoskett #SPuluka #TechFieldDay #TechFieldDayPod #TheFuturumGroup

https://wp.me/p4YpUP-mpk

Stephen Foskett (Stephen@gestaltit.com)
2024-06-25

GenAI is Revolutionizing the Enterprise

Generative AI will revolutionize enterprise IT, but not in the way people expect. This episode of the Tech Field Day podcast includes Stephen Foskett discussing the impact of GenAI with Jack Poller, Calvin Hendryx-Parker, and Josh Atwell at AppDev Field Day. The discussion centered on the potential impact of generative AI on enterprises, debating whether it will significantly transform business operations or merely offer incremental improvements. Generative AI is still in its infancy and may not yet provide revolutionary benefits, but there is great potential for AI in automating tasks and enhancing efficiency, despite challenges in implementation and validation. Enterprises must be realistic about applying AI: it is important to understand its real capabilities and limitations, as well as the role of existing vendors in integrating AI functionality into their products.

https://youtu.be/TzIACeYJp54

Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

Watch the presentations and more on the AppDev Field Day 1 Event Page.

Generative AI (GenAI) is becoming a focal point of discussion across various industries, extending beyond software development and IT into the realms of business, marketing, and executive decision-making. The question remains whether GenAI can substantially impact enterprises or if it remains largely a buzzword with limited practical application.

The current state of GenAI is nascent, primarily enhancing existing processes rather than creating revolutionary changes. Enterprises are contemplating the investment required to integrate GenAI, questioning whether it will merely speed up current operations or genuinely transform business models. The challenge lies in identifying groundbreaking applications that justify the investment in GenAI.

There are compelling arguments for GenAI’s potential benefits within enterprises. For instance, it can automate unit tests, functional tests, and other repetitive tasks, saving significant time and resources. Additionally, GenAI can facilitate complex queries and data analysis, enabling businesses to extract more value from their data. However, these applications often appear as incremental improvements rather than revolutionary changes.
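Automated unit testing is a concrete example of the repetitive work described above: table-driven test cases are mechanical to enumerate, which is exactly the kind of drafting an assistant can do in bulk for a human to review. The `validate_sku` function and its rules below are hypothetical, purely to illustrate the shape of the task.

```python
# A hypothetical validator and the kind of repetitive, table-driven test
# cases a GenAI assistant can draft in bulk (and a human then reviews).

def validate_sku(sku: str) -> bool:
    # A SKU here is three uppercase letters, a dash, and four digits.
    parts = sku.split("-")
    return (len(parts) == 2
            and len(parts[0]) == 3 and parts[0].isalpha() and parts[0].isupper()
            and len(parts[1]) == 4 and parts[1].isdigit())

# Each (input, expected) row is tedious to write by hand but trivial to generate.
CASES = [
    ("ABC-1234", True),
    ("AB-1234", False),    # prefix too short
    ("ABC-123", False),    # suffix too short
    ("abc-1234", False),   # lowercase prefix
    ("ABC1234", False),    # missing dash
]

for sku, expected in CASES:
    assert validate_sku(sku) is expected, sku
```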

A significant concern for enterprises is the defensive posture they must adopt regarding AI. Employees across various departments are already using public AI frameworks, raising issues about data security and intellectual property. Companies are increasingly looking to develop internal AI models using proprietary data to mitigate these risks, although the timeline for achieving valuable outcomes remains uncertain.

One of the critical advantages of GenAI is its ability to understand and process natural language, which can simplify complex tasks like setting security policies or automating customer service interactions. This capability can reduce the need for extensive manual intervention, potentially decreasing errors and increasing efficiency.

However, the revolutionary impact of GenAI is still debatable. Many enterprises may not fully understand what GenAI entails, often driven by the hype rather than a clear strategy. While GenAI can offer significant improvements in specific areas, such as marketing automation or internal data analysis, these applications might not be as transformative as some might hope.

The integration of GenAI into existing enterprise systems, such as CRM platforms or security frameworks, may offer more immediate and tangible benefits. For example, using GenAI to enhance search capabilities within a company’s knowledge base or to automate routine tasks can provide value without requiring a complete overhaul of existing processes.

Despite the potential benefits, there is a need for enterprises to approach GenAI with a clear understanding of its capabilities and limitations. Investing in GenAI should be driven by well-defined use cases that align with the company’s strategic goals rather than by the desire to follow industry trends.

In conclusion, while GenAI holds promise for enhancing various aspects of enterprise operations, its ability to move the needle significantly is still uncertain. The technology’s true value will likely emerge over time as companies experiment with different applications and integrate GenAI into their broader digital transformation strategies. For now, enterprises should focus on realistic, incremental improvements while keeping an eye on the evolving landscape of GenAI.

Podcast Information:

Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

Jack Poller is an industry-leading cybersecurity analyst and Founder of Paradigm Technica. You can connect with Jack on LinkedIn or on X/Twitter. Learn more on Paradigm Technica’s website.

Calvin Hendryx-Parker is the co-founder and CTO of Six Feet Up, a Python development and Cloud consulting company. You can connect with Calvin on LinkedIn or on X/Twitter. Learn more about Six Feet Up on their website.

Josh Atwell is a former data center and cloud automation architect and is now a marketer/consultant. You can connect with Josh on LinkedIn or on X/Twitter and learn more about him on his website.

Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube or your favorite podcast application so you don’t miss an episode and do give us a rating and a review. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group.

#ADFD1 #TFDPodcast #CalvinHP #GestaltIT #JoshAtwell #Poller #SFoskett #TechFieldDay #TechFieldDayPod #TheFuturumGroup

https://wp.me/p4YpUP-moD

Stephen Foskett (Stephen@gestaltit.com)
2024-06-11

Cloud Native is Just a Marketing Term

Software developers used to use the term cloud native to describe applications that are designed for the cloud, but today it seems to be more of a term for containerized applications. This episode of the Tech Field Day podcast, recorded ahead of Cloud Field Day 20, includes Guy Currier, Jack Poller, Ziv Levy, and Stephen Foskett discussing the true meaning of cloud native today. Merely running a monolithic application in containers doesn’t make it cloud native, though it certainly can be beneficial. To be truly cloud native, an application has to be microservices based and scalable, and built to take advantage of modern application platforms and resources. There is some question whether a cloud native application needs to have API access, telemetry and observability, service management, network and storage integration, and security. But ultimately the words used to describe an application are less important than its value and benefits. Although it is disappointing that the definition of cloud native has been watered down, the core concepts still have value.

https://youtu.be/hjnHYHPlGVM

Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio

For more information on upcoming events or to watch the presentations, head to the Tech Field Day website and click on the Cloud Field Day 20 event page.

The Dilution and Value of “Cloud Native”

Software developers originally coined the term “cloud native” to describe applications specifically designed for cloud environments. However, over time, this term has evolved, or arguably devolved, into a buzzword often associated with containerized applications.

Initially, “cloud native” was a term to describe applications that were fundamentally designed to leverage cloud resources. These applications were built with the understanding that cloud resources could be ephemeral and not always persistently available. This necessitated a design philosophy that embraced the cloud’s self-service nature and inherent fragility. Fast forward to the present, and “cloud native” is often synonymous with containerization, particularly Kubernetes. This shift has led some to question whether the term retains its original meaning or has been diluted.

In the cybersecurity realm, “cloud native” is frequently used to distinguish applications developed specifically for the cloud from legacy applications that were adapted to run in cloud environments. This distinction is crucial for understanding the capabilities and limitations of a given application. However, the term’s overuse and varied interpretations can lead to confusion, as different vendors and stakeholders may have different definitions of what constitutes a cloud native application.

Within the infrastructure world, “cloud native” has become almost interchangeable with “Kubernetes.” This association is not without merit, as Kubernetes has become a cornerstone of modern cloud infrastructure. However, equating cloud native solely with containerization overlooks the broader architectural principles that truly define cloud native applications.

A critical aspect of cloud native applications is their design around microservices and scalability. Simply running a monolithic application in a container does not make it cloud native. True cloud native applications are built to take full advantage of modern application platforms and resources. This includes being microservices-based, scalable, and capable of leveraging the cloud’s dynamic nature.

There is a question as to whether cloud native applications need to have API access, telemetry, observability, service management, network and storage integration, and robust security. While these features are often associated with cloud native applications, they are not necessarily definitional. For instance, APIs are a standard way to interact with cloud native systems, allowing for automation and scalability. However, an application could theoretically be cloud native without relying heavily on APIs.
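To make the API point concrete, here is a minimal sketch of the kind of interface a cloud native service exposes: a JSON health endpoint that an orchestrator or automation tool can poll. It uses only the Python standard library; the `/healthz` path and payload are illustrative conventions, not any particular platform's API.

```python
# Minimal cloud-style API sketch: a JSON health endpoint served over HTTP,
# pollable by orchestrators and automation. Stdlib only; path is illustrative.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/healthz":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# An automation client checks health exactly as an orchestrator would.
url = f"http://127.0.0.1:{server.server_address[1]}/healthz"
with urllib.request.urlopen(url) as resp:
    health = json.loads(resp.read())
server.shutdown()
```

The design point is that the same endpoint serves humans, scripts, and platforms alike, which is what makes API-driven automation and scaling possible.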

A similar question can be raised about telemetry and observability. These features provide critical insights into how an application is performing and behaving in a cloud environment, making them indispensable for managing cloud native applications effectively.
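A toy version of that telemetry makes the idea tangible: record per-request latencies and errors, then summarize them into the counters and percentiles an observability stack would scrape. The metric names and sample values below are illustrative, not tied to any specific monitoring product.

```python
# Toy telemetry recorder: per-request latencies rolled up into the kind of
# summary metrics an observability stack scrapes. Metric names are illustrative.
import statistics

class Telemetry:
    def __init__(self):
        self.latencies_ms = []
        self.errors = 0

    def record(self, latency_ms, ok=True):
        self.latencies_ms.append(latency_ms)
        if not ok:
            self.errors += 1

    def summary(self):
        return {
            "requests_total": len(self.latencies_ms),
            "errors_total": self.errors,
            "latency_p50_ms": statistics.median(self.latencies_ms),
            "latency_max_ms": max(self.latencies_ms),
        }

t = Telemetry()
for ms, ok in [(12, True), (48, True), (230, False), (15, True)]:
    t.record(ms, ok)
# summary() -> {'requests_total': 4, 'errors_total': 1,
#               'latency_p50_ms': 31.5, 'latency_max_ms': 230}
```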

Security and networking are also crucial components. Cloud native applications must be designed with security in mind, leveraging various services and controls to ensure a secure stack. Networking, often overlooked, plays a vital role in ensuring the responsiveness and availability of cloud native applications.

Despite the term’s dilution, the core concepts of cloud native applications—scalability, performance, stability, and predictability—remain valuable. These principles enable the development of applications that can grow and adapt to meet changing demands, providing a significant advantage over traditional monolithic architectures.

Ultimately, while “cloud native” may be used as a marketing term, its true value lies in the benefits it delivers. As long as an application meets the goals of scalability, performance, and resilience, the specific terminology becomes less important. However, it is crucial for vendors and developers to back up their claims with tangible results, ensuring that the term “cloud native” retains its significance in the industry.

Podcast Information:

Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.

Guy Currier is the VP and CTO of Visible Impact, part of The Futurum Group. You can connect with Guy on X/Twitter and on LinkedIn. Learn more about Visible Impact on their website. For more insights, go to The Futurum Group’s website.

Ziv Levy is the Founder and CEO at Cloudsulting, LLC. You can connect with Ziv on X/Twitter or on LinkedIn. Read more about his thoughts through his LinkedIn updates.

Jack Poller is an industry-leading cybersecurity analyst and Founder of Paradigm Technica. You can connect with Jack on LinkedIn or on X/Twitter. Learn more on Paradigm Technica’s website.

Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube or your favorite podcast application so you don’t miss an episode and do give us a rating and a review. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group.

#CFD20 #TFDPodcast #GestaltIT #GuyCurriersFeed #Poller #SFoskett #TechFieldDay #TechFieldDayPod #ZivLevy_

https://wp.me/p4YpUP-mlR

2024-04-03

There is a hazardous amount of AI-generated and SEO-oriented content being generated, and the solution is real stories from real communities. In the first episode of the Tech Field Day Podcast, recorded on-site at AI Field Day, Stephen Foskett chats with Frederic Van Haren, Gina Rosenthal, and Colleen Coll about confronting inauthentic content.
#ColleenColl #Digi_Sunshine #FredericVHaren #SFoskett #TechFieldDay #TechFieldDayPod #AIFD4 #TFDPodcast
gestaltit.com/podcast/stephen/
