Category: Science, Technology & Security

  • Intelligence: The Crux of Targeted Assassinations

    Intelligence: The Crux of Targeted Assassinations

    The USA has pursued a strategy of targeted assassination since the days of the Second World War, and has co-opted allies such as Israel, Australia, and the UK. Targeted killings in the Middle East have been led by Israel with active intelligence support from the USA. With modern ISR capabilities, targets can be monitored or searched for around the clock, 365 days a year. A world integrated by common communications protocols and digital standards for the ease of normal business becomes, by that very commonality, vulnerable to intelligence agencies.

     

    On July 31, 2022, Osama bin Laden’s successor as the global head of Al Qaeda, Ayman al-Zawahiri, stepped out onto the balcony of his Taliban safe house in Kabul’s tony Wazir Akbar Khan area to catch a breath of fresh air and a bit of sunshine. About 40,000 feet above, an American Predator B (also known as the MQ-9 Reaper) drone, loitering to get a glimpse of him, caught him in its camera. After its operators in Nevada, USA, confirmed his identity with facial recognition technology, they ordered it to fire its Hellfire R9X missile. The Hellfire is a small air-to-ground missile (AGM), about 100 lb and five feet long, that races down a reflected laser beam with unerring accuracy and costs about $150,000. The R9X, developed at the express request of Barack Obama, who wanted to minimise the collateral damage caused by an explosive charge, is a kinetic weapon that unsheathes multiple blades from its fuselage as it approaches the target at almost 900 mph, like a whirling swordsman. Al-Zawahiri didn’t stand a chance.

    The USA and some other countries have bevies of satellites orbiting on preselected trajectories to watch over areas of interest. These satellites not only listen to targets but also track them, identifying faces and vehicles by their number plates. Osama bin Laden never looked up during his morning walks at his Abbottabad residence, but he was recognised by his height, by the length of his shadow at that time of day, and by his gait. Al-Zawahiri was either careless or underestimated America’s appetite for his head. There was still a $25 million reward on it, appetising enough for any informant.

    The number of active mobile phones worldwide exceeds 15 billion, which means that many people have more than one. Of these, 7.2 billion are smartphones, connecting people with huge reservoirs of information and content. India has 1.28 billion phones and China has 1.9 billion. The USA follows with 327 million, and a dysfunctional country like Pakistan has 125 million. Even in countries with little semblance of a government or a state, such as Somalia, Afghanistan, Mali or Libya, there are functioning mobile phone networks.

    As of June 30, 2021, there were about 4.86 billion internet users worldwide. Of these, 44.8% were in Asia, 21.5% in Europe and 11.4% in all of North America. India was one of the last countries still operating a telegraph service, and as of the end of 2021, even that is in the past. Literally, it’s all up in the air now.

    Since data exchanged on cellular and internet networks fly through the ether, rather than as pulses racing through copper wires, they are easier to net by electronic interception. But these nets catch messages in huge numbers, and this is where supercomputers come in. The messages netted every moment are run through sieves of sophisticated and complex computer programs that can simultaneously decode, detect and unravel them. By further analysing the patterns of incoming and outgoing calls and data transfers for the sending and receiving terminals or phones, these programs can tell the agency seeking information, with a fair probability of accuracy, what is going on and who is up to what.
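The sieve-then-pattern-analysis pipeline described above can be sketched in miniature. The traffic, phone numbers and watchwords below are all invented for illustration; real systems operate at vastly greater scale and with far more sophisticated linguistics and graph analysis.

```python
from collections import Counter, defaultdict

# Hypothetical intercepted traffic: (sender, receiver, text) tuples.
# Every number and phrase here is invented for illustration.
TRAFFIC = [
    ("+100", "+200", "meet at the usual place tonight"),
    ("+100", "+300", "package arrives tomorrow"),
    ("+400", "+500", "happy birthday to you"),
    ("+100", "+200", "confirm the package"),
]

WATCHWORDS = {"package", "meet"}  # a toy keyword sieve

def sieve(traffic, watchwords):
    """First pass: keep only messages containing any watchword."""
    return [(s, r, t) for s, r, t in traffic
            if watchwords & set(t.lower().split())]

def traffic_analysis(hits):
    """Second pass: count messages per terminal and record contacts --
    the 'patterns of incoming and outgoing calls' the text describes."""
    degree = Counter()
    contacts = defaultdict(set)
    for sender, receiver, _ in hits:
        degree[sender] += 1
        degree[receiver] += 1
        contacts[sender].add(receiver)
    return degree, contacts

hits = sieve(TRAFFIC, WATCHWORDS)
degree, contacts = traffic_analysis(hits)
print(degree.most_common(1))  # the busiest terminal in the flagged traffic
```

Even this toy version shows why metadata alone is valuable: the terminal +100 surfaces as a hub without anyone reading the messages in full.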

    The problem is that since this information also passes through mobile phone networks and Internet Service Providers (ISPs), where the electronic blips are actually decoded into voice and digital data, private players too can gain access to it.

    A few years ago we had the case of the infamous Amar Singh CDs, which titillated so many with their graphic content and low-brow conversations featuring the likes of Anil Ambani, Jayaprada, Bipasha Basu and some others. Then we had the episode of the Radia tapes, where we were privy to the machinations of Tata’s corporate lobbyist in the national capital fixing policy, positioning ministers and string-pulling media stars. More useful than all this, a mobile phone, by the nature of its technology, is also a personalised GPS indicator: it tells the network where that phone is at any instant it is switched on. The Al Qaeda terrorist and US citizen Anwar al-Awlaki was blasted by a Hellfire missile fired from a CIA Predator drone flying over Yemen, with the coordinates provided by Awlaki’s own mobile phone.

    Since a mobile phone is usually with you, it tells the network (and other interested parties) where you are or were, and even where you are headed. If you are on a certain street, since it reveals exactly where you are and the direction of your movement, it can tell you where the next pizza place is, or where and what is on sale. This too is a breach of privacy, but one that is often useful to you. If you are up to no good, however, a switched-on mobile phone is a certain giveaway.
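The "personalised GPS indicator" effect comes from the network's ability to estimate a handset's distance from several cell towers at once. The sketch below, with invented tower positions and ranges, shows the basic trilateration arithmetic; real networks combine timing advance, signal strength and many more measurements, but the geometry is the same.

```python
import math

def trilaterate(towers, dists):
    """Estimate a handset's (x, y) position from three tower positions
    and range estimates. Subtracting the three circle equations pairwise
    eliminates the squared unknowns, leaving a 2x2 linear system that is
    solved here with Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = towers
    d1, d2, d3 = dists
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Invented example: three towers and ranges consistent with a phone at (3, 4).
towers = [(0, 0), (10, 0), (0, 10)]
dists = [5.0, math.sqrt(65), math.sqrt(45)]
x, y = trilaterate(towers, dists)
```

With noisy real-world ranges the three circles do not meet at a point, so operational systems solve the same equations in a least-squares sense instead.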

    That’s what gave away Osama bin Laden in the end. A momentary indiscretion by a trusted courier and bodyguard, and a name gleaned from a long-ago waterboarding session, was all it took. To see what happened next, watch Kathryn Bigelow’s “Zero Dark Thirty” (now on Netflix and YouTube).

    The NSA’s eavesdropping mission covers radio broadcasts from various organisations and individuals, the internet, telephone calls, and other intercepted forms of communication. Its secure communications mission covers military, diplomatic, and all other sensitive, confidential or secret government communications. The NSA is all hi-tech: it collects intelligence from four geostationary satellites that track and monitor millions of conversations, and its banks of high-speed supercomputers process all these messages for certain phrases and patterns of conversation to decide whether the persons at either end are worthy of further interest. Link this information with the data from the CIA’s orbiting satellites watching the movements of groups, individuals and vehicles, and you have a broad picture of what people are doing.

    According to the Washington Post, “every day, collections systems at the National Security Agency intercept and store 1.7 billion e-mails, phone calls and other types of communications.” The NSA and CIA together comprise the greatest intelligence-gathering effort in the world. The overall U.S. intelligence budget is now declared to be $62.8 billion.

     

    Feature Image Credit: E-International Relations

    Article Image Credit: www.twz.com

     

  • UARCs: The American Universities that Produce Warfighters

    UARCs: The American Universities that Produce Warfighters

    America’s military-industrial complex (MIC) has grown enormously powerful and is fully integrated with the Department of Defense of the US Government in furthering America’s global influence and control. Many American universities have become research centres for the MIC. Similarly, American companies run research programs in leading universities and educational institutions across the world, for example in a few IITs in India. In the article below, Dr Sylvia J. Martin explores the role of University Affiliated Research Centers (UARCs) in the U.S. military-industrial complex. UARCs are institutions embedded within universities, designed to conduct research for the Department of Defense (DoD) and other military agencies. The article highlights how UARCs blur the lines between academic research and military objectives, raising ethical questions about the use of university resources for war-related activities. These centres focus on key areas such as nanotechnology, immersive simulations, and weapons systems. For example, the University of Southern California’s Institute for Creative Technologies (ICT) was created to develop immersive training simulations for soldiers, drawing from both science and entertainment, while universities like Johns Hopkins and MIT are involved in anti-submarine warfare and soldier mobility technologies. Sylvia Martin critically examines the consequences of these relationships, particularly their impact on academic freedom and the potential prioritisation of military needs over civilian research. She flags the resistance faced by some universities, like the University of Hawai’i, where concerns about militarisation, environmental damage and indigenous rights sparked protests against their UARCs. As UARCs are substantially funded, they become a source of major influence on their universities. Universities, traditionally seen as centres for open, unbiased inquiry, may become aligned with national security objectives, further entrenching the MIC within academia.

    This article was published earlier in Monthly Review.

    TPF Editorial Team

    UARCs: The American Universities that Produce Warfighters

    Dr Sylvia J Martin

    Students throughout the United States have called for their universities to disclose and divest from defense companies with ties to Israel in its onslaught on Gaza. While scholars and journalists have traced ties between academic institutions and U.S. defense companies, it is important to point out that relations between universities and the U.S. military are not always mediated by the corporate industrial sector.1 American universities and the U.S. military are also linked directly and organizationally, as seen with what the Department of Defense (DoD) calls “University Affiliated Research Centers (UARCs).” UARCs are strategic programs that the DoD has established at fifteen different universities around the country to sponsor research and development in what the Pentagon terms “essential engineering and technology capabilities.”2 Established in 1996 by the Under Secretary of Defense for Research and Engineering, UARCs function as nonprofit research organizations at designated universities, aimed at ensuring that those capabilities are available on demand to its military agencies. While there is a long history of scientific and engineering collaboration between universities and the U.S. government dating back to the Second World War, UARCs reveal the breadth and depth of today’s military-university complex, illustrating how militarized knowledge production emerges from within the academy and without corporate involvement. UARCs demonstrate one of the less visible yet vital ways in which these students’ institutions help perpetuate the cycle of U.S.-led wars and empire-building.

    The University of Southern California (USC) has been one of the most prominent campuses for student protests against Israel’s campaign in Gaza, with students demanding that their university “fully disclose and divest its finances and endowment from companies and institutions that profit from Israeli apartheid, genocide, and occupation in Palestine, including the US Military and weapons manufacturing.”3  USC also happens to be home to one of the nation’s fifteen UARCs, the Institute for Creative Technologies (ICT), which describes itself as a “trusted advisor to the DoD.”4  ICT is not mentioned in the students’ statement, yet the institute—and UARCs at other universities—are one of the many moving parts of the U.S. war machine that are nestled within higher education institutions, and a manifestation of the Pentagon’s “mission creep” that encompasses the arts as well as the sciences.5

    Institute of Creative Technologies – military.usc.edu

    Significantly, ICT’s remit to develop dual-use technologies (which claim to provide society-wide “solutions”) entails nurturing what the Institute refers to as “warfighters” for the battlefields of the future, and, in doing so, to increase warfighters’ “lethality.”6 Established by the DoD in 1999 to pursue advanced modelling, simulation, and training, ICT’s basic and applied research produces prototypes, technologies, and know-how that have been deployed for the U.S. Army, Navy, and Marine Corps. From artificial intelligence-driven virtual humans deployed to teach military leadership skills to futuristic 3D spatial visualization and terrain capture to prepare these military agencies for their operational environments, ICT specializes in immersive training programs for “mission rehearsal,” as well as tools that contribute to the digital innovations of global warmaking.7  Technologies and programs developed at ICT were used by U.S. troops in the U.S.-led Global War on Terror. One such program is UrbanSim, a virtual training application initiated in 2006 designed to improve army commanders’ skills for conducting counterinsurgency operations in Iraq and Afghanistan, delivering fictional scenarios through a gaming experience.8  From all of the warfighter preparation that USC’s Institute researches, develops, prototypes, and deploys, ICT boasts of generating over two thousand academic peer-reviewed publications.

    I encountered ICT’s work while conducting anthropological research on the relationship between the U.S. military and the media entertainment industry in Los Angeles.9  The Institute is located not on the university’s main University Park campus but by the coast, in Playa Vista, alongside offices for Google and Hulu. Although ICT is an approximately thirty-minute drive from USC’s main campus, this hub for U.S. warfighter lethality was enabled by an interdisciplinary collaboration with what was then called the School of Cinema-Television and the Annenberg School for Communications, and it remains entrenched within USC’s academic ecosystem, designated as a unit of its Viterbi School of Engineering, which is located on the main campus.10  Given the presence and power of UARCs at U.S. universities, we can reasonably ask: What is the difference between West Point Military Academy and USC, a supposedly civilian university? The answer, it seems, is not a difference in kind, but in degree. Indeed, universities with UARCs appear to be veritable military academies.

    What Are UARCs?

    UARCs are similar to federally funded research centres such as the Rand Corporation; however, UARCs are required to be situated within a university, which can be public or private.11  The existence of UARCs is not classified information, but their goals, projects, and implications may not be fully evident to the student bodies or university communities in which they are embedded, and there are differing levels of transparency among them about their funding. DoD UARCs “receive sole source funds, on average, exceeding $6 million annually,” and may receive other funding in addition to that from their primary military or federal sponsor, which may also differ among the fifteen UARCs.12  In 2021, funding from federal sources for UARCs ranged “from as much as $831 million for the Johns Hopkins University Applied Physics Lab to $5 million for the University of Alaska Geophysical Detection of Nuclear Proliferation.”13  Individual UARCs are generally created after the DoD’s Under Secretary of Defense for Research and Engineering initiates a selection process for the proposed sponsor, and typically are reviewed by their primary sponsor every five years for renewed contracts.14  A few UARCs, such as Johns Hopkins University’s Applied Physics Lab and the University of Texas at Austin’s Applied Research Lab, originated during the Second World War for wartime purposes but were designated as UARCs in 1996, the year the DoD formalized that status.15

    UARCs are supposed to provide their sponsoring agency and, ultimately, the DoD, access to what they deem “core competencies,” such as MIT’s development of nanotechnology systems for the “mobility of the soldier in the battlespace” and the development of anti-submarine warfare and ballistic and guided missile systems at Johns Hopkins University.16  Significantly, UARCs are mandated to maintain a close and enduring relationship with their military or federal sponsor, such as that of ICT with the U.S. Army. These close relationships are intended to facilitate the UARCs’ “in-depth knowledge of the agency’s research needs…access to sensitive information, and the ability to respond quickly to emerging research areas.”17  Such an intimate partnership for institutions of higher learning with these agencies means that the line between academic and military research is (further) blurred. With the interdisciplinarity of researchers and the integration of PhD students (and even undergraduate interns) into UARC operations such as USC’s ICT, the question of whether the needs of the DoD are prioritized over those of an ostensibly civilian institute of higher learning practically becomes moot: the entanglement is naturalized by a national security logic.

    Table 1 UARCs: The American Universities that Produce Warfighters

    Primary Sponsor | University | UARC | Date of Designation (*original year established)
    Army | University of Southern California | Institute of Creative Technologies | 1999
    Army | Georgia Institute of Technology | Georgia Tech Research Institute | 1996 (*1995)
    Army | Massachusetts Institute of Technology | Institute for Soldier Nanotechnologies | 2002
    Army | University of California, Santa Barbara | Institute for Collaborative Biotechnologies | 2003
    Navy | Johns Hopkins University | Applied Physics Laboratory | 1996 (*1942)
    Navy | Pennsylvania State University | Applied Research Laboratory | 1996 (*1945)
    Navy | University of Texas at Austin | Applied Research Laboratories | 1996 (*1945)
    Navy | University of Washington | Applied Physics Laboratory | 1996 (*1943)
    Navy | University of Hawai’i | Applied Research Laboratory | 2004
    Missile Defense Agency | Utah State University | Space Dynamics Laboratory | 1996
    Office of the Under Secretary of Defense for Intelligence and Security | University of Maryland, College Park | Applied Research Laboratory for Intelligence and Security | 2017 (*2003)
    Under Secretary of Defense for Research and Engineering | Stevens Institute of Technology | Systems Engineering Research Center | 2008
    U.S. Strategic Command | University of Nebraska | National Strategic Research Institute | 2012
    Department of the Assistant Secretary of Defense (Threat Reduction and Control) | University of Alaska Fairbanks | Geophysical Detection of Nuclear Proliferation | 2018
    Air Force | Howard University | Research Institute for Tactical Autonomy | 2023
    Sources: Joan Fuller, “Strategic Outreach—University Affiliated Research Centers,” Office of the Under Secretary of Defense (Research and Engineering), June 2021, 4; C. Todd Lopez, “Howard University Will Be Lead Institution for New Research Center,” U.S. Department of Defense News, January 23, 2023.

    A Closer Look

    The UARC at USC is unique among UARCs in that, from its inception, the Institute explicitly targeted the artistic and humanities-driven resources of the university. ICT opened near the Los Angeles International Airport, in Marina del Rey, with a $45 million grant, tasked with developing a range of immersive technologies. According to the DoD, the core competencies that ICT offers include immersion, scenario generation, computer graphics, entertainment theory, and simulation technologies; these competencies were sought as the DoD decided that it needed to create more visually and narratively compelling and interactive learning environments for the gaming generation.18  USC was selected by the DoD not just because of the university’s work in science and engineering but also its close connections to the media entertainment industry, which USC fosters from its renowned School of Cinematic Arts (formerly the School of Cinema-Television), thereby providing the military access to a wide range of storytelling talents, from screenwriting to animation. ICT later moved to nearby Playa Vista, part of Silicon Beach, where the military presence also increased; by April 2016, the U.S. Army Research Lab West opened next door to ICT as another collaborative partner, further integrating the university into military work.19  This university-military partnership results in “prototypes that successfully transition into the hands of warfighters”; UARCs such as ICT are thus rendered a crucial link in what graduate student worker Isabel Kain from the Researchers Against War collective calls the “military supply chain.”20

    USC was touted as “neutral ground” from which the U.S. Army could help innovate military training by one of ICT’s founders in his account of the Institute’s origin story.21  Yet, universities abandon any pretence to neutrality once they are assigned UARCs, as opponents at the University of Hawai’i at Mānoa (UH Mānoa) asserted when a U.S. Navy-sponsored UARC was designated for their campus in 2004. UH Mānoa faculty, students, and community members repeatedly expressed their concerns about the ethics of military research conducted on their campus, including the threat of removing “researchers’ rights to refuse Navy directives.”22  The proposed UARC at UH Mānoa occurred within the context of university community resistance to U.S. imperialism and militarism, which have inflicted structural violence on Hawaiian people, land, and waters, from violent colonization to the 1967 military testing of lethal sarin gas in a forest reserve.23 Hawai’i serves as the base of the military’s U.S. Indo-Pacific Command, where “future wars are in development,” professor Kyle Kajihiro of UH Mānoa emphasizes.24

    Writing in Mānoa Now about the proposed UARC in 2005, Leo Azumbuja opined that “it seems like ideological suicide to allow the Navy to settle on campus, especially the American Navy.”25 A key player in the Indo-Pacific Command, the U.S. Navy has long had a contentious relationship with Indigenous Hawaiians, most recently with the 2021 fuel leakage from the Navy’s Red Hill fuel facility, resulting in water contamination levels that the Hawai’i State Department of Health referred to as “a humanitarian and environmental disaster.”26  Court depositions have since revealed that the Navy knew about the fuel leakage into the community’s drinking water but waited over a week to inform the public, even as people became ill, making opposition to its proposed UARC unsurprising, if not requisite.27  The detonation of bombs and sonar testing that happens at the biennial international war games that the U.S. Navy has hosted in Hawai’i since 1971 have also damaged precious marine life and culturally sacred ecosystems, with the sonar tests causing whales to “swim hundreds of miles, rapidly change their depth (sometimes leading to bleeding from the eyes and ears), and even beach themselves to get away from the sounds of sonar.”28  Within this context, one of the proposed UARC’s core competencies was “understanding of [the] ocean environment.”29

    In a flyer circulated by DMZ Hawaii, UH Mānoa organizers called for universities to serve society, and “not be used by the military to further their war aims or to perfect ways of killing or controlling people.”30  Recalling efforts in previous decades on U.S. campuses to thwart the encroachment of military research, protestors raised questions about the UARC’s accountability and transparency regarding weapons production within the UH community. UH Mānoa’s strategic plan during the time that the Navy’s UARC was proposed and executed (2002–2010) called for recognition of “our kuleana (responsibility) to honour the Indigenous people and promote social justice for Native Hawaiians” and “restoring and managing the Mānoa stream and ecosystem”—priorities that the actions of the U.S. Navy disregarded.31  The production of knowledge for naval weapons within the auspices of this public, land-grant institution disrupts any pretension to neutrality the university may purport.

    Further resistance to the UARC designation was expressed by the UH Mānoa community: from April 28 to May 4, 2005, the SaveUH/StopUARC Coalition staged a six-day campus sit-in protest, and later that year, the UH Mānoa Faculty Senate voted 31–18 in favour of asking the administration to reject the UARC designation.32  According to an official statement released by UH Mānoa on January 23, 2006, at a university community meeting with the UH Regents in 2006, testimony from opponents to the UARC outnumbered supporters, who, reflecting the neoliberal turn of universities, expressed hope that their competitiveness in science, technology, engineering, and mathematics (STEM) would advance with a UARC designation, and benefit the university’s ranking.33  Yet in 2007, writing in DMZ Hawaii, Kajihiro clarified that while the UH administration claimed that the proposed UARC would not accept any classified research for the first three years, “the base contract assigns ‘secret’ level classification to the entire facility, making the release of any information subject to the Navy’s approval,” raising concerns about academic freedom, despite the fanfare over STEM and rankings.34  However, the campus resistance campaign was unsuccessful, and in September 2007, the UH Regents approved the Navy UARC designation. By 2008, the U.S. Navy-sponsored Applied Research Laboratory UARC at UH Mānoa opened.

    “The Military Normal”

    UH Mānoa’s rationale for resistance raises the question: how could this university—indeed, any university—impose this military force onto its community? Are civilian universities within the United States merely an illusion, a deflection from education in the service of empire? What anthropologist Catherine Lutz called in 2009 the ethos of “the military normal” in U.S. culture toward its counterinsurgency wars in Iraq and Afghanistan—the commonsensical, even prosaic perspective on the inevitability of endless U.S.-led wars disseminated by U.S. institutions, especially mainstream media—helps explain the attitude toward this particular formalized capture of the university by the DoD.35  Defense funding has for decades permeated universities, but UARCs perpetuate the military normal by allowing the Pentagon to insert itself through research centres and institutes in the (seemingly morally neutral) name of innovation, within part of a broader neoliberal framework of universities as “engines” and “hubs,” or “anchor” institutions that offer to “leverage” their various forms of capital toward regional development in ways that often escape sustained scrutiny or critique.36  The normalization is achieved in some cases given that UARCs such as ICT strive to serve civilian needs as well as military ones with dual-use technologies and tools. Yet with the U.S. creation of the national security state in 1947 and its pursuit of techno-nationalism since the Cold War, UARCs are direct pipelines to the intensification of U.S. empire. Some of the higher-profile virtual military instructional programs developed at ICT at USC, such as its Emergent Leader Immersive Training Environment (ELITE) system, which provides immersive role-playing to train army leaders for various situations in the field, are funnelled to explicitly military-only learning institutions such as the Army Warrant Officer School.37

    The military normal generates a sense of moral neutrality, even moral superiority. The logic of the military normal, the offer of STEM education and training, especially through providing undergraduate internships and graduate training, and of course funding, not only rationalizes the implementation of UARCs, but ennobles it. The fifteenth and most recently created UARC, at Howard University in 2023—the first such designation for one of the historically Black colleges and universities (HBCUs)—boasts STEM inclusion.38  Partnering with the U.S. Air Force, Howard University’s UARC is receiving a five-year, $90 million contract to conduct AI research and develop tactical autonomy technology. Its Research Institute for Tactical Autonomy (RITA) leads a consortium of eight other HBCUs. As with the University of Hawai’i, STEM advantages are touted by the UARC, with RITA’s reach expanding in other ways: it plans to supplement STEM education for K–12 students to “ease their path to a career in the fields of artificial intelligence, cybersecurity, tactical autonomy, and machine learning,” noting that undergraduate and graduate students will also be able to pursue fully funded research opportunities at their UARC. With the corporatization of universities, neoliberal policies prioritize STEM for practical reasons, including the pursuit of university rankings and increases in both corporate and government funding. This fits well with increased linkages to the defence sector, which offers capital, jobs, technology, and gravitas.
    In a critique of Howard University’s central role for the DoD through its new UARC, Erica Caines at Black Agenda Report invokes the “legacies of Black resistance” at Howard University in a call to reduce “the state’s use of HBCUs.”39  In another response to Howard’s UARC, an editorial in Black Agenda Report draws upon activist Kwame Ture’s (Stokely Carmichael’s) autobiography for an illuminative discussion about his oppositional approach to the required military training and education at Howard University during his time there.40

    With their respectability and resources, universities, through UARCs, provide ideological cover for U.S. war-making and imperialistic actions, offering up student labour at undergraduate and graduate levels in service of that cover. When nearly eight hundred U.S. military bases around the world are cited as evidence of U.S. empire and the DoD requires research facilities to be embedded within places of higher learning, it is reasonable to expect that university communities—ostensibly civilian institutions—ask questions about UARC goals and operations, and how they provide material support and institutional gravitas to these military and federal agencies.41  In the case of USC, ICT’s stated goal of enhancing warfighter lethality runs counter to current USC student efforts to strive for more equitable conditions on campus and within its larger community (for example, calls to end “land grabs,” and “targeted repression and harassment of Black, Brown and Palestinian students and their allies on and off campus”) as well as other reductions in institutional harms.42  The university’s “Minor in Resistance to Genocide”—a program pursued by USC’s discarded valedictorian Asna Tabassum—also serves as mere cover, a façade, alongside USC’s innovations for warfighter lethality.

    Many students and members of U.S. society want to connect the dots, as evident from the nationwide protests and encampments, and a push from within the academy to examine the military supply chain is intensifying. In addition to Researchers Against War members calling out the militarized research that flourishes in U.S. universities, the Hopkins Justice Collective at Johns Hopkins University recently proposed a demilitarization process to its university’s Public Interest Investment Advisory Committee that cited Johns Hopkins’s UARC, Applied Physics Lab, as being the “sole source” of DoD funding for the development and testing of AI-guided drone swarms used against Palestinians in 2021.43  Meanwhile, at UH Mānoa, the struggle continues: in February 2024, the Associated Students’ Undergraduate Senate approved a resolution requesting that the university’s Board of Regents terminate UH’s UARC contract, noting that UH’s own president is the principal investigator for a $75 million High-Performance Computer Center for the U.S. Air Force Research Laboratory that was contracted by the university’s UARC, Applied Research Laboratory.44  Researchers Against War organizing, the Hopkins Justice Collective’s proposal, the undaunted UH Mānoa students, and others help pinpoint the flows of militarized knowledge—knowledge that is developed by UARCs to strengthen warfighters from within U.S. universities, through the DoD, and to different parts of the world.45

    Notes

    1. Jake Alimahomed-Wilson et al., “Boeing University: How the California State University Became Complicit in Palestinian Genocide,” Mondoweiss, May 20, 2024; Brian Osgood, “U.S. University Ties to Weapons Contractors Under Scrutiny Amid War in Gaza,” Al Jazeera, May 13, 2024.
2. “Collaborate with Us: University Affiliated Research Center,” DevCom Army Research Laboratory, arl.devcom.army.mil.
    3. USC Divest From Death Coalition, “Divest From Death USC News Release,” April 24, 2024.
    4. USC Institute for Creative Technologies, “ICT Overview Video,” YouTube, 2:52, December 12, 2023.
5. Gordon Adams and Shoon Murray, Mission Creep: The Militarization of U.S. Foreign Policy? (Washington, DC: Georgetown University Press, 2014).
    6. USC Institute for Creative Technologies, “ICT Overview Video”; USC Institute for Creative Technologies, Historical Achievements: 1999–2019 (Los Angeles: University of Southern California, May 2021), ict.usc.edu.
    7. Yuval Abraham, “‘Lavender’: The AI Machine Directing Israel’s Bombing Spree in Gaza,” +972 Magazine.
    8. “UrbanSim,” USC Institute for Creative Technologies.
    9. Sylvia J. Martin, “Imagineering Empire: How Hollywood and the U.S. National Security State ‘Operationalize Narrative,’” Media, Culture & Society 42, no. 3 (April 2020): 398–413.
    10. Paul Rosenbloom, “Writing the Original UARC Proposal,” USC Institute for Creative Technologies, March 11, 2024.
    11. Susannah V. Howieson, Christopher T. Clavin, and Elaine M. Sedenberg, “Federal Security Laboratory Governance Panels: Observations and Recommendations,” Institute for Defense Analyses—Science and Technology Policy Institute, Alexandria, Virginia, 2013, 4.
    12. OSD Studies and Federally Funded Research and Development Centers Management Office (FFRDC), Engagement Guide: Department of Defense University Affiliated Research Centers (UARCs) (Alexandria, Virginia: OSD Studies and FFRDC Management Office, April 2013), 5.
    13. Christopher V. Pece, “Federal Funding to University Affiliated Research Centers Totaled $1.5 Billion in FY 2021,” National Center for Science and Engineering Statistics, National Science Foundation, 2024, ncses.nsf.gov.
    14. “UARC Customer Funding Guide,” USC Institute for Creative Technologies, March 13, 2024.
15. “Federally Funded Research and Development Centers (FFRDC) and University Affiliated Research Centers (UARC),” Department of Defense Research and Engineering Enterprise, rt.cto.mil.
    16. OSD Studies and FFRDC Management Office, Engagement Guide.
17. Congressional Research Service, “Federally Funded Research and Development Centers (FFRDCs): Background and Issues for Congress,” April 3, 2020, 5.
    18. OSD Studies and FFRDC Management Office, Engagement Guide, 18.
19. “Institute for Creative Technologies (ICT),” USC Military and Veterans Initiatives, military.usc.edu.
    20. USC Institute for Creative Technologies, Historical Achievements: 1999–2019, 2; Linda Dayan, “‘Starve the War Machine’: Workers at UC Santa Cruz Strike in Solidarity with Pro-Palestinian Protesters,” Haaretz, May 21, 2024.
    21. Richard David Lindholm, That’s a 40 Share!: An Insider Reveals the Origins of Many Classic TV Shows and How Television Has Evolved and Really Works (Pennsauken, New Jersey: Book Baby, 2022).
    22. Leo Azambuja, “Faculty Senate Vote Opposing UARC Preserves Freedom,” Mānoa Now, November 30, 2005.
    23. Deployment Health Support Directorate, “Fact Sheet: Deseret Test Center, Red Oak, Phase I,” Office of the Assistant Secretary of the Defense (Health Affairs), health.mil.
    24. Ray Levy Uyeda, “U.S. Military Activity in Hawai’i Harms the Environment and Erodes Native Sovereignty,” Prism Reports, July 26, 2022.
    25. Azambuja, “Faculty Senate Vote Opposing UARC Preserves Freedom.”
    26. Kyle Kajihiro, “The Militarizing of Hawai’i: Occupation, Accommodation, Resistance,” in Asian Settler Colonialism, Jonathon Y. Okamura and Candace Fujikane, eds. (Honolulu: University of Hawai’i Press, 2008), 170–94; “Hearings Officer’s Proposed Decision and Order, Findings of Fact, and Conclusions of Law,” Department of Health, State of Hawaii vs. United States Department of the Navy, no. 21-UST-EA-02 (December 27, 2021).
    27. Christina Jedra, “Red Hill Depositions Reveal More Details About What the Navy Knew About Spill,” Honolulu Civil Beat, May 31, 2023.
    28. “Does Military Sonar Kill Marine Wildlife?,” Scientific American, June 10, 2009.
    29. Joan Fuller, “Strategic Outreach—University Affiliated Research Centers,” Office of the Under Secretary of Defense (Research and Engineering), June 2021, 4.
    30. DMZ Hawaii, “Save Our University, Stop UARC,” dmzhawaii.org.
    31. University of Hawai’i at Mānoa, Strategic Plan 2002–2010: Defining Our Destiny, 8–9.
    32. Craig Gima, “UH to Sign Off on Navy Center,” Star Bulletin, May 13, 2008.
    33. University of Hawai’i at Mānoa, “Advocates and Opponents of the Proposed UARC Contract Present Their Case to the UH Board of Regents,” press release, January 23, 2006.
    34. Kyle Kajihiro, “The Secret and Scandalous Origins of the UARC,” DMZ Hawaii, September 23, 2007.
    35. Catherine Lutz, “The Military Normal,” in The Counter-Counterinsurgency Manual, or Notes on Demilitarizing American Society, The Network of Concerned Anthropologists, ed. (Chicago: Prickly Paradigm Press, 2009).
    36. Anne-Laure Fayard and Martina Mendola, “The 3-Stage Process That Makes Universities Prime Innovators,” Harvard Business Review, April 19, 2024; Paul Garton, “Types of Anchor Institution Initiatives: An Overview of University Urban Development Literature,” Metropolitan Universities 32, no. 2 (2021): 85–105.
    37. Randall Hill, “ICT Origin Story: How We Built the Holodeck,” Institute for Creative Technologies, February 9, 2024.
38. Brittany Bailer, “Howard University Awarded $90 Million Contract by Air Force, DoD to Establish First-Ever University Affiliated Research Center Led by an HBCU,” The Dig, January 24, 2023, thedig.howard.edu.
    39. Erica Caines, “Black University, White Power: Howard University Covers for U.S. Imperialism,” Black Agenda Report, February 1, 2023.
    40. Editors, “Howard University: Every Black Thing and Its Opposite, Kwame Ture,” The Black Agenda Review (Black Agenda Report), February 1, 2023.
    41. David Vine, Base Nation: How U.S. Military Bases Abroad Harm America and the World (New York: Metropolitan Books, 2015).
    42. USC Divest from Death Coalition, “Divest From Death USC News Release”; “USC Renames VKC, Implements Preliminary Anti-Racism Actions,” Daily Trojan, June 11, 2020.
    43. Hopkins Justice Collective, “PIIAC Proposal,” May 4, 2024.
    44. Bronson Azama to bor.testimony@hawaii.edu, “Testimony for 2/15/24,” February 15, 2024, University of Hawai’i; “UH Awarded Maui High Performance Computer Center Contract Valued up to $75 Million,” UH Communications, May 1, 2020.
    45. Isabel Kain and Becker Sharif, “How UC Researchers Began Saying No to Military Work,” Labor Notes, May 17, 2024.

     

Feature Image: Deep Space Advanced Radar Capability (DARC) at the Johns Hopkins Applied Physics Laboratory, a UARC facility – www.jhuapl.edu

  • China’s Role in reducing the Global Carbon Footprint: The 2060 Promise and Geopolitics on the Climate Front

    China’s Role in reducing the Global Carbon Footprint: The 2060 Promise and Geopolitics on the Climate Front

    Introduction

The devastating role carbon plays in climate change cannot be overstated. The rise in global surface temperatures, air pollution, and sea levels are visible effects of a rapidly changing environment. China, the world’s second most populous country, is also the largest emitter of greenhouse gases[i]. According to the CAIT database, in 2020, China emitted 27% of the world’s total greenhouse gas emissions[ii]. Under President Xi Jinping, China has moved to position itself as an “ecological civilization”, striving to advance its role in global climate protection[iii]. China’s endeavours received acclaim when it became one of the first major countries to ratify the Paris Agreement, pledging to attain peak emissions by 2030 and net zero carbon emissions by 2060. This article aims to delineate China’s strategies and motivations for addressing carbon emissions and contrast these with the measures implemented by Western and developing countries to diminish their carbon footprint.

    China’s Image and Geopolitics in the Climate Sector

Considering China’s position on the world stage as one of the largest and fastest-growing economies, it has faced international pressure to take accountability for its contribution to climate change. China has previously argued that, as a developing country, it should not have to shoulder the same responsibility for curbing climate change as developed countries, whose emissions went “unchecked for decades”[iv]. Nonetheless, it has pledged to lead by example in the climate sector. A large part of President Xi’s campaign to amplify China’s climate ambitions may stem from a desire to appease the West while establishing leadership in the clean energy sector to better cement China’s role as a superpower. According to a New York Times article, China’s promise to contribute to climate protection could be used to soothe the international audience and to counterbalance the worldwide anger it faces over its repression of the Uyghur Muslims in Xinjiang province and its territorial conflicts in the South China Sea and Taiwan[v]. President Xi’s pledge at the UN to reach peak emissions before 2030 may have been an attempt to depict China as a pioneering nation striving for net zero carbon emissions, offering countries a powerful alternative to turn to in lieu of the United States. This holds particular significance because the USA remained mute about taking accountability for its own carbon emissions and withdrew from the Paris Agreement during Donald Trump’s presidency[vi]. It also shows China’s readiness to employ the consequences of climate change in its geopolitical agenda[vii].

    The future actions of China may significantly influence the climate policies of both developing and developed nations, potentially establishing China as a preeminent global force in climate change mitigation.

    China has endeavoured to shape its image in the climate sector. In 2015, despite being classified as a developing country, China refrained from requesting climate finance from developed countries and instead pledged $ 3.1 billion in funding to assist other developing countries in tackling climate change[viii]. As per the World Bank’s Country Climate and Development Report for China, China is poised to transform “climate action into economic opportunity.”[ix] By transitioning to a net zero carbon emissions economy, China can generate employment opportunities while safeguarding its non-renewable resources from depletion. China’s economy is also uniquely structured to seize the technological and reputational benefits of early climate action[x]. The future actions of China may significantly influence the climate policies of both developing and developed nations, potentially establishing China as a preeminent global force in climate change mitigation. Nonetheless, if China fails to fulfil its commitment to attain net zero carbon emissions by 2060, it may suffer substantial reputational damage, particularly given its current status as a pioneer in “advancing low carbon energy supply”[xi].

    Domestic Versus International Efforts in the Clean Energy Race

However, domestic and international factors could affect China’s goal to peak emissions and the deadlines it has set for itself. One global event that affected these efforts was the COVID-19 pandemic, which interrupted the rise in carbon emissions from industry and vehicles[xii]. After the pandemic, however, China’s economy saw swift growth, and in 2021, China’s carbon emissions were 4% higher than in the previous year[xiii]. Not only is China back on track to peak carbon emissions by 2030, but the International Energy Agency’s World Energy Outlook 2023 also found that “China’s fossil fuel use will peak in 2024 before entering structural decline.”[xiv]

    Although China’s industrial sector is heavily reliant on coal and fossil fuels, it also boasts the world’s largest production of electric vehicles and is a leader in manufacturing solar panels and wind turbines[xv]. In contrast, developed countries, particularly the US, which withdrew from the Paris Agreement in 2017 during the Trump presidency, appear to be making less of an effort towards environmental protection.

Developing countries, though they lack the immense scale of China’s economy and population, have also not reached China’s level of transition to clean energy. India, for instance, has pledged to be carbon neutral by 2070 and to have emissions peak by 2030. Given its rising economic growth rate, India must decrease its carbon intensity at a matching pace, yet it lags behind China in manufacturing solar panels and other renewable energy hardware. India’s central government is preparing to push energy modernization to “align with global energy transition trends.”[xvi] According to the Economic Times, particular emphasis has been laid on renewable energy sources like solar capacity and e-vehicles in the 2024–25 budget.[xvii]

    China and International Cooperation for Climate Protection

    With China producing sufficient solar capacity in 2022 to lead the rest of the world considerably and the deployment of solar power expected to rise until 2028, it is essential that the West does not make the mistake of isolating China

Given that China has emerged as the leading manufacturer of electric vehicles (EVs), it remains to be seen whether developed and developing countries will leverage China’s supply chains to combat their own climate crises. While opportunities are plentiful for Western businesses to integrate with China’s cutting-edge alternatives to traditional energy sources, the United States has adopted a hardline stance towards China[xviii]. The US has imposed 100 per cent tariffs on Chinese-made e-vehicles, and solar cells face tariffs of 50 per cent.[xix] At the same time, rivalry and competition between the two countries on the climate front may help combat the climate dilemma and ever-increasing carbon emissions by avoiding the collective action problem. However, this will depend heavily on smooth cooperation and effective communication between Chinese authorities and developed nations within the EU and the USA[xxi]. Empowering domestic groups within countries can raise awareness of climate crises. A poll conducted in China revealed that 46% of the youth considered climate change the “most serious global issue.”[xxii] According to a survey conducted by the United Nations, 80% of people worldwide say they want climate action[vii]. With China producing enough solar capacity in 2022 to lead the rest of the world by a considerable margin, and the deployment of solar power expected to rise until 2028, it is essential that the West does not make the mistake of isolating China[xxiii].

    Conclusion

    China has a significant advantage in its renewable energy sector. Western countries and other developing economies rely heavily on China’s green exports to address climate change urgently. China’s stringent measures to curb emissions from its coal-based industries and the growing output from its alternative energy sources reflect its proactive stance in becoming a global leader in addressing climate change — a position that surpasses other nations’ efforts. While it is debatable whether China’s commitment to reduce its carbon emissions was a political strategy to appease Europe, it is undeniable that tackling climate change is a pressing issue. With the public’s overwhelming support for implementing change in the climate sector, governments worldwide must prioritise their citizens’ needs and cooperate to develop policies that ensure a sustainable future for our planet.

     

    Notes:

    [i] Saurav Anand, “Solar Capacity, EVs, and Nuclear SMRs to Get Budget Boost for Energy Security – ET EnergyWorld,” ETEnergyworld.com, July 11, 2024, https://energy.economictimes.indiatimes.com/news/renewable/solar-capacity-evs-and-nuclear-smrs-to-get-budget-boost-for-energy-security/111648384?action=profile_completion&utm_source=Mailer&utm_medium=newsletter&utm_campaign=etenergy_news_2024-07-11&dt=2024-07-11&em=c2FuYS5zYXByYTIyMUBnbWFpbC5jb20.

[ii] Saurav Anand, “Solar Capacity, EVs, and Nuclear SMRs to Get Budget Boost for Energy Security – ET EnergyWorld,” ETEnergyworld.com, July 11, 2024, https://energy.economictimes.indiatimes.com/news/renewable/solar-capacity-evs-and-nuclear-smrs-to-get-budget-boost-for-energy-security/111648384?action=profile_completion&utm_source=Mailer&utm_medium=newsletter&utm_campaign=etenergy_news_2024-07-11&dt=2024-07-11&em=c2FuYS5zYXByYTIyMUBnbWFpbC5jb20.

    [iii]Shameem Prashantham and Lola Woetzel, “To Create a Greener Future, the West Can’t Ignore China,” Harvard Business Review, April 10, 2024, https://hbr.org/2024/05/to-create-a-greener-future-the-west-cant-ignore-china.

    [iv]“Fact Sheet: President Biden Takes Action to Protect American Workers and Businesses from China’s Unfair Trade Practices,” The White House, May 14, 2024, https://www.whitehouse.gov/briefing-room/statements-releases/2024/05/14/fact-sheet-president-biden-takes-action-to-protect-american-workers-and-businesses-from-chinas-unfair-trade-practices/?utm_source=dailybrief&utm_medium=email&utm_campaign=DailyBrief2024May14&utm_term=DailyNewsBrief.

    [v]Noah J. Gordon et al., “Why US-China Rivalry Can Actually Help Fight Climate Change,” Internationale Politik Quarterly, March 24, 2023, https://ip-quarterly.com/en/why-us-china-rivalry-can-actually-help-fight-climate-change.

[vi] Simon Evans and Hongqiao Liu, “The Carbon Brief Profile: China,” Carbon Brief, November 30, 2023, https://interactive.carbonbrief.org/the-carbon-brief-profile-china/.

    [vii]“Climatechange,” United Nations, accessed July 18, 2024, https://www.un.org/en/climatechange#:~:text=The%20world’s%20largest%20standalone%20public,to%20tackle%20the%20climate%20crisis.

    [viii]Martin Jacques, “China Will Reach Climate Goal While West Falls Short,” Global Times, accessed July 19, 2024, https://www.globaltimes.cn/page/202402/1306788.shtml#:~:text=There%20has%20been%20constant%20low,than%202050%20for%20carbon%20zero.

[ix] Steven Lee Myers, “China’s Pledge to Be Carbon Neutral by 2060: What It Means,” The New York Times, September 23, 2020.

[x] Simon Evans, Hongqiao Liu, et al., “The Carbon Brief Profile: China,” Carbon Brief, November 30, 2023, https://interactive.carbonbrief.org/the-carbon-brief-profile-china/.

[xi] “China | Nationally Determined Contribution (NDC),” Climate Watch, accessed July 17, 2024, https://www.climatewatchdata.org/ndcs/country/CHN?document=revised_first_ndc.

[xii] Simon Evans, Hongqiao Liu, et al., “The Carbon Brief Profile: China,” Carbon Brief, November 30, 2023, https://interactive.carbonbrief.org/the-carbon-brief-profile-china/.

[xiii] Steven Lee Myers, “China’s Pledge to Be Carbon Neutral by 2060: What It Means,” The New York Times, September 23, 2020, https://www.nytimes.com/2020/09/23/world/asia/china-climate-change.html.

    [xiv] Steven Lee Myers, “China’s Pledge to Be Carbon Neutral by 2060: What It Means,” The New York Times, September 23, 2020, https://www.nytimes.com/2020/09/23/world/asia/china-climate-change.html.

[xv] Simon Evans, Hongqiao Liu, et al., “The Carbon Brief Profile: China,” Carbon Brief, November 30, 2023, https://interactive.carbonbrief.org/the-carbon-brief-profile-china/.

    [xvi] Matt McGrath, “Climate Change: China Aims for ‘Carbon Neutrality by 2060,’” BBC News, September 22, 2020, https://www.bbc.com/news/science-environment-54256826.

[xvii] Simon Evans, Hongqiao Liu, et al., “The Carbon Brief Profile: China,” Carbon Brief, November 30, 2023, https://interactive.carbonbrief.org/the-carbon-brief-profile-china/.

    [xviii] World Bank Group, “China Country Climate and Development Report,” Open Knowledge Repository, October 2022, https://openknowledge.worldbank.org/entities/publication/ef01c04f-4417-51b6-8107-b688061a879e.

    [xix] World Bank Group, “China Country Climate and Development Report,” Open Knowledge Repository, October 2022, https://openknowledge.worldbank.org/entities/publication/ef01c04f-4417-51b6-8107-b688061a879e.

    [xx] World Bank Group, “China Country Climate and Development Report,” Open Knowledge Repository, October 2022, https://openknowledge.worldbank.org/entities/publication/ef01c04f-4417-51b6-8107-b688061a879e.

    [xxi] Steven Lee Myers, “China’s Pledge to Be Carbon Neutral by 2060: What It Means,” The New York Times, September 23, 2020.

    [xxii]  Steven Lee Myers, “China’s Pledge to Be Carbon Neutral by 2060: What It Means,” The New York Times, September 23, 2020.

[xxiii] Simon Evans, Hongqiao Liu, et al., “The Carbon Brief Profile: China,” Carbon Brief, November 30, 2023, https://interactive.carbonbrief.org/the-carbon-brief-profile-china/.

     

    Feature Image: wionews.com  China leads the charge: Beijing develops two-thirds of global wind and solar projects.

     

  • China has achieved escape velocity: it is now unstoppable

    China has achieved escape velocity: it is now unstoppable

    The 21st century is shaping up to be the Asian, Eurasian, and Chinese century.

    While the Hegemon spent at least $7 trillion – and counting – on unwinnable Forever Wars, China is spending $1 trillion in an array of Belt and Road Initiative (BRI) projects across the Global South: the emphasis is digital/transportation connectivity corridors. Geoeconomic imperatives intertwined with rising geopolitical influence.

    The four-day, twice-a-decade plenum of the Communist Party of China that took place last week in Beijing, designing an economic road map all the way to 2029, was a stunning affair in more ways than one.

Let’s start with continuity – and stability. There’s no question after the plenum that Xi Dada, or The Big Panda, will stay at the helm until 2029 – the end of the current five-year economic drive.

    And if Xi is healthy enough, he will stay until 2035: the fateful and uber-game-changing target year for China to exhibit a GDP per capita of $30,000, with massive worldwide reverberations.

    Here, we see the confluence between the progression of “socialism with Chinese characteristics” and the defining contours, if not of a Pax Sinica, at least of the non-Hegemon-centric, multi-nodal world (italics mine).

The proverbial U.S. Think Tankland/Sinophobia axis has been hysterical about China’s supposed inability to sustain 5% annual growth for the next few years – the target once again stressed at the plenum.

    The Chinese themselves have not bothered about the growth rate for a long time, since in 2018 they switched to a strategy of so-called qualitative development, that is, not at the expense of traditional industries, but on the basis of high technologies and the creation of new areas, such as the production of new energy sources and artificial intelligence.

    A Russian analysis by the Center for Geopolitical Forecasts makes a crucial point: “The Chinese themselves have not bothered about the growth rate for a long time, since in 2018 they switched to a strategy of so-called qualitative development, that is, not at the expense of traditional industries, but on the basis of high technologies and the creation of new areas, such as the production of new energy sources and artificial intelligence.”

    That’s the rationale behind Made in China 2025 – which is being implemented at breakneck speed: high-tech development leading the way towards a “high-level socialist market economy”, to be consolidated by 2025 and fully constructed by 2035.

    The next step will be to attain the status of “modernized socialist power” by 2049, at the 100th anniversary of the People’s Republic of China (PRC).

    The plenum proved once more that “socialism with Chinese characteristics” – or, for the recalcitrant, Chinese-modified capitalism – is “people-centric”. The supreme values are national interest and the people’s interests – attested by the fact that large private corporations remain under the strategic control of the CPC.

    It’s idle to try to find in the final communique at the end of the plenum any restrictions on private capital on the path to “universal prosperity”. The key point is that the role of capital should always be subordinated to the concept of “socialism with Chinese characteristics”.

    Watch the reform ship steadily sailing

    Everything is explained here in nearly didactic terms, chronicling the birth of the “Decision of the CPC Central Committee on further comprehensive deepening of reforms to promote Chinese modernization”.

    What is now already referred to colloquially all across China as “The Decision” spreads across 15 parts and 60 articles, divided into three main sections, proposing more than 300 important reforms.

    “The Decision”, in full, has not yet been published; only the road map of how Beijing planners got there. Of course, this is no mere policy paper; it’s a quintessentially CPC-style dissertation in which the details of economic and political measures are obscured by clouds of images and metaphors.

    Take a look, for instance, at this passage:

    “To ensure that the reform ship sails forward steadily, the ‘Decision’ proposes that further comprehensive deepening of reform must implement the “six principles”: adhere to the party’s overall leadership, adhere to the people-centred approach, adhere to the principle of maintaining the integrity and promoting innovation, adhere to system building as the main line, adhere to the comprehensive rule of law, and adhere to a systematic approach.”

    Most of the “Decision” – 6 parts in a total of 13 – is about economic reform. Will China pull it off? Of course, it will.

    Just look at the precedents. In 1979, the Little Helmsman Deng Xiaoping started to transform a nation of farmers and peasants into a well-oiled machine of efficient industrial workers. Along the way, GDP per capita was multiplied by no less than 30 times.

Now, the ramifications of Made in China 2025 are turning a nation of factory workers into a nation of engineers. Of 10.5 million university graduates a year, a third are engineers.

    The emphasis on AI has led, among other examples, to the automobile industry being able to produce a $9,000 EV in complete automation and make a profit. China is already a global leader in EVs (BYD building plants in Brazil, Thailand, Turkey, Hungary), solar power, drones, telecom infrastructure (Huawei, ZTE), steel, shipbuilding – and soon, also semiconductors (thank you, Trump sanctions).

    While the Hegemon spent at least $7 trillion – and counting – on unwinnable Forever Wars, China is spending $1 trillion in an array of Belt and Road Initiative (BRI) projects across the Global South: the emphasis is digital/transportation connectivity corridors. Geoeconomic imperatives intertwined with rising geopolitical influence.

Hegemon hysteria aside, the fact is the Chinese economy will grow by a whopping $1.7 trillion in 2024 alone. That is more than in all but the last three years – because of the Covid effect.

    And Beijing borrowed exactly zero yuan for this growth. The U.S. economy, by comparison, may grow by $300 billion in 2024, but Washington had to borrow $3.3 trillion for that to happen.

    Researcher Geoff Roberts has compiled a very useful list of what China is doing right.

    And when it comes to the nitty gritty, the numbers are staggering. Here are just a few, apart from GDP growth:

    • Foreign goods trade is up 6.1% to $2.9 trillion year-on-year.
    • The trade surplus is at $85 billion, up 12% compared to 2023.
    • ASEAN trade is up by 10.5% to $80 billion; China is the number one trade partner of individual ASEAN members.
    • China had a record crop of 150 million tons of cereal grains.
    • The courier sector handled 80 billion parcels, up 23% year-on-year.
    • SMIC is the world’s number two pure-play foundry after Taiwan’s TSMC.
    • China Telecom paid $265 million for 23% of QuantumCTek, the patenter of Micius, the world’s first quantum communications satellite.
    • Commercial aerospace launched 39% of China’s 26 rockets.
    • Invention patents rose 43% to 524,000. China is the first country with 4 million domestic invention patents in force.
    • Baidu’s 1,000 robotaxis in Wuhan will break even in Q4 and will be profitable next year.
    • China has 47% of the world’s top AI talent. It added no less than 2000 AI courses to school and college curricula since 2019.
    • On world-class institutions doubling as research leaders, 7 out of 10 are Chinese, including the top one: the Chinese Academy of Sciences, ahead of Harvard.

    Exceptionalist China “experts” believe their own fantasy that the U.S. allied with occupied Japan, Germany and South Korea would be able to match and surpass China’s pull with the Global Majority, because they have more resources and more capital.

    Nonsense. Even more nonsense is to believe that the Hegemon’s NATO “partners” – as in vassals – will follow the leader in creating cutting-edge technology.

    The high-speed train that matters has already left the station. The 21st century is shaping up to be the ‘Asian, Eurasian, and Chinese’ century.

     

    Feature Image Credit: The Diplomat

    The article is republished from the Strategic Culture Foundation.

  • Decoding Quantum Computing: Understanding the Basics

    Decoding Quantum Computing: Understanding the Basics

     

    Quantum computing has the potential to revolutionise the field of computing and has far-reaching implications for the future of technology. It is a complex and rapidly evolving field that requires a deep understanding of quantum mechanics and computer science.

    Quantum Computing and Moore’s Law

    Quantum computing is set to revolutionise the field of computation by leveraging the principles of quantum mechanics. While classical computing, which follows Moore’s Law, is approaching its physical limits, quantum computing offers a way to surpass these boundaries. Moore’s Law states that the number of transistors on a microchip doubles approximately every two years, leading to exponential growth in computing power. However, this trend cannot continue indefinitely due to the physical limitations of classical hardware.
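    A minimal sketch of what the doubling claim means in numbers (the starting count and time horizon below are illustrative, not figures from the text):

```python
def transistors(start_count, years, doubling_period=2):
    """Project transistor count under Moore's Law: a doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# e.g. a chip with 1 billion transistors, a decade later:
print(f"{transistors(1e9, 10):.2e}")  # prints 3.20e+10, i.e. ~32 billion
```

    Five doublings in ten years gives a 32x increase, which is exactly the exponential growth that physical limits eventually cut off.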

    Nature Simulation with Quantum Processors

    Unlike classical bits, quantum bits (qubits) can exist in multiple states simultaneously, thanks to a property known as superposition. This means that a quantum computer can process a vast number of possibilities all at once. For example, in a maze, a classical computer would explore each path one by one, while a quantum computer could explore all paths simultaneously.

    Quantum computing exploits entanglement and superposition to perform calculations at unprecedented speeds. This capability makes it particularly suited for simulating natural processes at the atomic and molecular levels, tasks that classical computers struggle with.
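    The “vast number of possibilities” has a concrete form: an n-qubit register is described by 2^n complex amplitudes. A small state-vector sketch (a NumPy simulation, not a real quantum device) of putting three qubits into an equal superposition:

```python
import numpy as np

# A classical n-bit register holds one of 2**n values at a time; an n-qubit
# register's state is a vector of 2**n complex amplitudes, all evolved together.
n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                   # start in |000>

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate on one qubit
U = H
for _ in range(n - 1):                           # H on every qubit: H (x) H (x) H
    U = np.kron(U, H)
state = U @ state

probs = np.abs(state) ** 2
print(np.round(probs, 3))                        # all 8 outcomes equally likely: 0.125 each
```

    The catch, of course, is that measurement yields only one outcome; useful quantum algorithms arrange interference so the right answers become likely.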

    Challenges in Quantum Computing

    Quantum computing, despite its promising potential, encounters notable obstacles primarily stemming from the delicate nature of qubits. Qubits, the fundamental units of quantum information, exhibit high sensitivity to external factors, rendering them susceptible to coherence loss caused by thermal noise. This susceptibility results in increased error rates during computation. Preserving qubit coherence presents a significant challenge, as even minimal disturbances can induce decoherence, disrupting quantum operations.

    In addition to superconducting qubits, other quantum computing methods also face significant challenges. For instance, trapped ion qubits are highly susceptible to environmental noise and require extremely precise laser control to maintain coherence, which is technically demanding and resource-intensive. Topological qubits, while theoretically more robust against local perturbations, are still in nascent stages of experimental realisation, and creating and manipulating these qubits remains a formidable challenge. Photonic qubits rely on maintaining precise control over individual photons, which is difficult due to losses and the need for high-fidelity detectors and sources. Quantum dot qubits face issues with variability in dot size and composition, affecting their uniformity and coherence times. Each of these methods requires sophisticated error correction techniques and significant advancements in material science and engineering to overcome their respective challenges.

    Remarkably, natural quantum processes (Quantum Biology) operate seamlessly at room temperature, a phenomenon that remains elusive in terms of being replicated effectively in artificial quantum systems.

    If these significant technical challenges can be overcome, quantum computing promises unprecedented computational power and transformative applications across various fields.

    Ultimate Applications of Quantum Computing

    Quantum computing holds the promise of facilitating groundbreaking advancements across various disciplines. Research literature underscores its potential in drug discovery, where quantum computers exhibit superior efficacy in modelling intricate molecular structures compared to classical counterparts. Similarly, in financial modelling, quantum algorithms demonstrate the capacity to optimise portfolios with unparalleled precision.

    Military Advancements

    Quantum sensing and communication technologies have the potential to significantly revolutionise military capabilities. Quantum radar systems, for instance, possess the capability to detect stealth aircraft, overcoming the limitations of conventional radar systems. Additionally, secure quantum communication could provide robust defences against cyber threats, ensuring the integrity and confidentiality of sensitive information.

    Elevating Humanity

    The applications of quantum computing have the potential to propel humanity towards a Type I civilisation on the Kardashev Scale, endowed with the capability to harness and manage energy on a planetary scale. By manipulating quantum processes, we stand poised to address pressing global challenges such as climate change and energy scarcity.

    Green Revolution and Sustainability

    Among the most auspicious applications of quantum computing is its potential to revolutionise artificial photosynthesis, thereby paving the way for sustainable energy solutions. Quantum computers are poised to streamline nitrogen capture processes, indispensable for enhancing agricultural productivity and potentially instigating a second green revolution. Such advancements hold the promise of ameliorating food security concerns and accommodating the burgeoning global population, echoing the transformative impact of the initial green revolution.

    How the Race Started

    The Inception and Influence of Peter Shor’s Algorithm

     The quest for quantum supremacy gained significant momentum with the groundbreaking work of Peter Shor, a mathematician and theoretical computer scientist. In 1994, Shor developed an algorithm that fundamentally challenged the security of classical cryptographic systems. Shor’s algorithm, designed to run on a quantum computer, efficiently factors large integers—a task that is exponentially time-consuming for classical computers. This capability poses a direct threat to widely used cryptographic schemes, such as RSA, which rely on the difficulty of factoring large numbers for security.
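    The skeleton of Shor’s algorithm is mostly ordinary number theory; the quantum computer accelerates only the order-finding step. A toy classical sketch (an illustration of that skeleton, not Shor’s quantum circuit, and practical only for tiny numbers, which is precisely the point):

```python
from math import gcd

def classical_order(a, n):
    """Brute-force the multiplicative order r of a mod n (smallest r with a**r % n == 1).
    This is the only step Shor's algorithm speeds up on a quantum computer."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n, a):
    """Try to split n using the order of a base a coprime to n."""
    assert gcd(a, n) == 1
    r = classical_order(a, n)
    if r % 2 != 0:
        return None                  # odd order: retry with another base
    y = pow(a, r // 2, n)            # a**(r/2) mod n
    if y == n - 1:
        return None                  # trivial square root: retry
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_factor(15, 7))            # prints (3, 5)
```

    The order of 7 mod 15 is 4, so gcd(7² ± 1, 15) yields the factors 3 and 5. For RSA-sized moduli, classical order finding takes exponential time, which is exactly what the quantum subroutine fixes.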

    Shor’s discovery was a pivotal moment that captured the attention of both the academic community and government agencies, particularly those concerned with national security, such as the National Security Agency (NSA). Recognizing the profound implications for encryption and data security, the NSA and other entities significantly increased their investments in quantum computing research and development.

    This breakthrough ignited international competition, with major world powers like the United States, China, and the European Union vying for dominance in the field. Each nation adopted different technological approaches in their pursuit of quantum supremacy. For example, Google and IBM focus on superconducting qubits, IonQ employs trapped ion technology, and Microsoft explores the potential of topological qubits.

    These diverse methodologies reflect the broad and multifaceted efforts to harness the unprecedented computational power promised by quantum computing.

    Race of the 21st Century

    The quest for quantum supremacy is the new frontier in technological competition, reminiscent of past races like the nuclear arms race (peaking in the 1950s) and the space race (culminating in the 1969 moon landing). However, the stakes in the quantum race are arguably higher. Estimates suggest the global quantum computing market could reach $50 billion by 2030. Achieving quantum supremacy, the ability of a quantum computer to outperform a classical computer for a specific task, is not just a scientific milestone but a potential economic and strategic game-changer.

    The country that first achieves and leverages quantum supremacy is poised to become a global leader in innovation, economic growth, and, potentially, military dominance. This potential has spurred fierce international competition, with nations like China, the United States, and the European Union investing heavily in quantum research and development.

    References

    Kaku, Michio. Quantum Supremacy: The Quest to Build the World’s Most Powerful Computer. New York: Doubleday, 2023.

    “Feeding the World with Die Rolls: Potential Applications of Quantum Computing.” Dartmouth Undergraduate Journal of Science 20, no. 1 (2017): Article 9.

    Shor’s algorithm

    Quantum computational chemistry

    Quantum computing research trends report

     

  • Artificial Intelligence vs The Indian Job Market

    Artificial Intelligence vs The Indian Job Market

    Artificial intelligence (AI) has become a ubiquitous presence in our daily lives, transforming the way we operate in the modern era. From the development of autonomous vehicles to facilitating advanced healthcare research, AI has enabled the creation of groundbreaking solutions that were once thought to be unattainable. As more investment is made in this area and more data becomes available, it is expected that AI will become even more powerful in the coming years.

    AI, often referred to as the pursuit of creating machines capable of exhibiting intelligent behaviour, has a rich history that dates back to the mid-20th century. During this time, pioneers such as Alan Turing laid the conceptual foundations for AI. The journey of AI has been marked by a series of intermittent breakthroughs, periods of disillusionment, and remarkable leaps forward. It has also been a subject of much discussion over the past decade, and this trend is expected to continue in the years to come.

    According to a report by Precedence Research, the global artificial intelligence market was valued at USD 454.12 billion in 2022 and is expected to hit around USD 2,575.16 billion by 2032, progressing with a compound annual growth rate (CAGR) of 19% from 2023 to 2032. The Asia Pacific is expected to be the fastest-growing artificial intelligence market during the forecast period, expanding at the highest CAGR of 20.3% from 2023 to 2032. The rising investments by various organisations towards adopting artificial intelligence are boosting the demand for artificial intelligence technology.[1]
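    As a quick sanity check, the market endpoints quoted above do imply the stated growth rate (the figures are taken from the paragraph; the computation is ours):

```python
# Cross-check the implied CAGR from the Precedence Research figures
start, end, years = 454.12, 2575.16, 10          # USD billions, 2022 -> 2032
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR = {cagr:.1%}")              # prints 18.9%, close to the reported 19%
```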

    Figure 1 illustrates a bar graph displaying the upward trajectory of the AI market in recent years, sourced from Precedence Research.

    The Indian government has invested heavily in developing the country’s digital infrastructure. In 2020, the Government of India increased its spending on Digital India to $477 million to boost AI, IoT, big data, cyber security, machine learning, and robotics. The artificial intelligence market is expected to witness significant growth in the BFSI (banking, financial services, and insurance) sector on account of data mining applications, as there is an increase in the adoption of artificial intelligence solutions in data analytics, fraud detection, cybersecurity, and database systems.

    Figure 2 illustrates a pie chart displaying the distribution of the Artificial Intelligence (AI) market share across various regions in 2022, sourced from Precedence Research.

    Types of AI Systems and Impact on Employment

    AI systems can be divided primarily into three types:

    Narrow AI: This is a specific form of artificial intelligence that executes dedicated tasks with intelligence. It represents the prevailing and widely accessible type of AI in today’s technological landscape.

    General AI: This represents an intelligence capable of efficiently undertaking any intellectual task akin to human capabilities. The aspiration driving the development of General AI is to create a system with human-like cognitive abilities that enables autonomous, adaptable thinking. However, as of now, the realisation of a General AI system that comprehensively emulates human cognition remains elusive.

    Super AI: This is a level of intelligence at which machines transcend human cognitive capacities, exhibit superior performance across tasks, and possess advanced cognitive properties. It represents a progression beyond General AI.

    Artificial intelligence has been incorporated into various aspects of our lives, ranging from virtual assistants on our mobile devices to advancements in customisation, cyber protection, and more. The growth of these systems is swift, and it is only a matter of time before general artificial intelligence becomes a reality.

    According to a report by PwC, the global GDP is estimated to be 14% higher in 2030 due to the accelerating development and utilisation of AI, which translates to an additional $15.7 trillion. This growth can be attributed to:

    1. Improvements in productivity resulting from the automation of business processes (including the use of robots and autonomous vehicles).
    2. Productivity gains from businesses integrating AI technologies into their workforce (assisted and augmented intelligence).
    3. Increased consumer demand for AI-enhanced products and services, resulting in personalised and/or higher-quality offerings.

    The report suggests that the most significant economic benefits from AI will likely come from increased productivity in the near future. This includes automating mundane tasks, enhancing employees’ capabilities, and allowing them to focus on more stimulating and value-added work. Capital-intensive sectors such as manufacturing and transport are likely to experience the most significant productivity gains from AI, given that many operational processes in these industries are highly susceptible to automation.[2]
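    As a rough cross-check of the PwC headline numbers, a 14% uplift worth $15.7 trillion pins down the implied baseline 2030 global GDP (this back-of-envelope figure is an inference, not a number from the report):

```python
# If AI makes 2030 GDP 14% higher and that uplift equals $15.7 trillion,
# the implied no-AI baseline is 15.7 / 0.14 (a cross-check, not a report figure)
ai_uplift = 15.7              # additional output attributed to AI, USD trillions
uplift_share = 0.14           # stated share by which 2030 GDP is higher
baseline_gdp = ai_uplift / uplift_share
print(round(baseline_gdp, 1))  # prints 112.1 (trillion USD)
```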

    AI will disrupt many sectors and lead to the creation of many more. A compelling aspect to observe is how the Indian Job Market responds to AI and its looming threat to job security in the future.

    The Indian Job Market

    As of 2021, around 487.9 million people were part of the workforce in India, out of 950.2 million people aged 15-64, the second largest after China. In China, 747.9 million of the 986.5 million people aged 15-64 were part of the workforce.

    India’s labour force participation rate (LFPR) at 51.3 per cent was less than China’s 76 per cent and way below the global average of 65 per cent.[3]
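    The participation rates quoted above follow directly from the workforce and population figures (a back-of-envelope check):

```python
# LFPR = labour force / working-age population (millions, from the 2021 figures)
india_lfpr = 487.9 / 950.2     # matches the quoted 51.3%
china_lfpr = 747.9 / 986.5     # the text rounds this to 76%
print(f"India {india_lfpr:.1%} vs China {china_lfpr:.1%}")  # prints India 51.3% vs China 75.8%
```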

    The low LFPR can be primarily attributed to two reasons:

    Lack of Jobs

    To reach its growth potential, India needs to generate approximately 9 million nonfarm jobs annually until 2030, as per a report by McKinsey & Company. However, analysts suggest that the current rate of job creation falls significantly below this target, with only about 2.9 million nonfarm jobs added each year from 2013 to 2019.[4]

    During the COVID-19 pandemic, urban unemployment in India surged dramatically, peaking at 20.9% in the April-June 2020 quarter, coinciding with wage decline. Although the unemployment rate has decreased since then, full-time employment opportunities are scarce. Economists highlight a concerning trend where an increasing number of job-seekers, particularly the younger demographic, are turning towards low-paying casual jobs or opting for less stable self-employment options.[5]

     This shift in employment pattern occurs alongside a broader outlook for the Indian economy, which is projected to achieve an impressive growth rate of 6.5% by the fiscal year ending in March 2025. Despite this optimistic growth forecast, the employment landscape appears to be evolving, leading individuals towards less secure and lower-paying work options. This shift raises pertinent concerns about the job market’s quality, stability, and inclusivity, particularly in accommodating the aspirations and needs of India’s burgeoning young workforce.

    Low female labour participation

    In 2021, China boasted an estimated female population of 478.3 million within the 15-64 age bracket, with an active female labour force of approximately 338.6 million. In stark contrast, despite India having a similar demographic size of 458.2 million women in that age group, its female labour force was significantly smaller, numbering only 112.8 million.[6]

    This discrepancy underscores a notable disparity in India’s female labour force participation rate compared to China, despite both countries having sizeable female populations within the working-age bracket.[7]
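    The same arithmetic applied to the female workforce figures makes the disparity explicit (the rates are computed here, not quoted in the sources):

```python
# Female LFPR implied by the 2021 figures above (millions, women aged 15-64)
india_female_lfpr = 112.8 / 458.2    # India's implied female participation
china_female_lfpr = 338.6 / 478.3    # China's implied female participation
print(f"India {india_female_lfpr:.1%} vs China {china_female_lfpr:.1%}")  # prints India 24.6% vs China 70.8%
```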

    Along with unemployment, there was also a crisis of under-employment and the collapse of small businesses, which has worsened since the pandemic.

    AI vs the Indian Job Market

    The presence and implications of AI cast a significant shadow on a country as vast and diverse as India. Amidst the dynamic and often unpredictable labour market, where employment prospects have been uncertain, addressing the impact of AI poses a considerable challenge for employers. Balancing the challenges and opportunities presented by AI while prioritising job security for the workforce is a critical obstacle to overcome.

     The diverse facets of artificial intelligence (AI) and its capacity to transform industries across the board amplify the intricacy of the employment landscape in India. Employers confront the formidable challenge of devising effective strategies to incorporate AI technologies without compromising the livelihoods of their employees.

    As per the findings of the Randstad Work Monitor Survey, a staggering 71% of individuals in India exhibit an inclination towards altering their professional circumstances within the next six months, either by transitioning to a new position within the same organisation or by seeking employment outside it. Furthermore, 23% of the workforce can be classified as passive job seekers, who are neither actively seeking new opportunities nor applying for them but remain open to considering job prospects if a suitable offer arises.

    It also stated that at least half of Indian employees fear losing their jobs to AI, whereas the figure is one in three in developed countries. The growing concern among Indian workers stems from the substantial workforce employed in Business Process Outsourcing (BPO) and Knowledge Process Outsourcing (KPO), which are notably vulnerable to AI automation. Adding to this concern is India’s rapid uptake of AI technology, further accentuating the apprehension among employees.[8]

    India’s role as a global hub for outsourcing and its proficiency in delivering diverse services have amplified the impact of AI adoption. The country has witnessed a swift embrace of AI technologies across various industries, magnifying workers’ concerns regarding the potential ramifications of their job security.

    Goldman Sachs’ report highlights the burgeoning emergence of generative artificial intelligence (AI) and its potential implications for labour dynamics. The rapid evolution of this technology prompts questions regarding a possible surge in task automation, leading to cost savings in labour and amplified productivity. [9]

    The labour market could confront significant disruptions if generative AI delivers its pledged capabilities. Analysing occupational tasks across the US and Europe revealed that approximately two-thirds of the current jobs are susceptible to AI automation. Furthermore, the potential of generative AI to substitute up to one-fourth of existing work further underscores its transformative potential.

     Expanding these estimates on a global scale suggests that generative AI might expose the equivalent of 300 million full-time jobs to automation, signifying the far-reaching impact this technology could have on global labour markets.

    Recent advancements in artificial intelligence (AI) and machine learning have exerted substantial influence across various professions and industries, particularly impacting job landscapes in sectors such as Indian IT, ITeS, BPO, and BPM. These sectors collectively employ over five million people and are India’s primary source of white-collar jobs. [10]

    In a recent conversation with Business Today, Vardhman Jain, the founder and Vice Chairman of Access Healthcare, a Chennai-based BPO, highlighted the forthcoming impact of AI integration on the workplace. Jain indicated that AI implementation may cause customer service to be the sector most vulnerable to initial disruptions.

    Jain pointed out that a substantial portion of services provided by the Indian BPO industry is focused on customer support, including voice and chat functions, data entry, and back-office services. He expounded upon how AI technologies, such as Natural Language Processing, Machine Learning, and Robotic Process Automation, possess the potential to significantly disrupt and automate these tasks within the industry.

    While the discourse surrounding AI often centres on the potential for job displacement, several industry leaders argue that AI will not supplant human labour, but rather augment worker output and productivity.

    At the 67th Foundation Day celebration of the All India Management Association (AIMA), N. R. Narayana Murthy, as reported by Business Today, conveyed a noteworthy message by asserting that AI is unlikely to supplant human beings, as humans will not allow it to happen.

    Quoting Murthy’s statement from the report, “I think there is a mistaken belief that artificial intelligence will replace human beings; human beings will not allow artificial intelligence to replace them.” The Infosys founder stressed that AI has functioned as an assistive force rather than an outright replacement, enhancing human lives and making them more comfortable.[11]

    McKinsey Global Institute’s study, “Generative AI and the Future of Work in America,” highlighted AI’s capability to expedite economic automation significantly. The report emphasised that while generative AI wouldn’t immediately eliminate numerous jobs, it would enhance the working methods of STEM, creative, business, and legal professionals.[12]

     However, the report also underscored that the most pronounced impact of automation would likely affect job sectors such as office support, customer service, and food service employment.

    While the looming threats posed by AI are undeniable, its evolution is expected to usher in a wave of innovation, leading to the birth of new industries and many job opportunities. This surge in new industries promises employment prospects and contributes significantly to economic growth by leveraging AI capabilities.

    Changing Employment Landscape

    Having explored different perspectives and conversations on AI, it has become increasingly evident that the employment landscape is poised for significant transformation in the years ahead. This prompts a crucial enquiry: will there remain a necessity for human jobs, and are our existing systems equipped to ensure equitable distribution of the benefits fostered by these technological developments?

    • Universal Basic Income

    Universal basic income (UBI) is a social welfare proposal in which all citizens of a given population regularly receive a minimum income in the form of an unconditional transfer payment, that is, without a means test or a requirement to work; a comparable payment that is conditional on such tests is instead called a guaranteed minimum income.

    Supporters of Universal Basic Income (UBI) now perceive it not only as a solution to poverty, but also as a potential answer to several significant challenges confronting contemporary workers: wage disparities, uncertainties in job stability, and the looming spectre of job losses due to advancements in AI.

    Karl Widerquist, a professor of philosophy at Georgetown University-Qatar and an economist and political theorist, posits that the influence of AI on employment does not necessarily result in permanent unemployment. Instead, he suggests a scenario in which displaced workers shift into lower-income occupations, leading to increased competition and saturation in these sectors.

    According to Widerquist, the initial effects of AI advancements might force white-collar workers into the gig economy or other precarious and low-paying employment. This shift, he fears, could trigger a downward spiral in wages and job security, exacerbating economic inequality.

     He advocates for a Universal Basic Income (UBI) policy as a response to the challenges posed by AI and automation. Widerquist argues that such a policy would address employers’ failure to equitably distribute the benefits of economic growth, fuelled in part by automation, among workers. He sees UBI as a potential solution to counter the widening disparity in wealth distribution resulting from these technological advancements.[13]

    A study conducted by researchers at Utrecht University, Netherlands, from 2017 to 2019 led to the implementation of basic income for unemployed individuals who previously received social assistance. The findings showcase an uptick in labour market engagement. This increase wasn’t solely attributed to the financial support offered by Universal Basic Income (UBI) but also to removing conditions—alongside sanctions for non-compliance—typically imposed on job seekers.[14]

    Specifically, participants exempted from the obligation to actively seek or accept employment demonstrated a higher likelihood of securing permanent contracts, as opposed to the precarious work arrangements highlighted by Widerquist.

     While UBI experiments generally do not demonstrate a significant trend of workers completely exiting the labour market, instances of higher payments have resulted in some individuals reducing their working hours. This nuanced impact showcases the varying effects of UBI on labour participation, highlighting both increased job security for some and a choice for others to adjust their work hours due to enhanced financial stability.

    In exploring the potential for Universal Basic Income (UBI), it becomes evident that while the concept holds promise, its implementation and efficacy are subject to multifaceted considerations. The diverse socioeconomic landscape, coupled with the scale and complexity of India’s population, presents both opportunities and challenges for UBI.

     UBI’s potential to alleviate poverty, enhance social welfare, and address economic disparities in a country as vast and diverse as India is compelling. However, the feasibility of funding such a program, ensuring its equitable distribution, and navigating its impact on existing welfare schemes requires careful deliberation.

    Possible Tax Solutions

    • Robot Tax

    The essence of a robot tax lies in the notion that companies integrating robots into their operations should bear a tax burden given that these machines replace human labour.

    There exist various arguments advocating for a robot tax. First, it aims to safeguard human employment by dissuading firms from substituting robots for human workers. Additionally, while companies may prefer automation, imposing a robot tax can generate government revenue to offset the decline in funds from payroll and income taxes. Another crucial argument favouring this tax is rooted in allocative efficiency: robots contribute to neither payroll nor income taxes. Taxing robots at a rate similar to human labour aligns with economic efficiency and prevents distortions in resource allocation.

    In various developed economies, such as the United States, the prevailing taxation system is biased toward artificial intelligence (AI) and automation over the human workforce. This inclination, fuelled by tax incentives, may lead to investments in automation solely for tax benefits rather than for any actual increase in profitability. Furthermore, the failure to tax robots can exacerbate income inequality as the share of labour in national income diminishes.

    One possible solution to address this issue is the implementation of a robot tax, which could generate revenue that could be redistributed as Universal Basic Income (UBI) or as support for workers who have lost their jobs due to the adoption of robotic systems and AI and are unable to find new employment opportunities.
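    One simple way to operationalise the allocative-efficiency argument is to levy on each robot roughly the payroll and income tax the displaced job would have generated. A hypothetical sketch (the function, rates, and wage below are illustrative assumptions, not a proposal from the text):

```python
def robot_tax(replaced_annual_wage, payroll_rate, income_rate):
    """Hypothetical levy matching the payroll + income tax the displaced job generated,
    so automation decisions are not driven by the tax wedge alone."""
    return replaced_annual_wage * (payroll_rate + income_rate)

# illustrative numbers only: a job paying 600,000 with 12% payroll and 10% income tax
print(robot_tax(600_000, 0.12, 0.10))   # about 132,000 per year
```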

    • Digital Tax

    The discourse surrounding digital taxation primarily centres on two key aspects. Firstly, it grapples with the challenge of maintaining tax equity between traditional and digital enterprises. Digital businesses have benefited from favourable tax structures, such as advantageous tax treatment for income derived from intellectual property, accelerated amortisation of intangible assets, and tax incentives for research and development. However, there is a growing concern that these preferences may result in unintended tax advantages for digital businesses, potentially distorting investment trajectories instead of promoting innovation.

    Secondly, the issue arises from digital companies operating in countries with no physical presence yet serving customers through remote sales and service platforms. This situation presents a dilemma regarding traditional corporate income tax regulations. Historically, digital businesses paid corporate taxes solely in countries where they maintained permanent establishments, such as headquarters, factories, or storefronts. Consequently, countries where sales occur or online users reside have no jurisdiction over a firm’s income, leading to taxation challenges.

    Several approaches have been suggested to address the taxation of digital profits. One approach involves expanding existing frameworks: for instance, a country may extend its Value-Added Tax (VAT) or Goods and Services Tax (GST) to encompass digital services, or broaden the tax base to include revenues generated from digital goods and services. Alternatively, a country may implement a separate Digital Services Tax (DST).
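    The two approaches can be sketched side by side; the rates below are purely illustrative assumptions (actual VAT/GST and DST rates vary by jurisdiction):

```python
def vat_on_digital(service_revenue, vat_rate):
    """Approach 1: extend an existing VAT/GST base to digital services."""
    return service_revenue * vat_rate

def dst(in_country_digital_revenue, dst_rate):
    """Approach 2: a separate Digital Services Tax on in-scope local revenue."""
    return in_country_digital_revenue * dst_rate

# illustrative rates only, applied to 1,000,000 of digital revenue
print(round(vat_on_digital(1_000_000, 0.18)), round(dst(1_000_000, 0.03)))  # prints 180000 30000
```

    The key structural difference: a VAT/GST extension taxes consumption through the existing system, while a DST taxes the firm's gross local revenue regardless of where the firm is established.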

    While pinpointing the ultimate solution remains elusive, ongoing experimentation and iterative processes are expected to guide us toward a resolution that aligns with the need for a larger consensus. With each experiment and accumulated knowledge, we move closer to uncovering an approach that best serves the collective requirements.[15]

    Reimagining the Future

    The rise of Artificial Intelligence (AI) stands as a transformative force reshaping the industry and business landscape. As AI continues to revolutionise how we work and interact, staying ahead in this rapidly evolving landscape is not just an option, but a necessity. Embracing AI is not merely about adapting to change; it is also about proactive readiness and strategic positioning. Whether you’re a seasoned entrepreneur or a burgeoning startup, preparing for the AI revolution involves a multifaceted approach encompassing automation, meticulous research, strategic investment, and a keen understanding of how AI can augment and revolutionise your business. PwC’s report lists some crucial steps to prepare one’s business for the future and stay ahead. [16]

    Understand AI’s Impact: Start by evaluating the industry’s technological advancements and competitive pressures. Identify the operational challenges AI can address, the disruptive opportunities available now, and those on the horizon.

    Prioritise Your Approach: Determine how AI aligns with business goals. Assess your readiness for change: are you an early adopter or a follower? Consider feasibility, data availability, and barriers to innovation. Prioritise automation and decision-augmentation processes based on potential savings and data utilisation.

    Talent, Culture, and Technology: While AI investments might seem high, costs are expected to decrease over time. Embrace a data-driven culture and invest in talent like data scientists and tech specialists. Prepare for a hybrid workforce, combining AI’s capabilities with human skills like creativity and emotional intelligence.

    Establish Governance and Trust: Trust and transparency are paramount. Consider the societal and ethical implications of AI. Build stakeholder trust by ensuring AI transparency and unbiased decision-making. Manage data sources rigorously to prevent biases and integrate AI management with overall technology transformation.

    Getting ready for Artificial Intelligence (AI) is not just about adopting new technology; it is about intelligent strategy. Understanding how AI fits one’s goals is crucial; prioritising where it can help, building the right skills, and setting clear rules are essential. As AI becomes more common, it is not about robots taking over, but about humans and AI working together. By planning and embracing AI wisely, businesses can stay ahead and create innovative solutions in the future.

    References:

    [1] Precedence Research. “Artificial Intelligence (AI) Market.” October 2023. Accessed November 14, 2023. https://www.precedenceresearch.com/artificial-intelligence-market

    [2] PricewaterhouseCoopers (PwC). “Sizing the prize: PwC’s Global Artificial Intelligence Study.” October 2017. Accessed November 14, 2023. https://www.pwc.com/gx/en/issues/data-and-analytics/publications/artificial-intelligence-study.html#:~:text=The%20greatest%20economic%20gains%20from,of%20the%20global%20economic%20impact.

    [3] World Bank. “Labor force, total – India 2021.” Accessed November 12, 2023. https://data.worldbank.org/indicator/SL.TLF.TOTL.IN?locations=IN

    [4] McKinsey & Company. “India’s Turning Point.” August 2020. https://www.mckinsey.com/~/media/McKinsey/Featured%20Insights/India/Indias%20turning%20point%20An%20economic%20agenda%20to%20spur%20growth%20and%20jobs/MGI-Indias-turning-point-Executive-summary-August-2020-vFinal.pdf

    [5] Dugal, Ira. “Where are the jobs? India’s world-beating growth falls short.” Reuters, May 31, 2023. Accessed November 14, 2023. https://www.reuters.com/world/india/despite-world-beating-growth-indias-lack-jobs-threatens-its-young-2023-05-30/

    [6] Government of India. Ministry of Labour and Employment. “Labour and Employment Statistics 2022.” July 2022. https://dge.gov.in/dge/sites/default/files/2022-08/Labour_and_Employment_Statistics_2022_2com.pdf

    [7] Deshpande, Ashwini, and Akshi Chawla. “It Will Take Another 27 Years for India to Have a Bigger Labour Force Than China’s.” The Wire, July 27, 2023. https://thewire.in/labour/india-china-population-labour-force

    [8] Randstad. “Workmonitor Pulse Survey.” Q3 2023. https://www.randstad.com/workforce-insights/future-work/ai-threatening-jobs-most-workers-say-technology-an-accelerant-for-career-growth/

    [9] Briggs, Joseph, and Devesh Kodnani. “The Potentially Large Effects of Artificial Intelligence on Economic Growth.” Goldman Sachs, March 26, 2023. https://www.key4biz.it/wp-content/uploads/2023/03/Global-Economics-Analyst_-The-Potentially-Large-Effects-of-Artificial-Intelligence-on-Economic-Growth-Briggs_Kodnani.pdf

    [10] Chaturvedi, Aakanksha. “‘Might take toll on low-skilled staff’: How AI can cost BPO, IT employees their jobs.” Business Today, April 5, 2023. https://www.businesstoday.in/latest/corporate/story/might-take-toll-on-low-skilled-staff-how-ai-can-cost-bpo-it-employees-their-jobs-376172-2023-04-05

    [11] Sharma, Divyanshi. “Can AI take over human jobs? This is what Infosys founder NR Narayan Murthy thinks.” India Today, February 27, 2023. https://www.indiatoday.in/technology/news/story/can-ai-take-over-human-jobs-this-is-what-infosys-founder-nr-narayan-murthy-thinks-2340299-2023-02-27

    [12] McKinsey Global Institute. “Generative AI and the future of work in America.” July 26, 2023. https://www.mckinsey.com/mgi/our-research/generative-ai-and-the-future-of-work-in-america

    [13] Kelly, Philippa. “AI is coming for our jobs! Could universal basic income be the solution?” The Guardian, November 16, 2022. https://www.theguardian.com/global-development/2023/nov/16/ai-is-coming-for-our-jobs-could-universal-basic-income-be-the-solution

    [14] Utrecht University. “What works (Weten wat werkt).” March 2020. https://www.uu.nl/en/publication/final-report-what-works-weten-wat-werkt

    [15] Merola, Rossana. “Inclusive Growth in the Era of Automation and AI: How Can Taxation Help?” Frontiers in Artificial Intelligence 5 (2022). Accessed November 23, 2023. https://www.frontiersin.org/articles/10.3389/frai.2022.867832

    [16] Rao, Anand. “A Strategist’s Guide to Artificial Intelligence.” PwC, May 10, 2017. https://www.strategy-business.com/article/A-Strategists-Guide-to-Artificial-Intelligence

     

  • Our Nearest Neighbours

    Our Nearest Neighbours

    In anticipation of a holiday gift, I kept asking members of my research team every week whether they noticed any anomalous object among the nearly hundred thousand objects imaged by the Galileo Project Observatory at Harvard University over the past couple of months. The reason is simple.

    Finding a package from a neighbour among familiar rocks in our backyard is an exciting event. So is the discovery of a technological object near Earth that was sent from an exoplanet. It raises the question: which exoplanet? As a follow-up on such a finding, we could search for signals coming from any potential senders, starting from the nearest houses on our cosmic street.

    The Summer Triangle consists of three of the brightest stars in the sky: Vega, Deneb, and Altair. It is high overhead throughout the summer, and it sinks lower in the west as fall progresses. For this star hop, start from brilliant blue-white Vega (magnitude 0), the brightest of the three.
    From Vega, look about 15 degrees west for the distinctive 4-sided figure in the centre of Hercules known as the keystone. On the north side of the keystone, imagine a triangle pointing to the north, with the tip of the triangle slightly shifted toward Vega (as shown in the chart below). This is the location of M92.

    The opportunity for a two-way communication with another civilization during our lifetime is limited to a distance of about thirty light years. How many exoplanets reside in the habitable zone of their host star? This zone corresponds to a separation where liquid water could exist on the surface of an Earth-mass rock with an atmosphere. Also known as the Goldilocks’ zone, this is the separation where the temperature is just right, not too cold for liquid water to solidify into ice, and not too hot for liquid water to vaporize.
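This “just right” separation scales with how bright the star is. As a rough illustration (not from the article), stellar flux falls off with the square of distance, so a first-order habitable-zone distance is d ≈ √(L/L☉) astronomical units; the ~0.17% solar luminosity used below for a Proxima-like red dwarf is an approximate outside figure:

```python
import math

def habitable_zone_au(luminosity_solar: float) -> float:
    """First-order habitable-zone distance (in AU) for a star of the given
    luminosity (in solar units): the separation at which a planet receives
    roughly the same stellar flux that Earth receives from the Sun."""
    return math.sqrt(luminosity_solar)

print(habitable_zone_au(1.0))      # Sun: 1.0 AU, by construction
print(habitable_zone_au(0.0017))   # faint red dwarf: ~0.04 AU
```

A planet hugging a faint star that closely completes an orbit in days or weeks, which is why habitable planets around red dwarfs have such short years.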

    So far, we know of a dozen habitable exoplanets within thirty light years (abbreviated hereafter as ‘ly’) of Earth. The nearest among them is Proxima Centauri b, at a distance of 4.25 ly. Farther away are Ross 128b at 11 ly; GJ 1061c and d at 11.98 ly; Luyten’s Star b at 12.25 ly; Teegarden’s Star b and c at 12.5 ly; Wolf 1061c at 14 ly; GJ 1002b and c at 15.8 ly; Gliese 229Ac at 18.8 ly; and planet c of Gliese 667 C at 23.6 ly. These confirmed planets have orbital periods that range between a week and a month, much shorter than a year, because their host stars are fainter than the Sun. This list must be incomplete: two-thirds of the count lies within 15 ly, whereas the volume out to 30 ly is 8 times bigger. Given that the nearest habitable Earth-mass exoplanet is at 4.25 ly, there should be of order four hundred similar planets within 30 ly. We are aware of only a few per cent of them.
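The completeness argument here is a simple volume scaling and can be checked directly; a minimal sketch using only the distances quoted in the text (4.25 ly, 15 ly, 30 ly):

```python
def expected_count(nearest_ly: float, radius_ly: float) -> float:
    """Scale a density of one planet per sphere of radius `nearest_ly`
    up to a sphere of radius `radius_ly` (volume grows as r cubed)."""
    return (radius_ly / nearest_ly) ** 3

# The sphere out to 30 ly holds 8x the volume of the sphere out to 15 ly,
# so a census with two-thirds of its entries inside 15 ly is incomplete.
print(expected_count(15, 30))           # 8.0

# One habitable planet within 4.25 ly implies ~350 within 30 ly --
# "of order four hundred", of which the known dozen are a few per cent.
print(round(expected_count(4.25, 30)))  # 352
```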

    But even if we identified all the nearby candidate planets for a two-way conversation, they would constitute a tiny fraction of the tens of billions of habitable planets within the Milky Way galaxy. If any of the nearby candidates hosted a communicating civilization, that would statistically imply an unreasonably large galactic population of transmitting civilizations for SETI surveys.

    Most likely, any visiting probe we encounter had originated tens of thousands of light-years away. In that case, we will not be able to converse with the senders during our lifetime. Instead, we will need to infer their qualities from their probes, similarly to the prisoners in Plato’s Allegory of the Cave, who attempt to infer the nature of objects behind them based on the shadows they cast on the cave walls.

    It is better not to imagine your neighbours before meeting them, because they might be very different from what you anticipate. My colleague Ed Turner, from Princeton University, used to say that the more time he spends in Japan, the less he understands Japanese culture. According to Ed, visiting Japan is the closest he has ever come to meeting extraterrestrials. My view is that an actual encounter with aliens or their products would be far stranger than anything we find on Earth.

    Personally, I am inspired by the stars because they might be home to neighbours from whom we can learn. The stars in the sky look like festive lights on a Christmas tree which lasts billions of years. A few days ago, a woman coordinated dinner with me as a holiday gift to her husband, who follows my work. At the end of dinner, they gave me a large collection of exceptional Japanese chocolates, which I will explore soon. In return, I autographed my two recent books on extraterrestrials for their kids with the hope that they would inherit my fascination with the stars.

    Here’s hoping that our children will have the opportunity to correspond with the senders of an anomalous object near Earth. During this holiday season, I wish for a Messianic age of peace and prosperity for all earthlings as a result of the encounter with this gift.

     

    Feature Image Credit: Messier 92 is one of two beautiful globular clusters in Hercules, the other being the famous M13. Although M92 is not quite as large and bright as M13, it is still an excellent sight in a medium to large telescope, and it should not be overlooked. The cluster is about 27,000 light years away and contains several hundred thousand stars. www.skyledge.net

    Other Two Pictures in Text: www.skyledge.net

    This article was published earlier in medium.com

  • Is Singularity here?

    Is Singularity here?

    One of the most influential figures in the field of AI, Ray Kurzweil, has famously predicted that the singularity will happen by 2045. Kurzweil’s prediction is based on his observation of exponential growth in technological advancements and the concept of “technological singularity” proposed by mathematician Vernor Vinge.

    The term Singularity alludes to the moment at which artificial intelligence (AI) becomes indistinguishable from human intelligence. Ray Kurzweil, one of AI’s fathers and top apologists, predicted in 1999 that Singularity was approaching (Kurzweil, 2005). In 2011, he even provided a date for that momentous occasion: 2045 (Grossman, 2011). However, in a book in progress, initially expected in 2022 and then in 2024, he announces the arrival of Singularity for a much closer date: 2029 (Kurzweil, 2024). Last June, though, a report in The New York Times argued that Silicon Valley was confronting the idea that Singularity had already arrived (Streitfeld, 2023). Shortly after that report, in September 2023, OpenAI announced that ChatGPT could now “see, hear and speak”. That implied that generative artificial intelligence, meaning algorithms that can be used to create content, was speeding up.

    Is, then, the most decisive moment in the history of humankind materializing before our eyes? It is difficult to tell, as Singularity will not be the big, noticeable event that Kurzweil’s precise dates suggest. It will not be a discovery-of-America kind of thing. On the contrary, as Kevin Kelly argues, AI’s very ubiquity allows its advances to stay hidden. Silently, however, its incorporation into a network of billions of users, its absorption of unlimited amounts of information, and its ability to teach itself will make it grow by leaps and bounds. And suddenly, it will have arrived (Kelly, 2017).

    The grain of wheat and the chessboard

    What really matters, though, is the gigantic gap that will begin to open after its arrival. Locked in its biological prison, human intelligence will remain static at the point it has reached, while AI keeps advancing at exponential speed. The human brain has a limited memory capacity and a slow information-processing speed of about 10 Hz (Cordeiro, 2017). AI, for its part, will continue to double its capacity over short periods of time. This is reminiscent of the symbolic tale of the grain of wheat and the chessboard, which originates in India. According to the story, if we place one grain of wheat on the first square of the chessboard, two on the second, four on the third, and keep doubling until square 64, the total number of grains on the board would exceed 18 quintillion (IntoMath). The same will happen with the advance of AI.
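The tale’s arithmetic is easy to verify: one grain doubled across 64 squares sums to 2⁶⁴ − 1 grains, roughly 1.8 × 10¹⁹ (eighteen quintillion):

```python
# One grain on the first square, doubling on each subsequent square.
total = sum(2 ** square for square in range(64))
print(total)  # 18446744073709551615, i.e. 2**64 - 1
```

Each early doubling is unremarkable on its own, which is the essay’s point: the growth only looks explosive once it is already overwhelming.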

    The initial doublings, of course, will not be all that impressive. Two to four or four to eight won’t say much. However, according to Ray Kurzweil, the moment of transcendence would come 15 years after Singularity itself, when the explosion of non-human intelligence should have become overwhelming (Kurzweil, 2005). But that will be only the very beginning. Centuries of progress would be able to materialize in years or even months. At the same time, though, centuries of regression in the relevance of the human race could also occur in years or even months.

    Humans equaling chickens

    As Yuval Noah Harari points out, the two great attributes that separate Homo sapiens from other animal species are intelligence and the flow of consciousness. While the first has allowed humans to become the owners of the planet, the second gives meaning to human life. The latter translates into a complex interweaving of memories, experiences, sensations, sensitivities, and aspirations: the vital expressions of a sophisticated mind. According to Harari, though, human intelligence will be utterly negligible compared to the levels AI will reach, and the flow of consciousness will be of capital irrelevance in the face of algorithms’ ability to penetrate the confines of the universe. Indeed, in his terms, human beings will be to AI what chickens are to human beings (Harari, 2016).

    Periodically, humanity goes through transitional phases of immense historical significance that shake everything in their path. During these, values, beliefs and certainties are eroded to their foundations and replaced by new ones. All great civilizations have had their own experiences in this regard. In the case of the Western world, there have been three significant periods of this kind in the last six hundred years: the Renaissance of the 15th and 16th centuries, the Enlightenment of the 18th, and Modernism, which began at the end of the 19th century and reached its peak in the 20th.

    Renaissance, Enlightenment and Modernism

    The Renaissance is understood as a broad-spectrum movement that led to a new conception of the human being, transforming it into the measure of all things. At the same time, it expressed a significant leap in scientific matters where, beyond great advances in several areas, the Earth ceased to be seen as the centre of the universe. The Enlightenment placed reason as the defining element of society, not only in terms of the legitimacy of political power but also as the source of liberal ideals such as freedom, progress, and tolerance. It was, concurrently, the period in which the notion of harmony was projected onto all orders, including the understanding of the universe. During this time, the scientific method began to be supported by verification and evidence. The Enlightenment represented a new milestone in the self-gratifying vision human beings had of themselves.

    Modernism, understood as a movement of movements, overturned prevailing paradigms in almost all areas of existence. Among its numerous expressions were abstract art in its multiple variants, an introspective narrative that gave free rein to the stream of consciousness, psychoanalysis, and the theatre of the absurd. In sum, reason and harmony were turned upside down at every step. Following its own dynamic while feeding back into the former, science toppled the pillars of certainty, including the conception of the universe built by Newton during the Enlightenment. The conventional notions of time and space lost all meaning under the theory of Relativity while, going even further, quantum physics made the universe a place dominated by randomness. Unlike the previous two periods of significant change, Modernism eroded to its bones the self-gratifying vision human beings had of themselves.

    The end of human centrality

    Renaissance, Enlightenment and Modernism unleashed and symbolized new ways of perceiving human beings and the universe surrounding them. Each of these movements placed humanity before new levels of consciousness (including the subconscious, during Modernism). In each of them, humans could feel more or less valued, more secure or insecure about their own condition and their position in relation to the universe. However, a fundamental element was never altered: humans always studied themselves and their surroundings. Even while questioning their nature and motives, they reaffirmed their centrality within the planet. As had been the case since the Renaissance, humans remained the measure of all Earthly things.

    Singularity, however, is set to destroy that human centrality in a radical, dramatic, and irreversible way. As a result, human beings will not only confront their own obsolescence and irrelevance but will embark on the path towards becoming the equals of chickens. Everything previously experienced in the march of human development, including the three groundbreaking periods mentioned above, will pale drastically by comparison.

    The countdown towards the end

    We are, thus, within the countdown towards the henhouse grounds, or worse still, towards the destruction of the human race itself. That is what Stephen Hawking, one of the most outstanding scientists of our time, believed would result from the advent of AI’s dominance. It is also what hundreds of top-level scientists and CEOs of high-tech companies felt when, in May 2023, they signed an open letter warning about the risk that uncontrolled AI poses to human survival. For them, the risk was on a par with those of a nuclear war or a devastating pandemic. Furthermore, at a “summit” of bosses of large corporations held at Yale University in mid-June this year, 42 percent indicated that AI could destroy humanity in five to ten years (Egan, 2023).

    In the short to medium term, although at the cost of massive and increasing unemployment, AI will spur gigantic advances in multiple fields. Inevitably, though, at some point this superior intelligence will escape human control and pursue its own ends. This may happen if some interested hand frees it from the “jail” imposed by its programmers. The natural culprits would come from what Harari labels the community of experts. Among its members, many believe that if humans can no longer control the overwhelming volumes of information available, the logical solution is to pass the commanding torch to AI (Harari, 2016). The followers of the so-called Transhumanist Party in the United States are a perfect example: they aspire to have a robot as President of that country within the next decade (Cordeiro, 2017). However, AI might be able to free itself of human constraints without any external help; along the road, its own self-learning process may well allow it to do so. One way or the other, when this happens, humanity will be doomed.

             As a species, humans do not seem to have much of an instinct for self-preservation. If nuclear war or climate change doesn’t get rid of us, AI will probably take care of it. The apparently imminent arrival of Singularity, thus, should be seen with frightful eyes.

    References

    Cordeiro, José Luis (2017). “En 2045 asistiremos a la muerte de la muerte”. Conversando con Gustavo Núñez, AECOC, noviembre.

    Egan, Matt (2023). “42% of CEOs say AI could destroy humanity in five to ten years”, CNN Business, June 15.

    Harari, Yuval Noah (2016). Homo Deus. New York: Harper Collins.

    Grossman, Lev (2011). “2045: The Year Man Becomes Immortal”, Time, February 10.

    IntoMath. “The Wheat and the Chessboard: Exponents”.

    Kelly, Kevin (2017). The Inevitable. New York: Penguin Books.

    Kurzweil, Ray (2005). The Singularity is Near. New York: Viking Books.

    Kurzweil, Ray (2024). The Singularity is Nearer. New York: Penguin Random House.

    Streitfeld, David (2023). “Silicon Valley Confronts the Idea that Singularity is Here”, The New York Times, June 11.

    Feature Image Credit: Technological Singularity https://kardashev.fandom.com

    Text Image: https://ts2.space

  • Seabed: “to mine or not to mine”

    Seabed: “to mine or not to mine”

    Seabed mining offers new vistas for business partnerships and joint ventures among different industries in the offshore mining supply chains.

    The month-long debate “to mine or not to mine” ended inconclusively at the 28th session of the International Seabed Authority (ISA) Assembly, held from 28 June to 28 July 2023 in Kingston, Jamaica, amid calls for a “ban/suspension/precautionary pause” on any extractive activities.

    Figure Credit: eandt.theiet.org

    The ‘naysayers’ vehemently argued for the protection of the oceans, given that these large bodies of water already face multiple and diverse natural and human-induced challenges such as climate change, unsustainable fishing, and marine pollution. Furthermore, any attempt to mine the seabed will have far-reaching adverse impacts on marine life and result in biodiversity loss, particularly because human knowledge of deep-sea ecosystems remains very limited.

    Those in favour of seabed mining argued that the energy transition is critical for sustainable development, and that a sustained supply of nickel, manganese, cobalt, and copper is therefore inescapable. These metals and minerals would have to be sourced from the seabed. For the time being, the representatives of the ISA Member States and other stakeholders have returned home to mull over the issue of seabed mining.

    The sudden hyper-activity at the ISA is a result of the June 2021 submission by Nauru, a Pacific island nation, which applied to the ISA for approval to commence extraction activities, relying on the “two-year rule”, under which the “Council shall complete the adoption of the relevant rules, regulations, and procedures (RRPs) within two years from the submission”. The two-year deadline expired on 9 July 2023, but the ISA Council, a 36-member executive arm responsible for approving contracts with private corporations and government entities, among other things, announced that it would “continue the negotiations on the draft exploitation regulations”.

    Meanwhile, at home, the Government of India is all set to exploit oceanic resources. Earlier this month, the Indian Parliament (Rajya Sabha and Lok Sabha) passed the Offshore Areas Mineral (Development and Regulation) Amendment Bill 2023 which enables extraction activities in offshore areas for mineral resources.

    It is true that offshore resource development has been a much-neglected area outside the oil and gas sector. This is notwithstanding the seminal contributions of the Geological Survey of India (GSI), which has been leading offshore scientific research and survey activities since the early sixties. The Marine and Coastal Survey Division (MCSD) of the GSI conducts numerous related activities, including seabed mapping and exploration within the Indian EEZ, and is supported by three ocean-going vessels.

    According to the GSI, as of January 2023, nearly 95 per cent of India’s EEZ of 2.159 million square kilometres had been surveyed. Since 2022, the GSI has been carrying out seabed mapping in international waters and had covered over 70,000 square kilometres by December 2023 for the “generation of baseline data along with the search for possible mineral occurrences in the Ninety East Ridge near the Equator, Indian Ocean and the Laxmi Basin (Block-I, II and III), Arabian Sea by deploying its vessels”.

    The Indian EEZ is endowed with 153,996 million tonnes of lime mud, particularly off the Gujarat and Maharashtra coasts, and 745 million tonnes of construction-grade sand along the Kerala coast. The Bay of Bengal coast (Odisha, Andhra Pradesh and Tamil Nadu) and the Arabian Sea coast (Maharashtra and Kerala) are rich in heavy minerals, and polymetallic ferromanganese nodules are available in the Andaman Sea and the waters off the Lakshadweep islands.

    Polymetallic nodules (containing copper, cobalt, nickel, manganese, rare earths, etc.) are particularly important to support India’s mission to promote the use of clean energy. In November 2022, during the G20 summit in Indonesia, Prime Minister Narendra Modi told the participating countries that by 2030, half of India’s electricity will be “generated from renewable sources”.

    The Offshore Areas Mineral (Development and Regulation) Amendment Bill 2023 has, among many measures, introduced a number of initiatives, including the “auction” of offshore mineral exploration sites and mining rights to companies, including from the private sector, thus creating a level playing field for business competition. The Bill provides for two types of operating rights through auction to the private sector: (a) production lease and (b) composite license. It merits mention that the provision for “renewal of production leases has been scrapped with a 50-year lease period to remove uncertainty for operators”, which will “give confidence to investors by bringing in transparency and fair play”.

    Seabed mining offers new vistas for business partnerships and joint ventures among different industries in the offshore mining supply chains. For instance, lifting of the extracted ore and carrying it to storage sites ashore is an opportunity for the maritime transportation sector. Similarly, environmental impact assessment, and restoration techniques when needed is a unique industry. Likewise, Industry 4.0 technology developers have opportunities to support Marine Spatial Planning (MSP), bio-remediation, bio-prospecting, and a variety of other seabed mining sectors.

    This article was published earlier in kalingainternational.com

    Feature Image Credit: euronews.com

  • Why India risks a quantum tech brain drain

    Why India risks a quantum tech brain drain

    Clear career progression would help India’s quantum workforce and avoid a brain drain overseas

    India could lose its best quantum tech talent if the industry doesn’t get its act together.

    Quantum technology has the potential to revolutionise our lives through processing speeds that once seemed like science fiction.

    India is one of a few nations with national quantum initiatives and it stands on the threshold of potentially enormous technological and social benefits.

    The National Quantum Mission, approved by the national cabinet in April, is a timely government initiative that has the potential to catapult India into a global leader in quantum research and technologies if leveraged correctly.

    Its main areas of research are quantum computing, secure quantum communications, quantum sensing and metrology and quantum materials.

    The challenge for India is how it ensures it gets the best out of the mission.

    The technology can benefit many aspects of society through processing power, accuracy and speed, and can positively impact health, drug research, finance and economics.

    Similarly, quantum security can revolutionise security in strategic communication sectors including defence, banking, health records and personal data.

    Quantum sensors can enable better GPS services through atomic clocks and high-precision imaging while quantum materials research can act as an enabler for more quantum technologies.

    But the Indian quantum ecosystem is still academia-centric.

    India’s Department of Science and Technology had set up a pilot programme on Quantum Enabled Science and Technologies — a precursor to the National Quantum Mission.

    As a result, India has a large number of young and energetic researchers, working at places such as RRI Bangalore, TIFR and IIT Delhi, who have put in place an infrastructure for the next generation of quantum experiments, with capabilities in different quantum technology platforms. These include quantum security through free space and fibres as well as integrated photonics, quantum sensing and metrology.

    The prospects and impact of quantum technologies will be hugely strategic. Predictions suggest quantum computing will have a profound impact on financial services, logistics, transportation, aerospace and automotive, materials science, energy, agriculture, pharmaceuticals and healthcare, and cybersecurity. All of these areas are strategic on macroeconomic and national security scales.

    Even as it has taken significant policy initiative to kickstart research into quantum technologies, India will need to craft a national strategy with a long-term perspective and nurture and develop its research work force.

    Clear career progression would help India’s quantum workforce. The risk of brain drain, where local talent moves overseas for better opportunities, could be a real possibility if different industries which can benefit from the technology fail to recognise its transformative capabilities and how it can help create jobs and opportunities.

    While there are multiple labs working in different quantum sectors, the career path of students and post-doctoral researchers remains unclear as there are not enough positions in the academic sector.

    One problem is industry and academia are competing with each other for quantum research funding which is why equal emphasis on quantum technology development in the industrial sector could help.

    While India does have some quantum start-ups, more lab-to-market innovations which would make the technology practically useful could give the field momentum. Currently, the big industrial firms in India are not yet committed to quantum technology.

    The lack of homegrown technologies like optical, optomechanical and electronic components for precision research is another impediment. Most of these are imported, resulting in financial drain and long delays in research.

    The National Quantum Mission could help fix a number of these problems.

    Hurdles could be turned into opportunities if more start-ups and established industries were to manufacture high-end quantum technology enabling products in India.

    Another major deterrent is the lack of coordination. Multiple efforts to develop and research the technology, across government and start-ups, do not seem to have coherence and still lack maturity. People involved in quantum research are hopeful the mission will help address this.

    Like most other countries, India has witnessed plenty of hype about quantum research. While this may help provide a short-term boost to the field, excessive hype can lead to unrealistic expectations.

    Continuing to build a skilled workforce and a clear career progression plan for those involved in research and development of quantum technologies can help secure India’s future in this space.

    There is a distinction between magic and miracles, and while believing in the former, one should not start expecting the latter, as that can only lead to disappointment in the long run.

     

    This article was originally published under Creative Commons by 360info™.