Writings

The World Bank and HIV/AIDS Governance in Sub-Saharan Africa: Perpetrators or Defenders against the Epidemic?

Since the early 1980s, the HIV/AIDS epidemic has devastated the social, economic, and political landscape of sub-Saharan Africa, and continues to pose a significant challenge to the region's development. Indeed, the widespread proliferation of the disease has set back the development of many sub-Saharan African countries immensely. According to UNAIDS, in 2013 there were an estimated 24.7 million people living with HIV in sub-Saharan Africa – nearly 71% of the global total – with 1.1 million people dying from AIDS-related illnesses.

In the fight against HIV/AIDS, organisations such as the Global Fund to Fight AIDS, Tuberculosis and Malaria (GFATM) and the World Health Organisation (WHO) have often been regarded as the international forerunners in HIV/AIDS governance in sub-Saharan Africa. However, the role and impact of the World Bank on HIV/AIDS in the region has largely been ignored within the public health and public policy literature. In fact, the World Bank has been pivotal to the effectiveness of the sub-Saharan African response to HIV/AIDS at all levels of governance, playing a decisive role in both exacerbating and reducing the effects of the disease throughout the region.

*

The role of the World Bank in HIV/AIDS governance originates in the organisation's recovery programme for sub-Saharan African economies during the 1980s and 1990s, principally through Structural Adjustment Programmes (SAPs). Yet the World Bank's market-oriented lending policy in sub-Saharan Africa was crucial in weakening the functionality of sub-Saharan African governments prior to the outbreak of the epidemic. The strict conditionalities that formed part of SAPs in order to encourage political and economic reform had extensive destabilising effects on sub-Saharan African societies, particularly on the delivery of basic services such as healthcare, education and welfare. Under the austerity measures imposed by SAPs, per-capita social spending in sub-Saharan Africa had declined by the 2000s to 76 per cent of its 1981 level, impoverishing several countries throughout the region and proving exceptionally damaging for the poorest individuals. This is despite the fact that there is little conclusive evidence that SAPs had a positive impact on economic growth in sub-Saharan Africa.

These depleted basic services weakened the governmental response to HIV/AIDS, and shaped the conditions for the proliferation of the disease. Inflated food prices lessened employment and trade opportunities, particularly in rural areas – increasing the threat of the disease by encouraging migration from rural to urban areas. Moreover, the introduction of user fees in health and educational services decreased access to treatment and HIV/AIDS awareness. In fact, basic services themselves – such as health centres – became sources of HIV infection. Cuts to civil services also restricted the capacity of several governments, deteriorating the administrative ability of sub-Saharan African countries to address the disease.

The World Bank’s response to HIV/AIDS in sub-Saharan African countries during the adjustment period was also limited. World Bank funding for HIV/AIDS projects began in 1986, but the organisation’s support remained limited over the next decade. Prior to 2000, the World Bank funded a number of generic projects with HIV/AIDS components throughout Africa, none of which exceeded US$10 million. Three projects in Zimbabwe, Uganda and Kenya did receive more substantial funding from the Bank in the early 1990s, but these projects focused more broadly on sexually transmitted infections (STIs). Although the World Bank’s relative inactivity during this period echoes the dormancy of other multilateral organisations, and the general misunderstanding and stigma surrounding HIV/AIDS at the time, as the academic Chris Simms argued, the World Bank’s significant influence and leadership in the region should have warranted a greater response against the disease. Undoubtedly, the World Bank’s inactivity and its agenda for reform through SAPs contributed to the catastrophic impact HIV/AIDS would have on sub-Saharan Africa. In fact, between the early 1980s and the 2000s, the number of people living with HIV infection in sub-Saharan Africa increased from less than one million to 22 million.


However, the growing severity of HIV/AIDS’s impact on sub-Saharan African development, and its threat to human and national security in the absence of an effective cure, would eventually prompt the World Bank to take emergency action against the disease. Thus, in 2000 the Multi-Country HIV/AIDS Programme (MAP) was launched. Under the MAP, the World Bank pledged over $1 billion to the fight against HIV/AIDS, drastically expanding the Bank’s financial commitments against the disease. The programme was designed to scale up state-led responses to the epidemic, underpinned by a multi-sectoral framework that encouraged the participation of the public and private sectors, NGOs and community organisations in anti-HIV/AIDS projects in the region. The MAP elevated HIV/AIDS governance to the pinnacle of the organisation’s development agenda in sub-Saharan Africa.

The MAP and the extensive impact of the disease on development in the region provoked a shift in the way the World Bank financed governments in sub-Saharan Africa. In contrast to the stringent conditionalities under the SAPs, MAP funding was issued without a specific link to macroeconomic performance. This allowed countries to scale up their response to HIV/AIDS without the explicit strain of further economic liberalisation. However, although the MAP appeared to represent a ‘softer’ financial relationship between the World Bank and sub-Saharan Africa, the organisation’s wider policy of ‘good governance’ and its neoliberal dogma remained intact. MAP funding may not have imposed strict economic conditionalities on sub-Saharan Africa, but it still required states to meet specific eligibility criteria that encouraged the decentralisation of HIV/AIDS governance.

It is true that the MAP facilitated a number of breakthroughs in the governance of HIV/AIDS in sub-Saharan Africa. The MAP increased political commitment to HIV/AIDS at the highest levels of government, particularly through the creation of national AIDS councils. The programme also succeeded in promoting a multi-sectoral, decentralised response to HIV/AIDS by allowing national, regional and local actors to play an active role against the disease. Furthermore, the MAP helped to mobilise larger donor initiatives such as the GFATM and the President’s Emergency Plan For AIDS Relief (PEPFAR), strengthening the response to the disease. As of 2008, development partner funding had increased by 2,240%, and 502,958 people infected or affected by the disease were receiving support.

However, the MAP was not without flaws; the programme had a persistent problem with the inaccessibility of funding for projects, which was particularly damaging for the most vulnerable in the region. This resulted from the MAP’s inability to overcome the weak institutional capacity of sub-Saharan African governments, harming service delivery and the coordination of donor efforts with public and private actors and communities. The MAP also failed to support effective monitoring and evaluation systems, which affected the reliability of HIV/AIDS outcome indicators, though efforts were made to address this in phase 2 of the MAP in 2006. The clearest indication of the MAP’s shortcomings is that HIV prevalence and infection rates have remained relatively unchanged in sub-Saharan Africa. Thus, while the MAP revolutionised sub-Saharan African HIV/AIDS governance and produced a number of positive outcomes, its overall success is questionable.

*

The World Bank’s restructuring of sub-Saharan African governance through SAPs significantly undermined the capacity of the region’s governments against HIV/AIDS, with the adjustment programmes enervating vital preventative functions against the disease and, as a result, propagating its escalation. However, by the new millennium the growing impact of HIV/AIDS on the development of sub-Saharan Africa would inspire the World Bank to redefine its approach to HIV/AIDS, and its governance reform measures in the region overall. This led to the formulation of the MAP, placing the fight against HIV/AIDS at the centre of the World Bank’s relationship with sub-Saharan Africa, and drastically increasing the organisation’s commitment to reducing the effects of the epidemic. However, the MAP has failed to convincingly tackle HIV/AIDS, and has once again demonstrated the difficulty of implementing the World Bank’s philosophy of governance in sub-Saharan Africa. As such, it remains unclear whether the World Bank can be considered a perpetrator of, or defender against, HIV/AIDS in sub-Saharan Africa. However, could the World Bank – or indeed any other organisation – have done more against the disease?

The impact of neoliberalism on post-colonial Africa in the 20th century

The decolonisation of Africa in the 1950s and 1960s was seen as the great opportunity for the continent to finally realise its potential independently. Spurred by the sustained demands for self-determination by leading nationalists such as Jomo Kenyatta and Kwame Nkrumah, many countries throughout Africa took back their sovereignty from their European possessors. For the first time in more than a century, Africans had once again acquired control of their resource-rich continent, and could now build upon their individual liberation movements and fulfil their ambitions not only to compete with, but to overtake, their former oppressors. However, merely twenty years after independence, several African countries encountered severe economic complications. Thus, as Africa’s principal development partner, the World Bank, alongside its partner organisation the International Monetary Fund (IMF), stepped in by supplying loans to cash-strapped African economies through Structural Adjustment Programmes (SAPs). However, the SAPs – and their specific emphasis on neoliberal reform – would further erode Africa’s economic capabilities, and entrench poverty throughout the region by the end of the 20th century.

African economies had originally performed relatively well after independence. As Nana K. Poku stated, during the 1960s gross domestic product (GDP) and exports grew at rates comparable to other main developing regions at the time, and generally better than South Asian countries. Economic growth in the continent was stimulated by an emphasis on industrialisation to reduce dependency on manufactured imports (with domestic agriculture being undervalued in the process), and by a significant increase in government-led initiatives to strengthen the public sector. Motivated by the popular socialist economic principles of the Cold War, and by large sums of donor support from international financial institutions (IFIs) and developed countries, government expenditure accelerated throughout Africa. By contrast, the private sector remained relatively dormant during this period. African economic policy during the 1960s led to major investments in infrastructure, and created an expansion in public health and primary education.

Unfortunately, by the 1970s African economies began to take a dramatic turn for the worse, negating the positive gains made a decade earlier. Several factors perpetuated the economic decline in Africa. Internally, in 1972–73 the continent contended with a severe drought that affected large parts of the region, shattering the already under-utilised agricultural sectors. This caused food and livestock shortages across many countries, stimulating forced migration and the inflow of refugees. Moreover, growing political instability and corruption throughout Africa resulted in the mismanagement of government enterprises, and led to the poor maintenance of highly expensive machinery and domestic labour forces.

There were also external shocks that threatened Africa’s economic sustainability. The decision by the Organisation of Petroleum-Exporting Countries (OPEC) to dramatically increase oil prices in 1973 devastated the economies of most African countries, which now had to deplete their foreign exchange reserves and acquire foreign loans to import oil. In fact, oil imports as a percentage of export earnings rose from 4.4% in 1970 to 23.2% by 1980. However, the decision was incredibly beneficial for oil-exporting states, including African countries such as Nigeria and Gabon, which were able to maximise their profits at the expense of their continental counterparts.

There was also a fall in the price of African commodities, with some commodity prices dropping below the cost of producing the commodity itself. This was driven by low consumer demand in the West emanating from the economic recession of the late 1970s. The recession intensified Western protectionism, reducing the commitment of Western governments to IFIs, while simultaneously increasing the interest rates on loans obtained by African countries. This coincided with the gradual move away from Keynesian economics in favour of neoliberalism – spearheaded by the ‘New Right,’ and manifested in the policies of Margaret Thatcher and Ronald Reagan. As a result, the attempts by the Organisation of African Unity (OAU) to quell the economic disparities through the state-driven Lagos Plan of Action were largely disapproved of by IFIs and Western governments in favour of economic reform based on free market capitalism.

Despite the internal and external shocks, the World Bank and the IMF argued that the financial difficulties in Africa were largely attributable to the defective economic policies of domestic governments, particularly excessive state intervention and gross resource mismanagement. Thus, in order to stimulate positive growth in Africa, the World Bank and IMF proposed under the principles of the ‘Washington Consensus’ that African governments should refrain from intervening in their economies and social services, liberate market forces by increasing privatisation and deregulation, reduce trade barriers and price controls, and impose strict fiscal and credit policies to control government expenditure.

Under the Structural Adjustment Programmes (SAPs) the World Bank attached the aforementioned neoliberal reforms as conditionalities to its lending policy, with the IMF assisting with the macroeconomic development of the region. This subsequently forced indebted African countries across the continent to adopt the adjustment policies prescribed by the World Bank in order to acquire the vitally needed loans from the organisation to alleviate the existing financial strains on their economies. Though SAPs were expected to encourage long-term growth, throughout the 1980s and 1990s their implementation was extremely damaging to social, economic and political conditions in Africa.

SAPs had an incredibly detrimental impact on the delivery of basic services in sub-Saharan Africa, particularly in the provision of healthcare, education and welfare. The economic reform packages severely rationalised government expenditure commitments, with the introduction of user fees diminishing access to social services, particularly for the most vulnerable in African society. During the two “lost decades” of the 1980s and 1990s, per-capita social spending in sub-Saharan Africa declined to 76 per cent of the level spent in 1981. The reductions to civil services also diminished government capacity, deteriorating the administrative ability of African countries to address public concerns. Furthermore, liberalisation policies inflated the price of food, and lessened employment and trade opportunities, particularly in rural areas. This caused widespread poverty, and led to the proliferation of disease, especially HIV/AIDS and tuberculosis. The anticipated transformation in private enterprise through the SAPs failed to stimulate growth, creating deeper holes in the finances of already-depleted services, and escalating social inequality.

The negative effects of the SAPs were compounded by the fact that they damaged the ability of African governments to service their foreign debts. In 1980, at the onset of the World Bank and IMF’s intervention in Africa, the ratios of debt to gross domestic product (GDP) and to exports of goods and services were 23.4% and 65.2% respectively. By 1990, they had deteriorated to 63.0% and 210.0% respectively. African governments can only pay off debts from earnings in foreign currency (through exports, aid or additional loans), making it incredibly difficult for countries to manage their debts.

The scale of the problem is overwhelmingly illustrated by the fact that between 1980 and 2000, sub-Saharan African countries paid more than $240 billion in debt service – about four times the amount of their debt in 1980. Moreover, by the end of the 20th century all African countries, with the exception of South Africa, were spending more on debt repayments than on domestic healthcare. The World Bank and IMF did introduce the Heavily Indebted Poor Countries (HIPC) initiative in 1996, providing debt relief and low-interest loans to the world’s poorest countries. However, eligibility for the initiative hinged on the successful implementation of SAPs, which, as stated above, often led to the decline of African countries in any case.

The inception of the HIPC coincided with the introduction of Poverty Reduction Strategy Papers (PRSPs), which eventually replaced SAPs in 2002 with a poverty-focused approach to economic development. Indeed, the World Bank would ultimately acknowledge in several of its evaluation reports on sub-Saharan Africa that the SAPs had very little impact on economic growth. That said, rather than place the blame on the destructive nature of its neoliberal economic agenda, the World Bank argued that African governments had once again failed to manage their economies effectively, and that they had poorly implemented SAPs.

PRSPs have placed a greater emphasis on reducing poverty in Africa, and have led to a more country-driven direction for development, unlike the “one size fits all” approach under the SAPs. Despite this, PRSPs have continued the World Bank and IMF’s insistence on neoliberal reform, with both organisations maintaining the imposition of conditionalities on their lending policies. The positive impact of PRSPs remains questionable, and they have ultimately failed to dispel the extensive criticisms and negative perceptions aimed at the World Bank and IMF for causing needless hardship in Africa that continue to this day.

Should the United Kingdom leave the European Union?: Challenging Eurosceptics on the British economy

On 23rd June 2016, the United Kingdom will vote to decide whether the country is to remain or leave the European Union. For the first time in a generation, the general public will have the opportunity to formally declare their position on the U.K.’s membership in the EU, a declaration that could potentially transform not only the country’s political, social and economic landscape, but the landscape of its European and international neighbours. In the lead up to the EU referendum, the main political parties have announced their allegiances to the opposing campaigns, which have either defended the U.K.’s commitment to the EU, or have argued for the country’s abandonment of the European politico-economic project. Within the Conservative Party, the EU referendum has reignited the historic divisions concerning the U.K.’s membership, creating a schism among the Tory leadership.

It is true that both the ‘In’ and ‘Out’ campaign groups have displayed instances of unsavoury scaremongering to encourage support for their respective causes. However, the fear-inciting tactics used in the defence of Brexit are in my view particularly concerning. Proponents of the Leave campaign, such as Nigel Farage, Michael Gove and Boris Johnson, have used the EU referendum to propagate misconceptions regarding the ailing state of the British economy, and to promote an aura of hate towards immigrants across the country. The vile effect of the claims made by the ‘Out’ campaign has been exemplified by the brutal murder of Jo Cox MP by far-right activist Thomas Mair, who had previous links to neo-Nazi and pro-Apartheid organisations. But are ‘Brexiters’ correct in suggesting that the U.K. will be able to improve the economy and control immigration if the country leaves the EU?

One of the central arguments advanced by the ‘Out’ campaign is that a Brexit would allow the U.K. to negotiate its own trade deals with other countries independent of existing EU trade agreements. There are currently no tariff barriers to trade for countries within the EU, creating a single market of goods and services (though non-tariff barriers such as product standards produce limitations on trade). However, EU member states (including the U.K.) are also subject to the trade agreements made with non-EU countries by the European Commission.

The EU trade agreements prevent the U.K. from negotiating separate trade deals with the rest of the world – which may appear suffocating given the rise in economic growth in the developing world, and the ever-increasing U.K. trade surplus with non-EU countries. In comparison, the U.K. has a trade deficit of £61 billion with the EU, with domestic consumer demand outweighing the value of our exports to the continent. In fact, U.K. exports have made a gradual shift from the EU to non-EU countries, with a decline in the EU share of total exports from 54.8 percent to 44.6 percent from 1999 to 2014. For Brexit campaigners, this illustrates the potential of the U.K.’s economic growth in the rest of the world, which could be maximised as a result of leaving the EU.

While it may be advantageous for the ‘Out’ campaign to postulate positive economic outcomes if the U.K. were to abandon the EU, we are simply unable to predict what kind of trading relationship the U.K. would be able to negotiate with the EU, or with the rest of the world. Eurosceptics have argued that the U.K. could negotiate a trade deal with the EU similar to that of Norway and Switzerland. Both countries are not members of the EU, but have access to the single market and can negotiate trade deals with non-EU countries independently. However, Norway and Switzerland are subject to EU rules and policies with limited or no influence, and make budget contributions to the union. Moreover, both countries must abide by the free movement of goods, services and, most importantly, people.

On the other hand, Canada has also negotiated tariff-free access to the EU, without making any budgetary contributions or accepting the free movement of people. However, the trade agreement took an incredible five years to complete, and is difficult to compare to a potential U.K. model, as the EU’s share of Canada’s total exports was less than 10% – significantly less than the U.K.’s. In the worst-case scenario, where the U.K. is unable to make any sort of trade agreement with the EU, the country would be subject to World Trade Organisation (WTO) rules. This would exclude the U.K. from free trade agreements for goods or services with the EU, and the country could face EU and non-EU external tariffs and no access to the single market. This uncertainty over what kind of trade deal can be arranged would heavily impact the productivity of the U.K.’s economy in the interim, as the EU is the country’s main trading partner, evidently making Brexit a risky, uncalculated gamble.

Another important argument put forward by Eurosceptics regarding the economy is the view that the U.K. could save an estimated “£350 million a week” by leaving the EU – as quoted by Boris Johnson in a recent television debate – as the country would no longer have to make a financial contribution to the union. It has been suggested that the £18 billion saved could instead be invested in public services, such as education or health care. However, as the Institute for Fiscal Studies noted, the figure of “£350 million a week” is absurd, as it ignores the rebate the U.K. receives from the European Union. If the rebate is included, the figure actually stands at £14.4 billion, or 0.8% of GDP in 2014 (around £275 million a week).

The EU also spends £4.5 billion on the U.K. in areas such as agriculture and research, so the U.K. only provides approximately £8 billion a year to the EU budget. Moreover, though the U.K. receives the lowest per-capita spend from the EU of any member state, EU spending is only 1 per cent of our GDP – lower than the EU average, and a relatively small amount in comparison to the £169 billion spent on health and education alone by the government. It is also important to note that the membership fee brings with it increased trade, investment and jobs, which the U.K. would lose if it is unable to negotiate a trade deal with the EU. In fact, many economists have concluded that lower economic growth as a result of leaving the EU would negate any savings made from EU contributions anyway.

Arguably the most distinguishable argument of the ‘Out’ campaign is the impact of EU immigration on the U.K. economy. UKIP leader Nigel Farage in particular has been defiant in his attempts to reduce immigration from the EU into the U.K., most demonstrably through the party’s recent “breaking point” campaign poster. However, there is no conclusive evidence to suggest that EU immigrants cause a strain on the British economy, with most studies suggesting that their impact is relatively small, costing or contributing less than 1% of the U.K.’s GDP. Rather, a recent UCL report stated that since 2000, immigrants have made a more positive fiscal contribution than longer-established migrants – £5 billion from immigrants who entered the U.K. from countries that joined the EU in 2004. There is also evidence to suggest that EU immigrants pay more in tax than they gain in welfare or public services, and that they have a higher net fiscal impact than non-EU migrants.

In any case, leaving the EU may not lead to a significant reduction in immigration. As alluded to above, the kind of trade deal the U.K. is able to negotiate with the EU after “Brexit” is pivotal to controlling immigration, as access to the EU single market is also likely to entail the free movement of people. Therefore, it may require the U.K. to leave the single market altogether in order to effectively control immigration, which could nevertheless have a negative impact on the economy. In my view, the argument relating to the impact of immigrants is largely cultural; as Nigel Farage proclaimed, “the country has become unrecognisable.” However, while the cultural impact of EU immigration is questionable, there is enough evidence to suggest that the economic value of EU immigration is likely to be beneficial to the U.K., counterpoising any superficial arguments that a Brexit will conclusively have a positive impact on the U.K. economy.

Finally, let us not forget the overwhelming consensus of economic institutions, researchers and businesses (according to the British Chambers of Commerce) that the U.K. must remain in the EU in order to preserve economic stability and continued growth. There is one notable exception to this trend – the suitably named Economists for Brexit. Supported by Patrick Minford and Roger Bootle, Economists for Brexit have predicted a net benefit to the U.K. economy. However, even the most comprehensive analysis of Brexit, by the independent think tank Open Europe, predicts that if the U.K. is able to negotiate a trade deal, GDP could increase by 1.6% by 2030, whereas if a trade deal is not negotiated, GDP could be 2.2% lower over the same period.

The argument against Eurosceptics is clear. I plead with the people of this great land, do not put our country at economic risk.

Reimagining race: Should the colour of your skin still matter?

Since the dawn of mankind, human beings have used various forms of identification to distinguish themselves from their counterparts. In the contemporary world, social identity has been shaped by notions such as a person’s religious beliefs, cultural attitudes, sexual orientation, or their disposition towards a particular gender group. However, it is the categorisation of ‘race’ that serves as the fundamental component of our identity today, structuring civilisation into a complex intermixture based on the colour of our skin. This structure has repeatedly produced adverse implications for the cohesion of humanity, both on a local and global scale.

Unlike other forms of identification, the modern international system of capitalism was built upon the conception of ‘race,’ emanating from the exploits of slavery and colonialism. Indeed, in order to rationalise the seizure of human beings, territory and resources, European explorers categorised people across the world according to their pigmentation, building a meticulous framework that not only emphasised their superiority, but demonised and subordinated the ‘alien’ populations they encountered. Supported by the scientific revolution of the early modern period, and the bastardisation of Christian beliefs by proponents of the church, Europeans were able to justify the fabricated damnation of people considered ‘black,’ and the hierarchical aggregation of other ‘coloured’ peoples.


Despite the abolition of slavery, the decolonisation of large parts of the world, and the global fight for civil rights against racial segregation and discrimination, the differences in our skin colour remain an impediment to social justice, and contribute to the existing divisions already present in the world today. ‘Race’ continues to hold a transcendent influence on the provision of wealth and economic opportunity, health, education, friendly and intimate relationships, civil obedience, and other areas of the domestic and international arena, often to the detriment of those formerly subjugated by the racial categories built by imperialists centuries ago. Moreover, while this article will not delve into the common stereotypes associated with different ‘racial’ groups, we all recognise (whether consciously or subconsciously) that they linger in the minds of every thinking individual on this planet, ultimately contributing to the maintenance of the unjust racial status quo.

But why must humanity remain confined to such illusory ‘racial’ conceptions? The globalised nature of the 21st century has only reinforced the fallacy of the long-standing ‘racial truths’. Thus we, as human beings, must now avoid confining our identities to such preposterous beliefs, which were originally created in order to purposely propagate our separation. ‘Race’ is a social construct, a construct developed by the holders of power to prevent the societal conditions necessary to produce equality. As such, ‘race’ should be considered as trivial as the colour of our eyes, the hairs on our heads, or the lines on our hands, in order to end the dictatorial influence of these tendentious racial ideas.

How can social cohesion ever be achieved if I preserve my identity as a ‘black’ man, reigniting the suffering of peoples that share the same shade of skin, while another preserves their ‘whiteness’ and their ‘ideological superiority’ over the rest of the world? Humanity must agree that our race has no colour, and is not a stimulus for societal domination or subjection. With the elimination of ‘race’, we naturally eliminate the ancestral relationship between the oppressed and the oppressor, the slave and the slave owner, the conquered and the conqueror.

Social differences exist, and they always have, but ‘race’ is not a condition that should lead to our disunity. Humanity is built upon the physical differences between man and woman, the ideological and theological dissimilarities regarding the proof of our existence, and the cultural and linguistic distinctions that result from our individual geographical origins. However, the importance of our skin colour should only have a bearing on our identity if we consider the hierarchy of different racial groups to be intrinsic to our nature – which, in my view, is completely baseless.

On reflection, it is time the world re-evaluated the role of ‘race’ in our societies, and its significance in the wider context of our post-imperialist global environment. If my view appears utopian, then unfortunately you are controlled by the deep-rooted psychological manipulation of European expansionism. Wake up.

Food for thought.

Is Western foreign policy to blame for the ‘European migrant crisis’?

Over the last few weeks, the British public have been well informed of the harrowing experiences faced by migrants travelling to Europe in an attempt to escape the harsh realities of their homelands. As a result, the media frenzy over the ‘European migrant crisis’ has not only raised fears across the continent over the safety and wellbeing of the incoming migrants, but has also led to an increase in xenophobia towards the thousands of escapees in search of a better life. In Britain there has been growing apprehension and antipathy aimed at migrants, swayed by a refusal to accept their newfound habitation in the European Union. This view is somewhat epitomised by David Cameron’s use of the word ‘swarm’ to describe the numbers of people crossing the Mediterranean Sea in search of security in Europe.


But should Britain and its Western compatriots shoulder some of the blame for the recent influx of migrants to the European Union? In answer to my own question: yes. Of course the West should hold a significant amount of responsibility for the crisis.

First and foremost, while the crisis is commonly described as an issue concerning ‘migrants’, in reality a huge percentage of those arriving in Europe are in fact refugees seeking asylum. This would suggest that the people arriving in the EU are generally not in search of the financial benefits that the region has to offer – as the term ‘migrant’ has often implied. Rather, they have fled from the horrors of violent conflicts, or political and social persecution in their countries of origin.

However, most importantly, the majority of the people claiming refuge have directly (or indirectly) been affected by the often self-seeking foreign policies of the West outside of the EU. The largest number of migrants travelling to Europe are from Syria. Since the turn of the year, over 100,000 people have fled from the destructive Syrian Civil War, which has seen the country torn between the regime of Bashar al-Assad and armed militias, including the new proponent of Islamic radicalism, the Islamic State of Iraq and Syria (ISIS). Although the West have stopped short of armed intervention in the civil war, the United States and Britain have consistently bolstered the Syrian opposition by supplying intelligence, training and ammunition to groups fighting against the government. This has only helped to exacerbate the severity of the conflict, and further blight the lives of civilians in Syria.

Afghans also constitute a significant number of the refugees arriving in the EU, largely due to the fallout from the US- and NATO-led War in Afghanistan. The conflict, which was motivated by the September 11 attacks, has caused widespread devastation across the country. Fourteen years later the war continues to plague the citizens of Afghanistan, with the somewhat unsuccessful removal of the Taliban proving in hindsight to be a costly and ruinous venture for the West. According to the United Nations High Commissioner for Refugees (UNHCR), over 2 million people have been classified as refugees since hostilities began in Afghanistan.

Like in Syria and Afghanistan, recent Western involvement in Libya and Iraq has extended the extirpation of civil society across the Middle East and North Africa, and as a result has increased the number of refugees entering the EU. In 2011, NATO intervened against the Libyan dictator Muammar Gaddafi following his apparent refusal to cease actions against civilians that the West considered ‘crimes against humanity.’ Three years later, with the help of his Western allies, Barack Obama would initiate another offensive in Iraq against the growing influence of ISIS in the north of the country.

In both conflicts, the West have to some extent been successful in achieving their military objectives. With that said, the West have also deepened the humanitarian crises in Iraq and Libya, and have played a decisive role in worsening the living conditions of civilians. In Iraq alone, the UN Office for the Coordination of Humanitarian Affairs estimates that roughly 5.2 million people now need humanitarian assistance, including food, shelter, clean water, sanitation services, and education support. Furthermore, the demise of national order and security within Libya is embodied by the fact that migrants are repeatedly using its northern border as an escape route to Europe. Incredibly, in August 4,000 people travelling from Libya were saved from boats off the coast of the country in what has been described as one of the largest rescue missions of the ‘crisis.’

Away from the Middle East and Libya, a growing number of refugees continue to emigrate from Eritrea, which is recognised by the UN as having one of the worst records for human rights. In recent months, thousands of Eritreans have escaped to the shores of Italy in response to the oppressive, single-party state under Isaias Afwerki. However, despite the West’s insistence on protecting the liberties of people across the world – as demonstrated in its exploits in the Middle East and Libya, the plight of the Eritrean people has largely been ignored by Western policymakers and their mainstream media. This is typified by the fact that reports on the European migrant ‘crisis’ have scarcely mentioned the number of Eritreans entering the EU or the reasoning behind their escape. Unlike the Middle East and Libya, Eritrea does not present itself as a ‘goldmine’ of natural resources, and as a consequence it falls outside of the West’s economic and political interests.

As the members of the EU scramble to find a solution to the migrant ‘crisis,’ a multitude of people will continue to risk their lives to leave behind the trials and tribulations they face in their homelands. Europe has now become the ‘promised land’ for many of the migrants escaping their mother countries, with approximately 340,000 men, women and children having already journeyed through cruel and unsavoury terrain towards the EU border. However, many have perished in the most inhumane of circumstances on the road to a safer environment that they could one day call home. But will there be a robust resolution to the European migrant ‘crisis’? It is unclear. What is certain is that the West must accept considerable accountability for the plight of migrants and take the necessary steps to make certain that they are given the protection they deserve in Europe. Going forward, it is vital that the West re-evaluate its foreign policy and ensure that lessons are learned, to prevent a migrant crisis on this scale from occurring in the future.

Should we question the integrity of the Conservatives’ victory in the 2015 General Election?

On Friday 8th May 2015, David Cameron stood victorious in front of the British media after witnessing the Conservative Party defy the polls and sweep to an unexpected triumph in the general election. In what was one of the most anticipated elections in recent history, the Tories defeated their opponents across the country and collected 331 seats. This would ultimately be enough to form an outright majority and eliminate the prospect of another coalition government. No longer bound by the chains of the Liberal Democrats – soundly beaten during the general election, and with Labour and UKIP also left with some ‘soul-searching’ to do, Cameron gleefully returns to 10 Downing Street with the future of the United Kingdom in his hands.

Despite David Cameron’s now famous victory, a burning question remains of great interest to political commentators and the public as a whole: did the Conservative Party actually deserve to win the general election? Yes – if the result of the general election is taken at face value. The Tories won five more seats than the 326 required to form a slender majority administration, seizing 24 seats from their political rivals and surprisingly increasing their percentage of the vote.


However, if we are to assess the Conservative Party’s share of the vote, and more importantly the often disputed ‘First-Past-The-Post’ voting system, there is an argument that challenges the Tory supremacy over the House of Commons. The Conservatives did muster an impressive 36.8% of the public vote, but had this percentage been translated into seats under a system of proportional representation, the Tories would not have reached the number of seats they tallied at the end of the election.

Arguably as telling was the detrimental effect of ‘First-Past-The-Post’ on the other political parties. Although Labour’s share of seats would have seen only a minimal decrease had it reflected the public vote, UKIP and the Liberal Democrats suffered immensely as a result of FPTP. Despite gaining 12.6% and 7.9% of the vote respectively, UKIP and the Liberal Democrats won only nine seats between them. Conversely, the SNP – widely applauded for their success in the general election, gained 56 seats despite obtaining only 4.7% of the vote. Whilst the Conservatives would have remained the largest party under a proportionally representative voting system, they would certainly have been forced into forming another coalition government and contending with a greater number of diversely affiliated MPs in Parliament. Unfortunately, however, the Conservatives can expect a somewhat ‘muzzled’ resistance against their governance over the next five years as a result of the flawed FPTP system.

Another argument that questions the integrity of the Conservative victory concerns the indecisive nature of the British electorate and the increased use of ‘scare tactics’ within right-wing politics. The opinion polls right up to the general election suggested that the final result would be too close to call, with Labour fighting tooth and nail with the Conservatives for the right to be in power. However, the exit poll would paint an entirely different picture, teasing a Tory victory by implying that they were inches away from the majority they required to govern alone.

The ‘Shy Tory Factor’ has been used to explain the disparity between the opinion polls and the outcome of the general election. Seemingly, as in the 1992 general election, a large portion of the British public ‘disguised’ their support for the Conservatives until polling day as a result of their ‘shame’ at their inclinations towards the blue corner of British politics. However, whilst this theory may be true for a portion of the ‘silent’ Tory vote, it appears that many balloters were simply undecided as to whom to vote for, and made the decision to vote Conservative without having a clear and definitive allegiance to the party on (or in the days before) polling day.

The decision-making of the ‘undecided voter’ during the election was a reflection of how the Right had instilled fear into the electorate. The Conservatives – and more profoundly UKIP, flooded the British public with misconceptions over Labour’s destruction of the economy and over immigration, whilst also igniting the subject of Britain’s withdrawal from the European Union. This ultimately proved to be a successful tool in dramatically increasing the share of the vote for the Right, and consequently destroyed the possibility of the Labour Party gaining a majority in Parliament. Roy Greenslade expressed the view that in the five years leading up to the election, the right-wing press continually propagated the views of the Conservatives and UKIP and gave ‘disproportionately favourable coverage to Nigel Farage and his party.’

Building on this, the role of the media cannot be underestimated in swaying public opinion, and fundamentally in providing the Conservatives with the platform to win the general election. The Conservatives were backed by six major newspapers – The Sun, The Telegraph, the Financial Times, the Daily Mail, The Independent and The Times, all of which bombarded their readers with Tory propaganda and anti-Labour hyperbole in an attempt to maximise the Conservative vote. The Sun’s insistent character assassination of Ed Miliband was particularly noteworthy in the lead-up to the election, damaging the reputation of the former Labour leader and his credentials as a future Prime Minister. The newspaper’s ‘Save Our Bacon’ headline a day before the election was decisive in sealing the fate of Miliband’s campaign, and was arguably the pinnacle of The Sun’s attempts to persuade its five-million-plus readership to support David Cameron.


Broadcasters also appeared to be pro-Conservative; Sky News – another media organisation in which Rupert Murdoch has a vested interest, unsurprisingly reproduced the favouritism it showed towards the Tories in the 2010 general election. This was evident during the first televised election debates, where there was no question that Ed Miliband received a far more stringent assault on his character from Jeremy Paxman and Kay Burley than his Conservative adversary, including personal jibes at his relationship with his brother.

Contrary to popular belief, the BBC has also been noted for its bias towards the Tories. As Phil Harrison eloquently put it in his article ‘Keeping An Eye On Auntie: The BBC & Pro-Tory Bias,’ the BBC leaned increasingly towards the right in order to appease the ‘status quo’ of capitalist realism that has been promoted by the Conservatives. The lack of attention across all of the major broadcasting organisations following the recent protests against the government, only days after the election, further underlines the overwhelming media disposition towards the Tories.

Now that the dust has settled from an extraordinary general election, the Conservatives will go down in folklore as the ‘conquerors of coalition politics’ despite the universal expectation of a hung parliament. However, as this article indicates, the Tory victory reeks of unworthiness, not least due to a widely criticised voting system, an electorate with ambiguous political affiliations, and a media environment that showed considerable support towards David Cameron and his party – as well as the Right in general. Of course it would be wrong to deny the deficiencies in the Labour election campaign, though there is sufficient evidence to suggest that the Conservatives were given more than a helping hand towards their election achievements. Nevertheless, be sure to expect the Conservatives’ austerity programme to go into full swing, and the continual vilification of Labour’s ability to govern the country again by a pro-Conservative press. Oh, the despair of another five years with Mr Cameron at the helm of this wonderful country.

The Île-de-France attacks and the proliferation of Western ‘Islamophobia’

Before I begin, I must stress that I in no way, shape or form condone the actions of the gunmen who shot dead twelve people after storming the offices of the satirical newspaper Charlie Hebdo, or the subsequent acts of terror that took place in the Île-de-France region thereafter. Moreover, I do not sympathise with the aims and objectives of al-Qaeda, Islamic State, Boko Haram, or any other group associated with Islamic radicalism. However, I have certainly struggled to comprehend the growing antipathy towards Muslim communities – and against the integrity of Islam, as a result of the fatal shootings in the French capital. In the last few weeks ‘Islamophobia’ has heightened across the Western world, perpetuated by the reactions of politicians, various media outlets, and other commentators, leading to a profusion of verbal and physical attacks against Muslims, most notably epitomised by the Chapel Hill shootings.


On 7 January 2015, Saïd and Chérif Kouachi would initiate a chain of attacks that would stun the world, generating immediate attention worldwide and causing shockwaves across social media. In total, 17 people were killed, with many others wounded, in what was France’s deadliest act of terrorism since the Vitry-Le-François bombing of 1961. The events would receive widespread condemnation from across the world, most evidently from Britain, the United States, and Israel, who voiced their detestation and reinforced their aims to tackle Islamic radicalism. The French people would also rally against the actions of the terrorists, with the rest of the Western world quickly following suit under the banner of “Je Suis Charlie.”

However, despite the public knowledge that the terror attacks had been motivated by Charlie Hebdo’s controversial lampooning of the Prophet Muhammad in the newspaper’s previous publications, the West comprehensively defended Charlie Hebdo’s depictions of the ‘founding father of Islam’ in support of the liberal concept of ‘Freedom of Speech.’ This directly challenged a key Islamic principle that forbids images of Muhammad and, in turn, was a demonstration of the West’s attempt to sow dissension between Islam and Western ideology after the Île-de-France attacks. This was exemplified further by Barack Obama and David Cameron’s statement that they would “continue to stand together against those who threatened their (the West’s) values and their way of life,” and, along with France, they made it clear to those “who think they can muzzle freedom of speech and expression with violence that their voices will only grow louder.” There was a determination to remove all accountability from the publications of Charlie Hebdo, and an unwavering ignorance by the West of the undeniable disrespect shown by the newspaper towards a faith followed by millions of people across the world.


Not only did the Western world largely disregard the argument that the portrayals of Muhammad were a suitable catalyst for the Île-de-France attacks, but Charlie Hebdo would also receive unprecedented support for its first edition immediately after the events in Paris – which once again depicted the Prophet Muhammad satirically. The edition of 14 January 2015 sold over seven million copies, in contrast to the standard nominal print run of 30,000. This exposed the scale of the support for ‘Freedom of Speech’, the extent of the West’s insolence towards Islam, and its negligence towards what the religion’s believers considered blasphemy. The wide-scale publication of the edition also served to disillusion a large number of Muslims across Europe and the United States, and was yet another example of the West’s naivety in relation to the impact the publication’s success could have in fuelling Islamic radicalism further.

‘Islamophobia’ was not confined to the events surrounding Paris but manifested itself in a number of other ways; a week after the Île-de-France attacks, 128 anti-Muslim incidents were registered with the French police, in comparison to a mere 133 in the whole of 2014. In fact, many of the incidents included shootings, attacks against mosques, and threats or insults, much of which received minimal or no media attention. This illustrates the considerable disregard for the welfare of Muslims in France, and is evidence of the contrasting approach to acts of terror when Muslim communities are on the receiving end. Furthermore, there were many reports of Islamophobia in Germany, the United States and other parts of the Western world.

As predicted, the Île-de-France attacks invigorated the far-right movement, with the likes of Marine Le Pen and Geert Wilders using the events to rationalise their anti-Muslim agendas. There were also other high-profile episodes of ‘Islamophobia’ on social media; for example, Rupert Murdoch’s comments on Twitter suggesting that Muslims must be held responsible for the acts of terror. This is unsurprising given the racist nature of some of Mr Murdoch’s statements in the past, and the fact that Twitter and Facebook have been used as fitting avenues for the flourishing of ‘Islamophobia’, as per an investigation by the Independent. The views of terrorism ‘expert’ Steven Emerson, who falsely claimed that Birmingham was a completely ‘Muslim city’ and that in areas of London (and I quote) “there are actually Muslim religious police that actually beat and actually wound seriously anyone who doesn’t dress according to religious Muslim attire,” only reiterate my argument that Western ‘Islamophobia’ is on the rise.

Arguably the most identifiable instance of ‘Islamophobia’ occurred on 10 February 2015, in what is commonly known as the Chapel Hill shootings. Merely a month after the Île-de-France attacks, three innocent Muslims were shot dead by Craig Hicks in another act of terror. While the murders have largely been reduced to an ‘isolated dispute over parking,’ there is sufficient evidence to suggest that they were motivated by hate and Hicks’s aversion towards the religion of his victims. Again, unlike the Île-de-France attacks, the Chapel Hill shootings were given minimal attention by the Western media, and met with a deafening silence from the very politicians who argued so valiantly against the atrocities in France. An American news company even went as far as interviewing Craig Hicks’s wife, giving her the opportunity to defend her husband’s actions and proclaim his efforts in ‘championing the rights of others’, despite considerable reports of his ‘gun-happy’ and extreme atheist tendencies. Undoubtedly, it would be hard to imagine Western media ever interviewing the wife of a Muslim man who had orchestrated a potential hate crime, particularly in a world where Islam is so often feared and repudiated.


As demonstrated, there has been a dramatic rise in ‘Islamophobia’ in the weeks since the Île-de-France attacks, which has once again served to stigmatise and vilify Islam and its worshippers. A concept that solidified in Western culture after the events of September 11, ‘Islamophobia’ has resurfaced to target Muslim communities and reinforce the historical divisions between the Islamic and Western Judeo-Christian civilisations. Can we foresee an end to the cultural persecution of Islam? Maybe, but if it remains the West’s objective to antagonise an entire religion for the actions of an isolated few, the world may not see a cessation of ‘Islamophobia’ for a considerable time yet.

 

The regeneration of Hackney – the saviour of a borough in despair

Historically, the London Borough of Hackney was renowned as one of the poorest areas in Britain, plagued by widespread poverty, unmanageable social tensions, and an epidemic of criminal activity. From the derelict housing estates to the considerable deficiencies within the borough’s schools and hospitals, residents from all walks of life encountered scenes of marked deprivation. However, since the turn of the century Hackney has emerged as a vibrant centre of affluence, propelled by the booming property market in the area and a growing incentive by local government to address the borough’s wealth inequalities – supported by the advent of the 2012 Olympics. With that said, an influx of middle-class “voyagers” has settled in Hackney, coinciding with the flourishing atmosphere within the borough and merging with a working-class population well aware of the relative indigence of yesteryear. Ultimately, however, it would be difficult to deny that the regeneration and gentrification of Hackney has not only revitalised the borough, but lifted it out of its incredibly ruinous past.

Over the last two decades Hackney has made a considerable effort to purge its economic and social shortcomings. Beginning in the late 1980s, the Council planned to rid the borough of its “sink estates”, resulting in the demolition of Trowbridge, Clapton Park, Nightingale, Holly Street (where 80% of residents had applied for a transfer) and the Kingshold Estate. The Woodberry Down, Haggerston, Kings Crescent and Pembury Estates are also currently facing reconstruction. In their place, more traditional low-rise housing has appeared, along with a plethora of privately owned developments. Moreover, since 2006, under the Decent Homes programme, the Council has invested over £184 million in renovating thousands of existing homes.


The closure of Hackney Downs School, Kingsland School and Homerton College of Technology, due to the below-par performance of their students and recurrent behavioural issues, stimulated the emergence of innovative academies, beginning with Mossbourne Academy in 2004. This, along with the Learning Trust’s dominion over education in Hackney, has led to the rebuilding of all secondary schools and structural improvements to primary schools across the borough, decisively improving education. From 2006 to 2013, GCSE results (5 A*-C) increased from 50.9% to a staggering 79.6%. Combined with advancements in health care and transport – particularly the expansion of the London Overground following the completion of the East London Line, the fortunes of the borough have taken a considerable turn for the better. There have also been increased attempts by the Metropolitan Police (and more specifically Operation Trident) to tackle the prominence of gangs in Hackney after the ill-fated riots of 2011.

Undoubtedly, the recent trends in Hackney have not only statistically reduced crime and poverty rates across the borough, but more importantly have been effective in dissolving the borough’s long-established adverse reputation. The Metro newspaper recently ranked Hackney as the 2nd best borough in London – remarkable considering Hackney was once perceived as the worst place to live in Britain. Furthermore, the number of benefit claimants has fallen by 6% since 2006, employment rates have steadily risen in the same period to 63.7%, and deprivation across Hackney has seen a sharp fall, particularly in the Haggerston, Clissold and Lordship wards. These factors have intertwined with the shift in the area’s demographics, from a low-income, impoverished community to a prosperous and blossoming place to reside and visit.

Whilst it would be difficult to suggest that the regeneration of Hackney can completely disguise the remnants of poverty in the borough, or avoid an ‘indigenous’ population harbouring antagonisms towards the appearance of wealthier newcomers, it has generally had a positive impact in rescuing the borough from further decline. How long will the social prosperity last? It is difficult to estimate; however, what is for certain is that there is evidence to suggest that gentrification has preserved my place of origin from returning to the social horrors it was once accustomed to.

 

Is the world still concerned that over 200 schoolgirls remain missing in Nigeria?…

Once Michelle Obama joined the worldwide calls for the return of the missing schoolgirls in Nigeria – mercilessly kidnapped by the Islamist militant group Boko Haram, it appeared that the ‘Bring Back Our Girls’ campaign had successfully galvanised the world’s leaders into actioning the swift return of the young students. However, despite growing international concern and the various measures implemented by the Nigerian government to rectify the predicament, the schoolgirls remain in captivity. Thus, compounded by the fact that nearly four months have passed since the baneful kidnappings, are we to conclude that the world no longer cares that 200 schoolgirls linger in the hands of an organisation identified as a ‘terrorist’ group and a global threat?

What is for certain is that we are aware of where the abducted schoolgirls are located – at least theoretically. On 26 May 2014, a Nigerian military official confirmed that they had ‘found’ the missing girls, yet for security reasons they could not disclose their whereabouts, nor could they use force to rescue them due to the possibility of ‘collateral damage.’

However, it appears that this was a disguise for the evident ineffectiveness of the Nigerian government, who have not only failed to curb and address the actions of Boko Haram in the past (let alone in this instance), but have also allowed the Islamist group to commit over a dozen other crimes throughout the country since the abduction, including the assassination of a Muslim leader in Borno state and the seizure of the vice-prime minister’s wife. As a matter of fact, the Chibok schoolgirl kidnappings are just another episode in a catalogue of atrocities instigated by the radicals, and in turn another example of the incompetence of Goodluck Jonathan and his regime in quelling Boko Haram’s growing influence in Nigeria. The conclusion of the Nigerian inquest into the kidnapping at the end of June epitomised the toothlessness of the Nigerian government, with its findings simply acknowledging the number of girls still in confinement.


If the lacklustre attempts of the Nigerian government raise questions over whether the return of the kidnapped schoolgirls remains a priority, the actions of the world’s leaders only add weight to the assumption that the world has turned its back on the students. Undeniably, the ‘Bring Back Our Girls’ campaign led to widespread media coverage and raised attention of the kidnapping on an unprecedented scale; however, since the world’s celebrities posted their photographs online displaying their banners of protest, the international empathy has dramatically dwindled. In the short term, many nations – including the United States and United Kingdom, offered support in the form of intelligence experts to aid the Nigerians in their search for the young children. In the long term, though, the missing schoolgirls have largely disappeared from the consciousness of the international political and social arenas, and have become a distant memory amid the advent of the Israeli-Palestinian conflict, the Ebola outbreak and other pressing internal and external issues. Similarities can be drawn between the ‘Bring Back Our Girls’ and ‘Kony 2012’ campaigns, both of which became worldwide phenomena overnight but fundamentally faded due to a lack of perpetual support for their respective causes over the long haul.

So, is the world still concerned that over 200 schoolgirls remain missing in Nigeria? Seemingly not; however, it is important that we continue to raise awareness of the abducted students and not consign the crimes of Boko Haram to the abyss of unsolved mysteries. Importantly, although we live in a world where the news is constantly changing, we should grow more vigilant in sustaining our campaigns against the atrocities we witness – wherever they may be in the world, as people power (as in this case) has proved in the past to be an effective lobbying tool in determining the actions of politicians. However, we must refrain from our Western-centric perceptions and apprehend that the plight of anyone in the world is equal in weight to our own understanding of how people should be treated, particularly when it concerns children. Ultimately, though, we can only begin to imagine what the reaction would be if over 200 girls went missing at the hands of ‘terrorists’ in the Western world...

Why the Israeli-Palestinian conflict is a product of miscalculated foreign intervention

In the ongoing struggle between Israel and Palestine, one question that surely springs to the minds of the world’s populace is: what (if anything) will bring the two conflicting states together? What is certainly apparent is that a feasible solution has yet to emerge. Indeed, what has been particularly detrimental to the reconciliation process is the largely unsuccessful intervention of foreign countries in the affairs of the Israeli and Palestinian peoples.

Historically, the ‘Holy Land’ has always been a matter of contention between the Judeo-Christian and Arab worlds. However, it would be an overstatement to suggest that the conflict between Israel and Palestine stemmed from an uneasy historical relationship between the Abrahamic religions. As Mark A. Tessler noted in 1994, the conflict is not based on primordial antagonisms, and it is only over the last century that Jews and Arabs began to view one another as enemies. In reality, it was not until the fall of the Ottoman Empire, following its defeat in the First World War, that tensions in the area emerged. More importantly, the empire’s collapse would also give nations across the world a growing incentive to influence the destiny of Palestine, largely without considering the consequences of their actions.

For centuries, the Ottomans had solidified their domination over Palestine, governing the largely Arab population without any significant resistance. Yet by the turn of the 20th century the Ottoman Empire was struggling to survive, hampered by a weakening economy, internal and external threats to its territory, and a lack of military modernisation – particularly in comparison with its European rivals. Coinciding with the burgeoning Zionist movement – which promoted the idea of a Jewish homeland in the ‘Land of Israel’ (although other areas, such as Uganda, were also considered) – the future of Palestine became increasingly uncertain. Ultimately, however, the Ottomans’ fatal decision to align with the Central Powers would be the catalyst for future tensions in the ‘Holy Land’.

The Ottoman Empire’s eventual defeat at the hands of the British during the First World War would result in Palestine passing to British control. However, although the British had signed an agreement with Arab leaders in 1915 promising the sovereignty of Palestine in exchange for support in the war effort – culminating in the Arab Revolt of 1916 – the Balfour Declaration of 1917 committed Britain to the creation of a Jewish homeland in Palestine in appeasement of the Zionist movement. Driven by its own interests in the area, Britain had thus made promises to both the Arabs and the Jews over the sovereignty of Palestine, leading each side to claim its right to the land. The falsehood of Britain’s promises would fuel tensions between Arabs and Jews in Palestine throughout the inter-war period, as Jewish migration dramatically increased and the Palestinian nationalist movement matured in response to the growing strength of the Zionist cause.

The British administration of Palestine would expire in 1948. Following the events of the Second World War – the Holocaust, and the alignment of the Arab leadership in Palestine with Nazi Germany (a decision undoubtedly influenced by Britain’s ‘betrayal’ of the Palestinian cause after the First World War) – the consensus of the recently formed United Nations was to partition Palestine into independent Jewish and Arab states. This decision was widely opposed by the Arabs, particularly because the Jewish population in Palestine was a minority at the time, and the creation of Israel would enclose a large portion of the Arab population within the new Jewish state. The post-war world, however, was adamant that the Jewish population’s call for a home be honoured – particularly in light of the atrocities Jews had suffered – and was unwilling to compromise in the face of Arab objections to the partitioning of Palestine. Thus the declaration of the Jewish state of Israel on 14 May 1948, supported by the United Nations, would sow the seeds of the Israeli-Palestinian conflict.

Throughout the last 65 years, war has repeatedly broken out in the region between Israeli and Arab populations as a result of the creation of the Jewish state, leading to mass killings and to millions of people being forcibly expelled from, or fleeing, their homes in both Israel and the Palestinian territories. In 1948 war broke out between Israel and Palestine (backed by Egypt, Jordan, Syria and Iraq); within a year Israel had defeated its Arab neighbours and in turn seized further territory within Palestine, which was later incorporated into the new Jewish state. Moreover, despite their defeat, the Arab states divided the remaining Palestinian territory amongst themselves, with the Gaza Strip and the West Bank taken by Egypt and Jordan respectively.

Seemingly, this was an opportunity for Palestine’s neighbours to share the spoils of the newly independent territory. With the United States preoccupied by its global power struggle with the Soviet Union, and the United Nations making only lacklustre attempts to halt the fighting, the future of Palestine was once again determined without legitimate consideration for the Palestinian people. This was compounded by the Tripartite Declaration of 1950, a joint statement by the United States, France and Britain that cemented the territorial settlement after the war of 1948–49 – even as the West aided Israel economically and militarily during the 1950s and the Soviet Union worked to strengthen the Egyptian army. The Suez Crisis of 1956 likewise demonstrated Egypt’s intention, under Nasser’s leadership, to maximise its own position in Palestine, further illustrating the impact of foreign intervention on the Israeli-Palestinian conflict.

Israel would be victorious once again against the Arab states in the Six Day War of 1967, capturing further territory – including the remaining Arab lands of the West Bank and the Gaza Strip, as well as the fiercely contested city of Jerusalem – and effectively incorporating the whole of Palestine in the process. The war also demonstrated the unwillingness of the Arab states to support the Palestinian cause; its ultimate repercussion was a strained relationship between the Palestinian freedom fighters and their Arab neighbours, signified during the Jordanian-Palestinian War and the Lebanese Civil War. The United States’ perceived support for Israel, especially during Johnson’s presidency, only damaged relations further between Israel and Palestine, and by extension the rest of the Arab world. Significantly, although the peace process of the 1990s – after years of further animosity in the region – gave the Palestinians greater autonomy, neither side has yet been able to co-exist in amity, and tensions continue today.

In conclusion, before we simply jump onto the ‘anti-Israeli’ or ‘Free Palestine’ bandwagon, it is important to assess the impact of foreign powers in solidifying the untenable relationship between Israel and Palestine. Although it would be foolish to ignore the obvious nationalist motivations that have exacerbated the conflict on either side, the actions of foreign nations have only intensified these internal differences as a consequence of the self-interested nature of their agendas. Ultimately, if we are to solve the enigma of the Israeli-Palestinian conflict in a way that allows both sides to realise their national aspirations, the nations of the world must avoid the narrow-mindedness that has characterised the resolution process in the past, and give fair consideration to the claims of each side before we can even begin to hope for peace.