
Andre Oboler

  • Dr Andre Oboler is CEO of the Online Hate Prevention Institute, an Australian Harm Prevention Charity working to reduce...
  • Dr Kevin Korb, Dr David Squire, Dr Simon Locke, Prof Ian Sommerville
This book highlights cyber racism as an ever-growing contemporary phenomenon. Its scope and impact reveal how the internet has escaped national governments, while its expansion is fuelling the spread of non-state actors. In response, the authors address the central question of this topic: What is to be done?

Cyber Racism and Community Resilience demonstrates how the social sciences can be marshalled to delineate, comprehend and address the issues raised by a global epidemic of hateful acts against race. Authored by an interdisciplinary team of researchers based in Australia, this book presents original data that reflects the lived, complex and often painful reality of race relations on the internet. It engages with the various means, from regulation to social activism, that can be deployed to minimise the harm such racism often causes.

This book will be of particular interest to students and academics in the fields of cybercrime, media sociology and cyber racism.
This interim report provides some background and preliminary data from OHPI’s forthcoming “Spotlight on Anti-Muslim Internet Hate Report” due to be released in March 2016. It is released for International Human Rights Day (10th of December 2015) and in light of the need to urgently address this growing problem which threatens the inclusivity of our society and the human dignity of people in our communities.

Anti-Muslim hate has accelerated sharply in 2015. It is based on a misplaced fear of Muslims in general in response to the actions of specific terrorist groups claiming to act in the name of Islam. As a result of the spread of messages of hate online, our values of multiculturalism, religious pluralism, and a fair go for all are being challenged. They are being challenged not only at public rallies, but around the office coffee machine and the water cooler. The messages of hate are being spread around the dinner table, both at home and when eating out in public. The messages of hate which spread online are not staying online. They are shared through social media, and then in person as mobile devices are used to show others memes and anti-Muslim messages. If we can tackle the problem of online hate, we can make a real difference in the spread of hate both online and through society.

This report is based on over 1,100 items of anti-Muslim hate in social media reported and categorised by the public through our FightAgainstHate.com reporting tool. The vast majority of the hate this report is based on was found on Facebook. The report indicates the volume of content by category, and how effective Facebook has been in responding to content in each category. The vast majority of this hate has not yet been removed.
This report highlights that not enough is being done to combat antisemitism in social media. The report, based on tracking over 2,000 items of antisemitism over the last 10 months, found that only 20% of the items were removed.

Traditional antisemitism made up almost half the sample and covered content such as conspiracy theories, racial slurs, and accusations such as the blood libel. The report also outlines where each type of antisemitism occurs, with content promoting violence against Jews far more likely to be found on Twitter (63% on Twitter, 23% on YouTube and 14% on Facebook), while content promoting Holocaust denial was more likely to be found on YouTube (44% YouTube, 38% Twitter, 18% Facebook).

The report highlights significant variations in the responses of the social media companies to online antisemitism. More significantly, the response by each company was found to vary depending on the nature of the antisemitism.

The best response rates came from Facebook, where content promoting violence against Jews had a 75% chance of eventually being removed. The worst case was YouTube videos containing New Antisemitism, that is, antisemitism related to the State of Israel, of which only 4% had been removed after more than 10 months.
The new report tracks the response to a number of antisemitic items on Facebook. Some of the items were included in OHPI’s previous 2012 report into Aboriginal Memes and Online Hate; others are new in 2013. The report shows how some items are removed by Facebook while others remain online, some for more than 6 months. The report examines what Facebook removes and what sort of content Facebook does not consider hate speech and refuses to remove. The findings show that Facebook does not really understand antisemitism and has trouble recognizing certain very well-known types of antisemitism.

The report shows Facebook has difficulty identifying racism based directly on Nazi propaganda, consistently refuses to recognize as hate speech pages promoting the Protocols of the Elders of Zion (the famous antisemitic forgery used to inspire mass killings), and fails to take action on new antisemitism which uses Holocaust inversion to paint Israel and Jews as Nazis. These blind spots can be added to the known difficulty Facebook has in recognizing Holocaust denial as hate speech.
Between June and August 2012 a number of racist images targeting Indigenous Australians began circulating on the internet. The racist content took the form of ‘internet memes’: multimedia messages consisting of an image that combines a picture with a text-based message. The majority of these images were created using the Meme Generator website. The images were spread through Facebook pages created for the purpose of bringing together users interested in sharing racist content, where fans were also encouraged to create and share additional images of a similar nature. This report details and discusses the spread of Aboriginal Memes and the response by governments, NGOs and the public. Examples of other forms of hate, particularly antisemitism and hate directed against those combating Aboriginal Memes (including the author), are also documented.

The report also makes recommendations to key stakeholders to help reduce online hate as one significant step in the mitigation and prevention of the emotional harm such hate can cause and the physical harm to which such hate can ultimately lead.

The report includes recommendations for key stakeholders. A dialogue is needed at both the international and national levels between social media companies, governments and civil society. OHPI hopes this report helps to facilitate that discussion.
"The report begins by introducing a taxonomy known as TEMPIS, based on technical capabilities, to the discussion of online hate in general and online antisemitism in particular. The working group recognises that existing technologies are... more
"The report begins by introducing a taxonomy known as TEMPIS, based on technical capabilities, to the discussion of online hate in general and online antisemitism in particular. The working group recognises that existing technologies are rapidly changing, and new technologies continue to emerge. We hope the taxonomy will assist the formation of policies and laws that are general enough to withstand the test of time, yet still specific enough to be practically applied.

In terms of determining what is or is not antisemitic content, the working group endorses the EUMC Working Definition of Antisemitism as a guide for those making decisions at all levels. The definition is maintained by the European Union Agency for Fundamental Rights, and was used as the basis for classification in the US State Department’s Report to Congress on Antisemitism.

In this document we broadly list areas of concern by theme and platform (Section A), details of key antisemitic online incidents which proved instructive in understanding the manifestations of online hate (Section B), and various efforts to combat online hate, including reports, conferences and projects (Section C). The document concludes with proposals and recommendations (Section D), and a list of open challenges remaining (Section E).
The Internet plays a significant role in the spread of information, and misinformation, on the campus and in student communities. Antisemitic conspiracy theories, stereotyping, imagery and motifs are shared and reused around the globe. Hateful lies not only spread, but grow. The hate is then expressed in bullying, intimidation, discriminatory policies and occasionally, violent outbursts. The campus, along with the school yard, stands on the front line between a developing culture of hate on the internet and the values of tolerance and multiculturalism that society wishes to instil in youth.
This report represents the work of the participants of the Working Group on Online Antisemitism which met at the Global Forum to Combat Antisemitism in Jerusalem, Israel, in December 2009. The report draws on the expertise, knowledge and ideas of experts from around the world, gathered before, during and after the conference.

The document was created using a Wiki. An initial draft was put together by the chairs, and working group members, together with other experts who could not attend the working group, were given editing access. Over a period of one month before the conference the participants edited the document directly through the Wiki software. Periodic versions were also released via e-mail, and suggested changes and additions were received via e-mail, to overcome the technology barrier this system created for some experts. In Jerusalem, the working group met in person during two sessions. The document was divided into thematic categories and for each category a short presentation was made, followed by a discussion, review and amendments to the document’s content on that theme. At the request of the working group the final text agreed in Jerusalem was returned to the Wiki, with an invitation for members to add additional points over the coming month. These points were then discussed with contributors, and editing of the final text was frozen while the document was available for review. This document is the final version, including contributions agreed in the month following the conference.

Divided into five sections, this report examines both problems and positive developments related to online antisemitism. The report also provides recommendations for governments, NGOs, the Internet industry, educators, parents and those who wish to take a stand against antisemitism online. Some of the recommendations are simple and immediate; others are ambitious long-term goals. As so little work has been undertaken in this area, and society is still catching up with the rapid pace of technology, this report is presented in point form and covers a wide and disparate range of issues.
In the lead-up to ANZAC day 2013 an internet troll began creating a series of pages mocking ANZAC veterans. The pages were designed to cause public outrage in both Australia and New Zealand. The Online Hate Prevention Institute (OHPI) documented these hate pages and regularly updated Facebook, law enforcement and relevant government departments. Our efforts, assisted by volunteers, many of them war veterans, ensured the pages were found and closed with minimum impact.

This report documents these pages, Facebook’s response, and the creator’s effort first to cause outrage in the community through a primary attack on something held sacred, and then to cause harm to targeted individuals through a secondary attack, when the page owner impersonated their targets in an effort to steer the public outrage against them.
This chapter provides an in-depth examination of the pathways to resilience, bringing together theoretical models of community development with resilience building and exemplars from a range of situations and countries around the world in relation to online racism. It explores strategies and responses from within communities experiencing racist harassment, as well as collaborating social activists who provide technical capacities and organisational support. In addition, it identifies how governments and corporates have responded, and their roles in contributing to the enhancement of community resilience.
This chapter provides a framework for building successful online communities that offer solidarity to their members in the face of online racism. The framework looks at the eight different types of online communities, the types of stakeholders driving or empowering them and a range of proactive and reactive strategies they can adopt to tackle cyber racism. This framework is illustrated with in-depth case studies and examples of each type of community and how they can contribute to the creation of online communities of solidarity and resistance.
This chapter lays out the broad political economy of race and the Internet. It explores the emergence of the transnational super-corporations within whose structures, and through whose products and services, racism occurs. It explores how processes of regulation form, are resisted and transform. It also looks at how the cyber world has changed since the major studies undertaken in the first decade of the century, which have been overtaken by new technologies, new questions of regulation and new environments of racialised conflict and racial empowerment.
Research Report. La Trobe LawTech comments on the Australian Government proposal for a new Online Safety Act to improve Australia's online safety regulatory framework (Consultation period: December 11, 2019 to February 19, 2020).
It is usually said that technical solutions should operate ethically, in compliance with the law and subject to good governance principles. In this position paper we face the problem of behavioural compliance and law enforcement in the case of hate and fear speech online. Law enforcement and behavioural compliance are ways of coping with the objective of stopping hate online. We contend that a combination of regulatory instruments, incentives, training, proactive self-awareness and education can be effective in creating legal ecosystems that improve the present situation.
This chapter reports on the framework of legal and regulatory channels in place to deal with cyber racism (with specific reference to Australia), to identify how this issue can be tackled more effectively. It offers insights on how to approach the issue of regulation in the future, and argues for strengthening administrative remedies over criminalisation, for strategies that promote ethical behaviour online and awareness of human rights, and for prevention over prosecution.
In the creative environment where research takes place, not everything can be improved. The creative "essence" of research must be undisturbed while "accident" (wasted effort) is reduced to a minimum. In this paper we discuss the types of knowledge at play in the research environment, introduce a new abstract model of knowledge, and use the model to explain how we should focus our effort on research students and on particular types of knowledge transfer in order to gain an overall improvement in our research processes. Just as we teach to facilitate student learning, so too can we supervise, teach and guide to facilitate better and faster researching in our academic computer science departments.
This chapter shows how the Internet has become a dangerous place for encounters with racism. It analyses how racists behave online, recognising that not all racism has been constructed by people with overt racist agendas or a conscious sense of antipathy to other races or ethnic groups. It also identifies a range of elements that constitute racist behaviour online and examines situations where racism is experienced by targets, though perpetrators may dispute that they are racist.
This thesis presents a new process-based approach to software engineering designed to meet the needs of academic computer science researchers. The core objective was to examine whether software engineering approaches could be adapted for the research environment so that they gain acceptance and enable improvement of the research process. The approach included the provision of selected process descriptors (software, tools, and guidelines) to support research. Inter-researcher support, and support between researchers and a departmental software engineer, were examined. A new working paradigm for computer science departments was simulated in our experiments; this involves the integration of a software engineer working on projects across the department. The approach encourages researchers’ reflection and conscious engagement with their research process. For researchers, our approach provides novel ways of documenting the research and research process and sharing this information in a lo...
Viral Hate: Containing Its Spread on the Internet. By Abraham H. Foxman and Christopher Wolf. Palgrave Macmillan, 2013. 256 pages. Hardback $19.
The Cyber-Racism and Community Resilience (CRaCR) project included an examination into features of online communities of resistance and solidarity. This work formed a key part of the project’s focus on resilience and produced a deeper understanding of a range of types of actors working in this space and how they might individually contribute effectively to creating resilience. The need for new synergies between different types of stakeholders and approaches was highlighted as an area of future work. This paper explores a design for that future work that builds and supports online communities of resistance and solidarity by drawing on the lessons from the earlier research and extending them. This new work both presents a model for cooperation and explains how different stakeholders can positively engage under the model in a smarter way. That is, through a system which facilitates Solidarity in Moving Against Racism Together while Enabling Resilience. This new approach draws on the st...
The application of legal doctrines to the problem of online hate speech, particularly in social media, is of growing importance to both the legal profession and society. In just a few years, social media has become a mainstream form of communication and opened mass communication to the public. It has greatly increased both the ability of individuals to communicate and the impact of those communications. In Australia, and internationally, the law is trying to catch up. In this paper we begin with a consideration of cybercrime, extracting principles and ideas from the literature. This includes the key idea of ‘online / offline consistency’, and the exception that applies when online conduct is more prevalent, or changes in nature, from its offline counterpart. Next we consider the general nature of hate speech and arguments in favour of its criminalisation, as well as the specific nature of online hate speech in social media. With this as background we consider hate speech provisions in four Australian jurisdictions, comparing them to an international standard set out in the Additional Protocol. The paper concludes by highlighting the need for international consistency, the usefulness of the Additional Protocol to achieve this, and the suitability of Western Australia’s approach to hate crime as a means to achieve this within Australia.
Virtual Israel, as represented by Google Earth, is littered with orange dots, many of which claim to represent "Palestinian localities evacuated and destroyed after the 1948 Arab-Israeli war." Thus, Israel is depicted as a state born out of colonial conquest rather than the return of a ...
Facebook pages can provide a home for racism and facilitate the creation of new virtual communities based on hate of specific minorities, or of everyone unlike themselves. Facebook pages can also serve as archives for hateful content that can be easily found, shared, and spread. Hate pages on Facebook pose a danger to the social cohesion of society and, due to their low entry barrier, they allow racism and hate to spread through society more easily. This report focuses on Facebook, on antisemitic content and on its availability in Australia. It has been compiled by the Online Hate Prevention Institute to support the work of the Executive Council of Australian Jewry, the peak Jewish community organisation in Australia. OHPI is happy to provide similar assistance to other peak bodies whose communities are subject to online attack. We hope this report is useful not only to the Jewish community, but also as an example for other communities who may feel under siege in the digital ...
"In the lead-up to ANZAC day 2013 an internet troll began creating a series of pages mocking ANZAC veterans. The pages were designed to cause public outrage in both Australia and New Zealand. The Online Hate Prevention... more
"In the lead-up to ANZAC day 2013 an internet troll began creating a series of pages mocking ANZAC veterans. The pages were designed to cause public outrage in both Australia and New Zealand. The Online Hate Prevention Institute (OHPI) documented these hate pages and regularly updated Facebook, law enforcement and relevant government departments. Our efforts, assisted by volunteers, many of them war veterans, ensured the pages were found and closed with minimum impact. This report documents these pages, Facebook’s response, and the creator’s effort first to cause outrage in the community through a primary attack on something held scared, and then to cause harm to targeted individuals through a secondary attack when the page own impersonated their targets in an effort to steer to public outrage against them."
Research Interests:
The Super Iterator pattern, like the standard Iterator pattern, traverses an unknown data structure without exposing that structure. With the standard Iterator pattern, clients must create a different iterator for each new structure, and... more
The Super Iterator pattern, like the standard Iterator pattern, traverses an unknown data structure without exposing that structure. With the standard Iterator pattern, clients must create a different iterator for each new structure, and the object returned must be of the specific type stored in the structure, even when they share a common super class. With the Super Iterator pattern, the object returned is of the common super class, and the iterator itself need not be altered when adding a new subtype with custom data structures. The client, however, must change two lines of code to load and instantiate the new subclass.
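The abstract describes the pattern's behaviour but gives no code. The following is a minimal, hypothetical Java sketch of how such a pattern could be structured: the class and interface names (Item, ItemSource, SuperIterator) are illustrative assumptions, not taken from the paper, and the client's "two lines" are shown here as plain instantiation and registration rather than whatever loading mechanism the paper actually uses.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;

// Common super class returned to the client (illustrative name).
abstract class Item {
    public abstract String label();
}

// Each subtype keeps its elements in its own custom data structure and
// exposes them only through an iterator typed as the common super class.
interface ItemSource {
    Iterator<? extends Item> items();
}

class ListItem extends Item {
    private final String name;
    ListItem(String name) { this.name = name; }
    public String label() { return "list:" + name; }
}

class ListSource implements ItemSource {
    // Custom structure for this subtype: a plain ArrayList.
    private final List<ListItem> data =
            new ArrayList<>(List.of(new ListItem("a"), new ListItem("b")));
    public Iterator<? extends Item> items() { return data.iterator(); }
}

class QueueItem extends Item {
    private final String name;
    QueueItem(String name) { this.name = name; }
    public String label() { return "queue:" + name; }
}

class QueueSource implements ItemSource {
    // A different custom structure: a deque.
    private final ArrayDeque<QueueItem> data =
            new ArrayDeque<>(List.of(new QueueItem("x")));
    public Iterator<? extends Item> items() { return data.iterator(); }
}

// The "super iterator": chains the per-subtype iterators and yields
// elements typed as the common super class. It does not change when a
// new subtype with its own data structure is added.
class SuperIterator implements Iterator<Item> {
    private final Iterator<ItemSource> sources;
    private Iterator<? extends Item> current = Collections.emptyIterator();

    SuperIterator(List<ItemSource> sourceList) {
        this.sources = sourceList.iterator();
    }

    public boolean hasNext() {
        while (!current.hasNext() && sources.hasNext()) {
            current = sources.next().items();
        }
        return current.hasNext();
    }

    public Item next() {
        if (!hasNext()) throw new NoSuchElementException();
        return current.next();
    }
}

public class SuperIteratorDemo {
    public static void main(String[] args) {
        List<ItemSource> sources = new ArrayList<>();
        // The only client changes needed when a new subtype is introduced:
        // instantiate it and register it with the source list.
        sources.add(new ListSource());
        sources.add(new QueueSource());

        Iterator<Item> it = new SuperIterator(sources);
        while (it.hasNext()) {
            System.out.println(it.next().label());
        }
    }
}
```

In this sketch the client's two registration lines are the sources.add(...) calls; the paper may instead use reflection or configuration to load the new subclass, which this example does not attempt to reproduce.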
Social networking Web sites are amassing vast quantities of data and computational social science is providing tools to process this data. The combination of these two factors has significant implications for individuals and society. With announcements of growing data aggregation by both Google and Facebook, the need for consideration of these issues is becoming urgent. Just as Web 2.0 platforms put publishing in the hands of the masses, without adequate safeguards, computational social science may make surveillance, profiling, and targeting overly accessible. The academic study of computational social science explains the field as an interdisciplinary investigation of the social dynamics of society with the aid of advanced computational systems. Such investigation can operate at the macro level of global attitudes and trends, down to the personal level of an individual’s psychology. This paper uses the lenses of computational social science to consider the uses and dangers that may result from the data aggregation social media companies are pursuing. We also consider the role ethics and regulation may play in protecting the public.
The first part of this chapter uses the Australian example to demonstrate the development of contrasting national identity narratives and their interpretation on social media. This includes contextualising the narratives in terms of the historical and political development of Australia as a multicultural nation. The second part of this chapter uses examples from Australia and around the world to explore the discursive strategies employed by exponents of cyber racism to promote their version of national identity narratives. These complementary approaches aim to give insight into the dynamic through which cyber racism can be legitimised through narratives about national identity.
The attacks in January 2015 in France on the satirical publication Charlie Hebdo, a police officer, and a kosher supermarket, have sparked significant discussion. That discussion touches on issues of antisemitism, freedom of speech and expression, a free press, freedom from persecution, human dignity, bigotry, religion, blasphemy, self-censorship, and government censorship. Social media is enabling the masses to come together, to mourn, and to debate the placement of the often fine line between freedom of expression and respect for human dignity. This report highlights the antisemitic aspects of the attacks in France and the need for a greater response, by both governments and society, to a problem which serves as a significant predictor of a breakdown of civil rights. Rising antisemitism in Europe, particularly within the Muslim community, has not been sufficiently tackled in recent years and this puts society as a whole at risk. The report explores the different approaches to the ...
The Antisemitic Meme of the Jew is a cartoon picture depicting a negative stereotype of a Jewish man with a black beard, long hooked nose, a hunched back, crooked teeth, and hands being wrung in glee. The image was created by a white supremacist cartoonist and has been online in neo-Nazi circles since at least 2004. This report highlights an effort to give the image public acceptance as a part of mainstream online culture. This would increase the acceptability of using the image and significantly contribute to further normalising antisemitism in online society. It would help take the racist portrayal of Jews from the neo-Nazi fringe into the mainstream. This report provides recommendations to help prevent that occurring. The effort to gain public acceptance for the image took place in three parts. The first push to gain acceptance for the image took the form of an effort to have the image entered as a recognised meme on the “know your meme” website, an authority on the online cultur...
Our initial intervention included the introduction of a documentation style for software coding; this was largely about the approach to commenting and the specific type of data that should be captured for a research project. We also introduced the use of a software tool to extract ...
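The abstract does not specify what the documentation style looked like. Purely as an illustration, a structured comment convention for research code (which a tool could later extract) might resemble the following hypothetical Java sketch; the @research.* tags and class name are invented for this example and are not the actual convention introduced in the intervention.

```java
/**
 * Exploratory experiment: comparison of two text-filtering approaches.
 *
 * The tags below illustrate the kind of research metadata a documentation
 * style (and an extraction tool) could capture; they are hypothetical.
 *
 * @research.question   Does pre-filtering improve retrieval precision?
 * @research.dataset    corpus-subset-A (local copy, not redistributed)
 * @research.parameters threshold=0.35, windowSize=5
 * @research.status     exploratory; results not yet validated
 */
public final class PreFilterExperiment {

    /** Entry point: a placeholder for the documented experiment run. */
    public static void main(String[] args) {
        System.out.println("Run the documented experiment here.");
    }
}
```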
In May 2009, Facebook went into damage control in response to the media interest in Holocaust-denial groups it hosted. This occurred six months after Facebook was notified that such groups not only breached its Terms of Service but were illegal under national laws banning Holocaust denial in several countries.
Between receiving the complaints and responding to the media interest, Facebook rolled out new terms of use. These removed the explicit ban on content that is “harmful,” “defamatory,” “abusive,” “inflammatory,” “vulgar,” “obscene,” “fraudulent,” “invasive of privacy or publicity rights,” or “racially, ethnically or otherwise objectionable.” The reference to local, regional, and national laws also vanished.
Facebook’s eventual response, defending the posting of Holocaust denial, highlighted a dramatic change in direction for a company that once sought to provide a “safe place on the internet” and stated that “certain kinds of speech simply do not belong in a community like Facebook.” Facebook has, through ignorance, created an anti-Semitic policy platform where the only explicitly allowed hate is that, within certain parameters, directed against Jews.
Holocaust-denial groups should be removed from Facebook because Holocaust denial is a form of anti-Semitism. Such content represents a clear expression of hate and is therefore inconsistent with basic standards of decency and even Facebook’s new Terms of Service. Holocaust denial also constitutes a threat to the safety of the Jewish community. Such a ban would not be inconsistent with First Amendment rights in the United States, and would be wholly consistent with hate speech bans that exist in much of Europe.
Racial hate propaganda is unlawful in Australia, and this extends to non-private online communications. This may create liabilities for technology companies.

International discussions have highlighted the need for both national and international engagement on the problem of online racism. More active government involvement is inevitable in the future and poses a manageable risk to technology companies.

The Copyright Act 1968 (Cth) provides a model for technology-based remedies to unlawful acts that take place online. This could serve as a template for remedies to other types of unlawful acts, including the spread of online hate propaganda.
The Attorney General’s announcement of a possible extension of “safe harbour” provisions in the Copyright Act to a larger range of service providers raises the question of similar provisions for other unlawful activity facilitated by these providers.

Lawyers advising clients who provide non-private online spaces should consider a range of legal developments in other areas, and should consider how similar provisions in the area of online hate may affect their clients. Engineering solutions to limit risk are possible and could be integrated into future development if considered pre-emptively.
This paper introduces criticism elimination, a type of information removal leading to a framing effect that impairs Wikipedia’s delivery of a Neutral Point of View (NPOV) and ultimately facilitates a new form of gatekeeping with political science and information technology implications. This paper demonstrates a systematic use of criticism elimination and categorizes the editors responsible into four types. We show some types use criticism elimination to dominate and manipulate articles to advocate political and ideological agendas. We suggest mitigation approaches to criticism elimination. The research is interdisciplinary and based on empirical analysis of the public edit histories.
