Commercial Sexual Exploitation and the Sexual Abuse-to-Prison Pipeline: A Deeper Look

Introduction and Literature Review

Childhood sexual abuse (CSA) is widely recognized as one of the most deleterious and injurious forms of abuse. CSA has a significant impact on an individual's psychological, social, emotional, and spiritual development, and there is a substantial literature examining the risk factors, symptomatology, and interventions for this population (Briggs & Joyce, 1997; Fergusson et al., 2008; Rumstein-McKean & Hunsley, 2001). One emerging outcome of CSA is the commercial sexual exploitation of children (CSEC). Research in this area is limited but growing as political and societal views on the issue shift. This paper will examine the broad topic of childhood sexual abuse, with a focus on commercial sexual exploitation of children as an outcome, and the gaps in research in the area of CSEC.

I. Outcomes for Female Survivors of Childhood Sexual Abuse

The Encyclopedia of Psychology defines sexual abuse as, “unwanted sexual activity, with perpetrators using force, making threats or taking advantage of victims not able to give consent. Most victims and perpetrators know each other. Immediate reactions to sexual abuse include shock, fear or disbelief. Long-term symptoms include anxiety, fear or post-traumatic stress disorder” (Encyclopedia of Psychology, as cited in “Sexual abuse,” n.d.). Sexual abuse that occurs during childhood is often identified as a particularly harmful type of abuse, which can impact a child’s psychological, emotional and cognitive development.


Childhood sexual abuse can severely impact an individual's development and functioning in a myriad of ways, and has long been linked to a number of mental health concerns. A seminal longitudinal study, following a cohort of 1,000 young adults in New Zealand, found that those who had experienced childhood sexual abuse, and particularly very severe abuse, were at higher risk for a number of psychiatric disorders than their peers. Those who had experienced CSA were 2.7-11.9 times more likely to present with at least one psychiatric disorder, including mood, anxiety, conduct, and substance use disorders (Fergusson et al., 2008, p. 1372). A second New Zealand study sampled 73 women receiving treatment from a sexual abuse counseling program, all of whom had experienced some form of CSA. A majority of these participants exhibited symptomatology consistent with post-traumatic stress disorder, and higher rates and intensity of reported symptomatology correlated with higher levels of general psychopathology. The extent and severity of the abuse, particularly the involvement of intercourse, had a strong correlation with the severity of symptoms (Briggs & Joyce, 1997, p. 579).

In addition to mental health concerns, childhood sexual abuse also deeply impacts psychosocial development and functioning. Rumstein-McKean and Hunsley (2001) found a link between CSA and a number of interpersonal deficits, including challenges in relationship intimacy, sexual functioning, marital functioning, and attachment. CSA is also linked to intergenerational attachment challenges for survivors who become mothers. One study found that mothers who were survivors of CSA were more likely to experience insecure and anxious forms of attachment, though it should be noted that low socioeconomic status was a potential confound for anxious attachment. Additionally, children of mothers who experienced CSA were more likely to exhibit acute self-protective strategies, including coercive anger or desire for comfort (Kwako, Noll, Putnam, & Trickett, 2010).

One study examined the effects of CSA on educational achievement, measured by meeting high school academic requirements, graduating high school, pursuing higher education, and completing a degree. CSA was significantly associated with failure to reach these educational milestones, although the analyses adjusted for covariates including socioeconomic status, parental adjustment, family functioning, and individual characteristics (Boden, Horwood, & Fergusson, 2007).

Despite this largely negative prognosis, survivors who recover tend to share a common experience. Noll (2008) examined the distinguishing features of childhood sexual abuse on development, as compared to other forms of abuse, and found that one unique characteristic of CSA survivors was their resilience in leveraging social, economic, and personal resources as a path toward recovery (Noll, 2008).

II. CSEC as an outcome of CSA

An emerging outcome linked to CSA is the commercial sexual exploitation of children (CSEC). A special report by the U.S. Department of Justice defines CSEC as the “sexual abuse of a minor for economic gain. It involves physical abuse, pornography, prostitution, and the smuggling of children for unlawful purposes”. The DOJ identifies three types of organization of CSEC: local exploitation by one or a few persons, regional networks that involve the exploitation of several children and additional criminal activity, and national or international sex crime networks (U.S. Department of Justice, 2008).

An in-depth collaborative report by the Human Rights Project for Girls, the Georgetown Law Center on Poverty and Inequality, and the Ms. Foundation for Women identifies the link between sexual abuse among adolescent girls and their involvement in the juvenile and criminal justice systems. Those particularly at risk are girls of color, especially African American and Native American girls, and LGBT youth; both populations are overrepresented in the juvenile justice system. Even more troubling, girls experiencing CSEC and trafficking are often criminalized as “prostitutes” instead of being identified as members of vulnerable and exploited populations. In prison settings ill-equipped to address complex trauma, these girls are retraumatized, leading to a cycle of “victimization-to-imprisonment for marginalized girls”, also identified as the “sexual abuse-to-prison pipeline” (Saar, Epstein, Rosenthal, & Vafa, 2015).

This terminology is an outgrowth of the term “school-to-prison pipeline”, which has traditionally been used to describe the relationship between “zero tolerance” behavioral policies in schools and the systemic and systematic incarceration of low-income boys of color (McCarter, 2017). Further research has identified distinct circumstances surrounding the entry of other marginalized youth into the juvenile justice system. Minority youth, LGBT youth, and youth with disabilities are disproportionately represented in the juvenile system (McCarter, 2017). Many LGBT youth are over-monitored and policed, especially when they do not conform to gender norms; they also receive unequal punishment and lack social and emotional support, which further contributes to victimization (Snapp, Hoenig, Fields, & Russell, 2015). For girls in particular, entrance into the juvenile justice system is linked to childhood sexual abuse and commercial sexual exploitation (Saar et al., 2015).

III. Effective Intervention for CSA and CSEC

The effects of childhood sexual abuse and the commercial sexual exploitation of children are complex, and the effectiveness of interventions and treatments is therefore varied. One form of treatment found to be effective is group therapy. One study examined 14 female prisoners who had experienced some form of sexual violence, using an intervention focused on exposure-based therapeutic techniques. The treatment was effective in decreasing clinically significant depressive and generalized anxiety symptoms. Many women also self-reported recovery, defined as a post-treatment reduction in symptoms to below the clinical threshold for depression (Karlsson, Bridges, Bell, & Petretic, 2014).

Other treatments found to be effective include interventions that focus on cognitive processes, such as cognitive-behavioral therapy and other forms of psychotherapy (Saar et al., 2015). However, there is limited research on interventions that address the particular concerns of CSEC.

IV. Future areas of study

Childhood sexual abuse is a topic that has been thoroughly and systematically researched; however, there are many gaps in research in the area of the commercial sexual exploitation of children. One particular need is to identify treatment options shown to reduce the clinical symptoms uniquely related to CSEC. One study of CSEC survivors in residential and group home settings was the first to systematically examine the particular needs of CSEC survivors compared to other youth engaged with the child welfare and juvenile justice systems. This study identified state policy as one of the barriers to identifying effective and generalizable interventions for CSEC: the extent of services and the manner in which CSEC is addressed largely depend on whether a state protects or criminalizes this vulnerable population. Several states still have no legal protections, such as “safe harbor” laws, for this population (Hickle & Roe-Sepowitz, 2018).

Broadly, there are gaps in research systematically examining the risk factors, symptoms and effects, and interventions for this population. Though research in this area has grown within the last decade, many related issues remain unexamined. This is due, in part, to public and societal perceptions of, and limited knowledge about, the commercial sexual exploitation of children. Social work research should not, consciously or otherwise, be constrained by the same stigmatizing notions.

Purpose of the Study

The purpose of the study is to complete an exploratory analysis of the unique risk factors, symptomatology, and interventions available for young adult survivors of commercial sexual exploitation, ages 18-30, who retrospectively examine their experience. The study will focus on the perceptions and narratives of survivors. Intersecting aspects of race, socioeconomic status, and family functioning will also be considered.

References

Boden, J. M., Horwood, L. J., & Fergusson, D. M. (2007). Exposure to childhood sexual and physical abuse and subsequent educational achievement outcomes. Child Abuse & Neglect, 31(10), 1101-1114. https://doi.org/10.1016/j.chiabu.2007.03.022

Briggs, L., & Joyce, P. R. (1997). What determines post-traumatic stress disorder symptomatology for survivors of childhood sexual abuse?. Child Abuse & Neglect, 21(6), 575-582. https://doi.org/10.1016/S0145-2134(97)00014-8

Department of Justice. (2008). Commercial sexual exploitation of children: What do we know and what do we do about it? Washington, D.C.: U.S. Department of Justice, Office of Justice Programs. Retrieved from https://babel.hathitrust.org/cgi/pt?id=pur1.32754075508089;view=1up;seq=1

Fergusson, D. M., Horwood, L. J., & Lynskey, M. T. (1996). Childhood sexual abuse and psychiatric disorder in young adulthood: II. Psychiatric outcomes of childhood sexual abuse. Journal of the American Academy of Child and Adolescent Psychiatry, 35(10), 1355-1364. https://doi.org/10.1097/00004583-199610000-00023

Hickle, K., & Roe-Sepowitz, D. (2018). Adversity and intervention needs among girls in residential care with experiences of commercial sexual exploitation. Children and Youth Services Review, 93, 17-23. https://doi.org/10.1016/j.childyouth.2018.06.043

Karlsson, M. E., Bridges, A. J., Bell, J., & Petretic, P. (2014). Sexual violence therapy group in a women’s correctional facility: A preliminary evaluation. Journal of Traumatic Stress, 27, 361-364.

Kwako, L. E., Noll, J. G., Putnam, F. W., & Trickett, P. K. (2010). Childhood sexual abuse and attachment: An intergenerational perspective. Clinical Child Psychology and Psychiatry, 15(3), 407-422. https://doi.org/10.1177/1359104510367590

McCarter, S. (2017). The school-to-prison pipeline: A primer for social workers. Social Work, 62(1), 53-61. https://doi.org/10.1093/sw/sww078

Rumstein-McKean, O., & Hunsley, J. (2001). Interpersonal and family functioning of female survivors of childhood sexual abuse. Clinical Psychology Review, 21(5), 471-490. https://doi.org/10.1016/S0272-7358(99)00069-0

Saar, M. S., Epstein, R., Rosenthal, L., & Vafa, Y. (2015). The sexual abuse to prison pipeline: The girls’ story. Retrieved from Georgetown Law Center on Poverty and Inequality: http://rights4girls.org/wp-content/uploads/r4g/2015/02/2015_COP_sexual-abuse_layout_web-1.pdf

Sexual abuse. (n.d.). Retrieved September 29, 2018, from https://www.apa.org/topics/sexual-abuse/index.aspx

Snapp, S. D., Hoenig, J. M., Fields, A., & Russell, S. T. (2015). Messy, butch, and queer: LGBTQ youth and the school-to-prison pipeline. Journal of Adolescent Research, 30(1), 57-82. https://doi.org/10.1177/0743558414557625

 

CI and CD in AWS CodePipeline

 

Abstract

Software firms are increasingly using and contributing to DevOps software, and many companies now use pipelines for better efficiency. To do this they need to understand and work with pipeline resources and processes. This paper will investigate continuous integration (CI) and continuous delivery (CD) in AWS CodePipeline and the impact they have on building, testing and deploying. AWS CodePipeline will connect to a GitHub repo. AWS CodePipeline allows for automated rapid delivery, easy integration and a configurable workflow, which allows integration with third-party tools like GitHub, Jenkins and Docker. AWS CodePipeline uses IAM to manage who can make changes to a pipeline. Parallel execution can be used in AWS CodePipeline to model a build, test, and deployment that run in parallel to increase workflow speeds.

General Terms

Pipeline, CI and CD

Keywords

AWS CodePipeline, Continuous Integration and Continuous Delivery

1.       Introduction

This paper is an introduction to using Amazon Web Services (AWS) in DevOps. AWS resources can decrease time to market and reduce costs for companies. The paper will discuss a specific tool called AWS CodePipeline, a relatively new service released in July 2015. AWS CodePipeline is a continuous integration, continuous delivery and continuous deployment service. It automatically builds, tests and deploys applications and services into the cloud, which reduces the risk of manual errors.


In today's world, applications must evolve quickly for customers. Improving and releasing software at a fast pace needs to be at the core of every business, making time to market and agility essential to maintaining a competitive advantage. Companies that can rapidly deliver updates to applications and services can innovate and adapt to changing markets faster, which gives better results to the business and its customers.

With AWS CodePipeline, companies can deliver value to their customers quickly and safely.

 

2.       Continuous Integration (CI)

“Continuous Integration doesn’t get rid of bugs, but it does make them dramatically easier to find and remove.” [1] Continuous integration is a widely established coding philosophy and set of practices that drive development teams to create small changes and to check code into repositories frequently. Most applications require developing code across different platforms, systems and tools, so development teams need a mechanism to integrate and validate their changes for continuous integration to work [2].

The goal of continuous integration is to establish a consistent and automated way to code, build, and test applications. With consistency and efficiency in the integration process, teams are more likely to commit code changes more frequently, which leads to better collaboration and software quality [2].

Continuous integration is based on several key principles and practices (a minimal build-automation sketch follows the list):

Maintain a single repository

Automate the build

Make the build self-testing

Every commit should build on an integration machine

Keep the build fast

Test in a clone of the production environment

Easy for anyone to get the latest executable version

Everyone can see what’s happening in the repository

Automate deployment
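
The following is a minimal sketch of the kind of script a CI service runs on every commit, illustrating "automate the build" and "make the build self-testing" from the list above. The commands it invokes (a hypothetical build.py and pytest) are placeholders for whatever a project's real build and test steps are; this is not a prescribed setup.

```python
import subprocess
import sys

def run(step: str, command: list[str]) -> None:
    """Run one build step and fail the whole build if it fails."""
    print(f"--- {step} ---")
    result = subprocess.run(command)
    if result.returncode != 0:
        print(f"Build failed at step: {step}")
        sys.exit(result.returncode)   # a red build stops the pipeline here

run("build", [sys.executable, "build.py"])       # automate the build (placeholder script)
run("test", [sys.executable, "-m", "pytest"])    # make the build self-testing

print("Build green: safe to integrate this commit.")
```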

Some of the costs of using continuous integration are the time investment and the quality and quantity of tests. Building an automated test suite requires a lot of work. If a developer is on a small team, they may not have the time to invest in this sort of infrastructure. Tests may be unstable due to timing issues or errors, and the tests themselves may be hard to write [3].

While setting up a continuous integration system might be daunting, it ultimately reduces risk, cost, and time spent on rework. If a developer introduces a bug and detects it quickly, it is far easier to get rid of [3][5]: since only a small bit of the system has changed, you don't have to look back far and only a small number of changes are lost. This results in much more frequent automated testing and releases.

Continuous integration also improves collaboration and quality, and it does this in many ways with many different tools:

Slack: a cloud-based set of proprietary team collaboration tools and services that can link to GitHub.

Asana: a web and mobile application designed to help teams organize, track, and manage their work and improve collaboration. Asana simplifies team-based work management and also links to GitHub.

3.       Continuous Delivery (CD)

Continuous delivery picks up where continuous integration ends, automating the delivery of applications to selected infrastructure environments. Continuous delivery (CD) is a software engineering approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time. It aims at building, testing, and releasing software with greater speed and frequency. CD lets teams quickly iterate on feedback and get new features to users faster.

A typical CD pipeline includes many of these steps [2] (a minimal monitoring sketch follows the list):

Pulling code from version control and executing a build.

Executing required infrastructure steps that are automated as code to stand up or tear down cloud infrastructure.

Moving code to the target compute environment.

Managing the environment variables and configuring them for the target environment.

Pushing application components to their appropriate services, such as web servers, API services, and database services.

Executing any steps required to restart services or call service endpoints that are needed for new code pushes.

Executing continuous tests and rolling back environments if tests fail.

Providing log data and alerts on the state of the delivery.
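
As a small illustration of the last step, the sketch below starts a release and reports the state of each stage using the boto3 CodePipeline client. It assumes a pipeline named "demo-pipeline" already exists; the name is a placeholder, not one from this paper's setup.

```python
import boto3

codepipeline = boto3.client("codepipeline")

# Kick off a release of the latest revision through the pipeline.
execution = codepipeline.start_pipeline_execution(name="demo-pipeline")
print("Started execution:", execution["pipelineExecutionId"])

# Report the delivery state stage by stage (the "log data and alerts" step above).
state = codepipeline.get_pipeline_state(name="demo-pipeline")
for stage in state["stageStates"]:
    latest = stage.get("latestExecution", {})
    print(f"{stage['stageName']}: {latest.get('status', 'NotStarted')}")
```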

Continuous delivery has four main benefits:

Continuous delivery automates the software release process by letting teams automatically build, test, and prepare code changes for release to production, so that software delivery is more frequent and efficient.

Continuous delivery helps teams be more productive by freeing developers from manual tasks and encouraging behaviors that help reduce the number of errors and bugs deployed to customers.

Continuous delivery allows teams to more easily perform additional types of tests on your code because the entire process has been automated.

Continuous delivery helps teams deliver updates to customers faster and more frequently. When continuous delivery is implemented properly, you will always have a deployment-ready build artifact that has passed through a standardized test process.

4.       AWS CodePipeline

AWS CodePipeline is an automated continuous integration and continuous delivery service that enables developers to model, visualize, and automate the steps required to release their software. With AWS CodePipeline, teams model the full release process for building code, deploying to pre-production environments, testing the application and releasing it to production. AWS CodePipeline then builds, tests, and deploys the application according to the defined workflow every time there is a code change. Partner tools and custom tools can be integrated into any stage of the release process to form an end-to-end continuous delivery solution.

4.1 Why Should Teams Use AWS CodePipeline?

By automating the build, test, and release processes, AWS CodePipeline increases the speed and quality of software updates by running all new changes through a consistent set of quality checks. It also allows teams to develop their continuous integration and continuous delivery skills.

4.2 Pipeline Concepts

          

Figure 1: AWS CodePipeline concepts

A pipeline is a workflow that describes how software changes go through a release process using continuous delivery. The workflow is defined as a sequence of stages and actions.

A revision is a change made to the source location in your pipeline. It can include source code, build output, configuration, or data. A pipeline can have multiple revisions flowing through it at the same time.

A stage is a group of one or more actions. A pipeline can have two or more stages, and all stages must have unique names.

An action is a task performed on a revision. Pipeline actions occur in a specified order, in serial or in parallel. The first stage may only contain source actions.

The stages in a pipeline are connected by transitions, represented by arrows in the AWS CodePipeline console. Once the actions in a stage complete, the revision is automatically sent on to the next stage, as indicated by the transition arrow. Transitions can be disabled or enabled between stages.
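
Transitions can also be toggled from the API. The sketch below assumes a hypothetical pipeline named "demo-pipeline" with a stage called "Production"; both names are illustrative, not part of the setup described later in this paper.

```python
import boto3

codepipeline = boto3.client("codepipeline")

# Stop revisions from flowing into the Production stage (e.g. during a release freeze).
codepipeline.disable_stage_transition(
    pipelineName="demo-pipeline",
    stageName="Production",
    transitionType="Inbound",
    reason="Change freeze for release review",
)

# Re-enable the transition so approved revisions move on automatically again.
codepipeline.enable_stage_transition(
    pipelineName="demo-pipeline",
    stageName="Production",
    transitionType="Inbound",
)
```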

The pipeline structure has the following requirements (a minimal definition sketch follows at the end of this section):

A pipeline must contain at least two stages. The first stage of a pipeline must contain at least one source action and can only contain source actions.

Only the first stage of a pipeline may contain source actions.

At least one stage in each pipeline must contain an action that is not a source action.

All stage names within a pipeline must be unique.

Stage names cannot be edited within the AWS Code Pipeline console. If you edit a stage name by using the AWS CLI, and the stage contains an action with one or more secret parameters (such as an OAuth token), the value of those secret parameters will not be preserved. You must manually type the value of the parameters (which are masked by four asterisks in the JSON returned by the AWS CLI) and include them in the JSON structure.

The pipeline metadata fields are distinct from the pipeline structure and cannot be edited. When you update a pipeline, the date in the updated metadata field changes automatically.

When you edit or update a pipeline, the pipeline name cannot be changed.
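
To make the structure requirements above concrete, here is a minimal pipeline definition with two uniquely named stages, a first stage containing only a source action, and a second non-source stage. All names, the S3 bucket, the role ARN, the GitHub details and the OAuth token are placeholders, not values from this paper.

```python
import boto3

codepipeline = boto3.client("codepipeline")

codepipeline.create_pipeline(
    pipeline={
        "name": "demo-pipeline",
        "roleArn": "arn:aws:iam::123456789012:role/AWS-CodePipeline-Service",
        "artifactStore": {"type": "S3", "location": "demo-pipeline-artifacts"},
        "stages": [
            {
                "name": "Source",                      # first stage: source actions only
                "actions": [{
                    "name": "GitHubSource",
                    "actionTypeId": {"category": "Source", "owner": "ThirdParty",
                                     "provider": "GitHub", "version": "1"},
                    "configuration": {"Owner": "example", "Repo": "demo-repo",
                                      "Branch": "master", "OAuthToken": "<token>"},
                    "outputArtifacts": [{"name": "SourceOutput"}],
                }],
            },
            {
                "name": "Build",                       # at least one non-source stage
                "actions": [{
                    "name": "CodeBuild",
                    "actionTypeId": {"category": "Build", "owner": "AWS",
                                     "provider": "CodeBuild", "version": "1"},
                    "configuration": {"ProjectName": "demo-ci-project"},
                    "inputArtifacts": [{"name": "SourceOutput"}],
                    "outputArtifacts": [{"name": "BuildOutput"}],
                }],
            },
        ],
    }
)
```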

5.       Setting Up AWS CodePipeline

In 2017 AWS announced support for Amazon Elastic Container Service (ECS) in AWS CodePipeline. This support makes it easier to create a continuous delivery pipeline for container-based applications and microservices. Amazon ECS and AWS CodeBuild will be used in the creation of a pipeline; AWS CodePipeline and CodeBuild can then be integrated with ECS to automate workflows in just a few steps.

To create a pipeline in AWS you first need to register; AWS CodePipeline is free for the first month. After creating an AWS account, you can go straight to AWS CodePipeline and create a pipeline. The console walks you through six steps before a pipeline is created, to make sure AWS has everything it needs. These six steps are:

Step 1: Create the name of the code pipeline.

Step 2: Create a source location. In this instance GitHub was chosen. When GitHub is selected, you must connect AWS CodePipeline with GitHub and declare which branch you wish to use. In this case, master was chosen.

Step 3: In step 3, we must configure the build; in this instance AWS CodeBuild was selected as the build provider. AWS CodeBuild is a fully managed continuous integration build service that compiles source code, runs tests, and produces software packages that are ready to deploy. With CodeBuild, you don't need to provision, manage, and scale your own build servers; CodeBuild scales continuously and processes multiple builds concurrently, so your builds are not left waiting in a queue. Jenkins could also have been selected, but I wanted to learn more about AWS tools.

After the build tool was selected, we then had to configure our project by creating a new build and naming our project in the pipeline with a description of the project.

After the build configuration, we move to the environment of the project. The environment image was managed by AWS CodeBuild.

Figure 2: setting environment image

Ubuntu was selected as the operating system and Docker was selected as the runtime environment. Below is a quick summary of the environment settings.

Build provider: AWS CodeBuild

Project configuration: production build

Environment image: image managed by AWS CodeBuild

Operating System: Ubuntu

Runtime: Docker

Version: aws/codebuild/docker:1.12.1

Build specification: buildspec.yml in source code

After the environment settings there were two other settings, Cache and VPC, which were left at their defaults as this was my first pipeline. All changes were saved and we moved to step 4. A scripted sketch of the step 3 configuration follows.
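
The following is a sketch of roughly the same step 3 configuration expressed as a boto3 call; the console performs an equivalent request. The project name, repository URL and service role ARN are hypothetical placeholders rather than the exact values used here.

```python
import boto3

codebuild = boto3.client("codebuild")

codebuild.create_project(
    name="production-build",                                      # project configuration
    description="Build stage for the demo pipeline",
    source={"type": "GITHUB",
            "location": "https://github.com/example/demo-repo"},  # buildspec.yml read from source
    artifacts={"type": "NO_ARTIFACTS"},
    environment={
        "type": "LINUX_CONTAINER",                                # Ubuntu image managed by CodeBuild
        "image": "aws/codebuild/docker:1.12.1",                   # Docker runtime, version as in step 3
        "computeType": "BUILD_GENERAL1_SMALL",
        "privilegedMode": True,                                   # needed to run Docker builds
    },
    serviceRole="arn:aws:iam::123456789012:role/demo-codebuild-role",
)
```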

 

Step 4: In step 4, we need to configure the deploy settings; this is where the built code will be placed. Amazon ECS was chosen as the deployment provider. Amazon ECS stands for Amazon Elastic Container Service, a highly scalable, high-performance container management service that supports Docker containers and allows us to easily run and scale containerized applications on AWS [4].

After selecting our provider, a set of expanded options appears, and we must choose a cluster name, service name and image filename. A cluster is a grouping of container instances, and multiple clusters can be created [6]. Before entering a cluster name we had to create a cluster on Amazon ECS. I chose the Windows and Networking template.

Figure 3: Selecting a cluster template in step 4

After choosing the Linux and Networking template I created my own cluster named "ecs-demo". A cluster is a group of Amazon EC2 virtual machines.

After choosing a cluster name I then created a service for the service name and an image. A service schedules and maintains the containers that need to run, and is created in Amazon ECS. Before creating the service, a task definition and a container had to be created; when creating the container, a name, image, memory limit and ports were all specified. A scripted sketch of these ECS resources follows the summary below.

Deployment provider: Amazon ECS

Cluster name: ecs-demo

Service name: nginx

Image filename: nginxlatest.json
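
Below is a sketch of the ECS resources described above: the cluster, a task definition with an nginx container (name, image, memory limit and port), and the service that keeps it running. The task definition family, memory limit and port values are illustrative assumptions, not the exact ones used in this attempt.

```python
import boto3

ecs = boto3.client("ecs")

# Cluster from the summary above.
ecs.create_cluster(clusterName="ecs-demo")

# Task definition with a single nginx container definition.
ecs.register_task_definition(
    family="nginx-demo",
    containerDefinitions=[{
        "name": "nginx",
        "image": "nginx:latest",
        "memory": 128,                                    # memory limit (MiB), illustrative
        "portMappings": [{"containerPort": 80, "hostPort": 80}],
    }],
)

# Service that schedules the task on the cluster.
ecs.create_service(
    cluster="ecs-demo",
    serviceName="nginx",
    taskDefinition="nginx-demo",
    desiredCount=1,                                       # keep one copy of the task running
)
```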

 

Step 5: In step 5, an AWS service role is created using IAM. IAM (Identity and Access Management) helps control access to AWS resources; this role allows the pipeline to access resources in our Amazon account. Clicking "Create role" created a role named AWS-CodePipeline-Service. A scripted equivalent is sketched below.
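
This is a minimal sketch of what the console does in step 5: creating a role that CodePipeline is trusted to assume. The trust policy is the standard one for the CodePipeline service; attaching the permission policies themselves is left to the console wizard (see also section 6).

```python
import json
import boto3

iam = boto3.client("iam")

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "codepipeline.amazonaws.com"},  # CodePipeline may assume this role
        "Action": "sts:AssumeRole",
    }],
}

iam.create_role(
    RoleName="AWS-CodePipeline-Service",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
```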

Step 6: Step 6 was a review of all the settings chosen for the pipeline. After accepting the review, the pipeline is created and, on a new page, a success alert should appear at the top of the page, letting me know whether it was successful.

Figure 4: showing pipeline was created successfully.

Unfortunately, this did not happen as I could not get past step 4.

6.       Conclusion

A person or software firm seeking to work with AWS CodePipeline needs to understand continuous integration and continuous delivery, as well as pipelines themselves, before delving into AWS CodePipeline.

I wanted to use this technology and this pipeline because I wanted a better understanding of both. AWS CodePipeline was a lot harder than I thought; there was a lot more to it than I realized. I had to learn new software like Amazon ECS and AWS CodeBuild while trying to learn AWS CodePipeline. I mainly got stuck on step 4 when creating the pipeline. I was stuck here because I had to create a cluster, which I didn't know how to do, and I got confused by the Amazon documentation, which I thought was very messy and contained a bit too much information. I finally decided to create an empty cluster, which seemed to work.

After creating a cluster name, I then moved on to the service name. This is where I got really stuck: there was just too much documentation, and information on creating a service on Amazon ECS was scattered across many places. I had to do a lot of things in Amazon ECS: after creating a cluster I had to define a task definition, and within the task definition I had to create a container for the service. I followed the documentation and completed the four steps of creating a service, but at the end I always got an error about something being wrong with the container image "nginx". I did not understand this error and could not fix it.

Figure 5: Creation of a service error.

If I had completed step 4 I would have been able to complete my pipeline. If a pipeline had been created, it would start the first build from the GitHub repo using the master branch specified in step 2. This first build would fail because the new IAM role created in step 5 does not have permission to access the container image repository. To fix this, you need to go into the IAM console dashboard and go to Roles in the menu on the left-hand side of the screen. In Roles, go to the role you created, which takes you to the role's summary page. From there we need to go into Attach policy and check "AmazonEC2ContainerRegistryPowerUser" to grant that access to the role. Once this is attached, we are brought back to the IAM dashboard. If the power user policy is now listed under the role's policy names, the role has the privileges it needs to access the repository. After the privileges are set, we can then work on the AWS CodePipeline. A scripted sketch of attaching this policy follows.
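
Here is a sketch of the same console fix done through the API: attaching the managed AmazonEC2ContainerRegistryPowerUser policy to the role created in step 5 and listing the attached policies to verify. The role name is the one used in this paper; adjust it if your role differs.

```python
import boto3

iam = boto3.client("iam")

# Attach the managed ECR power-user policy to the pipeline's service role.
iam.attach_role_policy(
    RoleName="AWS-CodePipeline-Service",
    PolicyArn="arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryPowerUser",
)

# Verify the policy now appears on the role.
attached = iam.list_attached_role_policies(RoleName="AWS-CodePipeline-Service")
for policy in attached["AttachedPolicies"]:
    print(policy["PolicyName"])
```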

Looking back now, I wish I done a few things differently. I should have done more research on my topic to see what was involved etc, which would have prepared me better. I should have looked at other technologies too. Instead I just choose AWS code pipeline, as I wanted to learn about AWS services and wanted to learn more about creating and using a pipeline. AWS code pipeline was a lot bigger than what I had first thought. I had no idea AWS code Pipeline would use AWS CodeBuild, Amazon ECS and Docker containers. All this new software except for docker had to be learned too which threw me off the main focus of this research paper which was continuous integration and continuous delivery with AWS code pipeline.

https://aws.amazon.com/blogs/compute/set-up-a-continuous-delivery-pipeline-for-containers-using-aws-codepipeline-and-amazon-ecs/

References

[1]      Fowler, M. (2018). Continuous integration | ThoughtWorks. [online] Thoughtworks.com.Available at:https://www.thoughtworks.com/continuous-integration [Accessed 7 Sep. 2018].

[2]      Sacolick, I. (2018). What is CI/CD? Continuous integration and continuous delivery explained. [online] InfoWorld. Available at: https://www.infoworld.com/article/3271126/ci-cd/what-is-cicd-continuous-integration-and-continuous-delivery-explained.html [Accessed 8 Sep. 2018].

[3]      Zhou, A. (2018). The Principles of Continuous Integration and How It Maintains Clean Code and Increases Efficiency. [online] Forbes.com. Available at: https://www.forbes.com/sites/forbesproductgroup/2018/01/09/the-principles-of-continuous-integration-how-it-maintains-clean-code-and-increases-efficiency/#782d8f3c1920 [Accessed 8 Sep. 2018].

[4]      Amazon Web Services, Inc. (2018). Amazon ECS – run containerized applications in production. [online] Available at: https://aws.amazon.com/ecs/ [Accessed 1 Oct. 2018].

[5]      Fowler, M. (2018). Continuous Integration. [online] martinfowler.com. Available at: https://www.martinfowler.com/articles/continuousIntegration.html [Accessed 8 Sep. 2018].

[6]      Docs.aws.amazon.com. (2018). Amazon ECS Clusters – Amazon Elastic Container Service. [online] Available at: https://docs.aws.amazon.com/AmazonECS/latest/developerguide/ECS_clusters.html [Accessed 1 Oct. 2018].


 
 

How have PBR Applications Reshaped an Artist’s Pipeline?

Through technological innovation how have PBR Applications reshaped an Artist’s Pipeline?

Abstract.

This report will discuss how technological advancements have revolutionized texturing methods and how 3D artists have had to adapt their workflow to suit the changing industry. I will explore the programmes, methods and techniques that have had the biggest impact on the industry, including Quixel Suite, Substance Painter/Designer, Unreal Engine and photogrammetry, and investigate the main features and applications within this field. My argument is supported through extensive research, data collection and studies which go into great depth about what features make PBR such a game changer. Within this report I will also include comparisons between traditional and next-gen pipelines and show how, because of these advancements, assets can be created at a much lower cost and at a much higher speed.

Introduction.

The video games industry has been growing rapidly since the birth of interactive media, and with huge technological advancements within the industry the way in which games are created is ever changing. Because of this, 3D artists must adapt and reshape their workflows to match what is required. In this report I will conduct an in-depth analysis of the practical use and application of physically based rendering (PBR) and highlight key areas to showcase what is on offer and why it is so dominant within the industry.

Key Programme Features

I will be investigating the key features of Substance Painter that have been the most influential and effective in changing an artist's workflow and industry standards.

This is a programme that I use myself, and from observation during my time at university it is a staple for the majority of artists I have worked alongside. The programme itself is constantly changing and adapting its features to not only increase the visual standard of its textures but also dramatically decrease the time it takes to complete each task.

Substance Painter Texture Available at : URL

Smart materials are a well-known feature within Substance. They are a collection of layers, similar to Photoshop's, but allow much greater control while manipulating layers because you can see the result applied to your 3D asset in real time. Smart materials can be combined in the layering system with additional materials and masks to create much more intricate designs in a fraction of the time. There are also deeper levels of modification within the materials themselves, where you can edit values pertaining to each individual map (normal, specular, emissive, etc.) as well as paint materials within specific areas (edge wear, rust, crevice dirt, etc.); this gives the artist a huge advantage in being able to make an asset that is more visually powerful. In comparison, texturing with Photoshop would require a high amount of research and preparation, including spending a much greater volume of time perfecting UV layouts in order to correctly paint wear, dust and dirt within the appropriate areas. There is also a large amount of time consumed in applying your texture from Photoshop to your asset for reference, which is completely nullified by Substance's ability to view your textures in either 3D or 2D so you can instantly see progress.

                                                                         Smart Materials Available at : URL

Baking Available at : URL

Another key feature of Substance Painter is an integrated renderer called Iray, which renders assets to a very high quality with options to adjust lighting, sampling (quality) and environmental aspects. This makes it much easier for an artist to create renders for their portfolio, or even work-in-progress snapshots to share with colleagues or team members, as they can be exported directly from Substance Painter.


Baking in Substance is done through its integrated GPU baker, which creates the required maps from the information gathered from the high-poly asset. Baking is a process that is extremely effective within Substance, as dirt masks and weathering use this data to apply themselves accordingly to your asset. Because you can view your asset textured in real time, you can adjust these sliders to suit your preference. This saves the artist a large amount of time, as multiple iterations and texture sets of an asset can be created by simply hiding a layer and re-exporting.

Substance Painter gives you a huge number of options for exporting your maps, which is especially helpful when working in Unreal Engine, which works with combined AO, roughness and metallic maps. What seem like simple features all aid in speeding up the process; using Photoshop it would be much more difficult to obtain these maps, requiring either filters or separate programmes such as NDO, which all consume more time.

Photogrammetry

Photogrammetry, it appears, is becoming a staple in the video games industry, using real-world scans to create assets. This method is being used more and more by artists within the industry to create big titles; Star Wars: Battlefront, The Division 2 and The Vanishing of Ethan Carter are just a few examples.

There is a specific pipeline you must follow in order to create an asset through photogrammetry, but the results are extremely impressive, leaving you with a realistic, much more believable scene. An article summarizes that the cost of an individual standard Raspberry Pi photogrammetry setup would total over twelve thousand Australian dollars. This may seem like a large amount of money, but the article goes on to state that, because photogrammetry is so much quicker than older techniques, the amount of money saved on development suite subscriptions and living costs means that the studio actually saves money.

                                            Statue Photogrammetry Available at : URL

To break that down further, using photogrammetry takes around 50% less time to complete each asset and leads to a much higher visual standard. It is completely cost effective, and with the additional time now created, more effort can be directed towards immersive storylines and other aspects of the game.

Use in Industry 

The industry is at the forefront of using these techniques and programmes, and because of this developers have improved their functionality and added new features upon request; this has meant that they are constantly maintaining a high standard and meeting the demands of the industry's progression. With a close connection to both industry professionals and the public, these companies are highly responsive to feedback and implement new features regularly, which you can track by viewing their roadmaps.

Another benefit of having close ties is that certain companies build up huge libraries of materials, with even the public using Substance Share to compile libraries of their own. This again lowers production time and costs by providing access to thousands of materials from anywhere at any time that could be used on multiple projects within the same company.

Allegorithmic and Quixel are a necessity to the majority of both AAA and indie games created today. Quixel have worked closely with Epic Games for a number of years and more recently collaborated to showcase the Megascans Icelandic library with a short video, Rebirth, which really shows how immersive and realistic an environment can be with the aid of Quixel's photogrammetric library.

                                                                                  Rebirth : Megascans Icelandic Library. Available at : URL

A large portion of the biggest titles released, such as Dead by Daylight, Ghost Recon: Wildlands and Shadow of the Tomb Raider, were textured using Substance Painter/Designer. The senior technical artist at Eidos, Ken Jiang, described how important Allegorithmic programmes are within the industry: “Substance is now one of the essential software tools of our art pipeline. It’s helpful because it cuts down our texturing time estimates dramatically. Its non-destructive nature makes it risk-free to iterate and optimize the final results.” (Jiang, 2018).

Both Allegorithmic and Quixel are constantly raising the visual benchmark of games and trying to attain new heights in photo-realism. To reach these requirements visually, the hardware powering the games has had to increase to match; these programmes can both work with and export 8K textures, but this remains an experimental feature while the hardware needed becomes more commonplace.

Substance character texture. Available at : URL

Traditional Workflow Comparison

Traditional workflows require much more time and effort spent in areas that now, because of these advancements, require very little and give much better results. During my own development through university I found my workflow constantly changing to match new advancements, and looking back I can see how these features have streamlined my own process. Older pipelines required repetitive iterations of every asset that would need to be routinely checked to see if colours were matching and suitable; this would get tedious, having to go between programmes and re-import almost constantly. This is something that Substance has tackled with the PBR validate feature, which checks whether the RGB values of certain maps are physically correct in terms of how they behave in reality. It does this by testing whether they sit within a specific range and marking areas that are too dark in red on the mesh in the viewport.

PBR Validate. Available at : URL
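
To illustrate the kind of check such a validator performs, here is a minimal sketch. The thresholds below assume a commonly cited "PBR-safe" albedo range of roughly 30-240 in 8-bit sRGB; the exact limits Substance's validator uses may differ, so treat them as illustrative only.

```python
import numpy as np

# Assumed "PBR-safe" albedo range in 8-bit sRGB (illustrative, not Substance's exact values).
ALBEDO_MIN = 30
ALBEDO_MAX = 240

def validate_albedo(base_color: np.ndarray) -> np.ndarray:
    """Return a boolean mask of texels whose base colour falls outside the
    assumed PBR-safe range (True = out of range, e.g. too dark to be physical).

    base_color: H x W x 3 array of 8-bit sRGB values.
    """
    brightness = base_color.mean(axis=-1)    # crude per-texel brightness
    too_dark = brightness < ALBEDO_MIN       # these are the areas a validator would flag in red
    too_bright = brightness > ALBEDO_MAX
    return too_dark | too_bright

# Usage: flag out-of-range texels in a small random test texture.
texture = np.random.randint(0, 256, size=(4, 4, 3), dtype=np.uint8)
print(validate_albedo(texture))
```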

The Vanishing of Ethan Carter is a great example of how a traditional workflow has changed through the use of photogrammetry. Using PhotoScan from Agisoft, the art team were able to create large environments that had been scanned from real-world locations. Using this method created highly realistic environments with many intricate layers and a real sense of depth. This is something that is extremely hard to replicate using a traditional pipeline: the scanned environments are very high quality and contain erosion, cracks and stains that have taken hundreds of years to develop into what we see in the real world. An artist using an old workflow could not even attempt to replicate the intricacy involved, especially within the small timeframe they are given for each assignment.

The Vanishing of Ethan Carter. Available at : URL

Being able to scan entire environments with programmes like PhotoScan means that large portions of assets and environment locations can now be completed without using a traditional pipeline whatsoever; it has become a new pipeline in itself.

Historic centerpieces and statues can now be photographed and scanned into a game in great detail, and for this type of asset creation the traditional workflow is completely changed: instead of spending hours in programmes like Maya, you now spend hours setting up cameras in unusual and interesting locations around the world. One important thing to note is that regardless of the advancements in scanning real-world locations, the traditional pipeline is still of much use for creating assets that do not exist in the real world, and there is a large demand for that in the market.

Photogrammetric building. Available at : URL

Conclusion

To conclude, it is evident that programmes like Substance Painter/Designer and Quixel Suite have had a huge positive impact on artists' pipelines and have substantially reshaped their workflows. For me, the most important reason for this is that these programmes allow an artist to create an asset that is visually appealing whilst maintaining real-world values pertaining to each material. Having the ability to bake high levels of detail into low-poly meshes and then add further detailing through the use of smart masks, with scratches and dirt and so on, radically reduces the time I would have had to spend adding all these details individually. Being able to work effortlessly across multiple aspects of asset production within one programme has revolutionised the ease of creating high-quality assets. Having a real-time viewport and near-endless material sliders, changing base colour, crack intensity, roughness and more, means you can accomplish so much in such a short space of time.

Tileable textures created through the use of Quixel mean that entire procedurally generated maps can be quickly textured and blended to provide realistic flooring in incredibly quick time; any small tweaks can be added via decals or vertex painting within the engine during set dressing. This is a process that was almost unthinkable during previous gaming eras, when flooring was usually low-resolution, near base-colour blocks. Even in the more recent past, flooring lacked depth through the absence of normal mapping and only recently became a staple of AAA games.

It is becoming near impossible to find developers within the games industry that don't use a form of Allegorithmic or Quixel software within their production, and it is because of this that consumers are seeing a huge improvement in graphics for next-gen games, at levels we've never seen before.

The old pipeline of using Photoshop to create textures has become outdated due to the sheer versatility and convenience of programmes like Substance: being able to combine so many libraries of materials, edit each of their specific values and, on top of that, apply whatever variation of smart masks you can imagine means your creativity knows no bounds.

Photogrammetry in 2019 is really starting to make more of an impact on the industry, as the workflow has begun to be tweaked where needed and problem areas refined. Many developers are using photogrammetry in combination with more traditional techniques, applying photogrammetry to complex assets that exist in the real world: where an artist would previously spend many days or weeks modelling, the asset can now be completed in a fraction of that time. We are also seeing the scale of what is possible to 3D scan increase to that of small-scale environments, and the results are very impressive. At our current stage we can scan replica waterfalls, graveyards and buildings, and I can only see the scope of what is possible increasing.

The games industry is constantly evolving and the visual benchmark is always getting higher, it is programmes like Substance and Quixel that will have to constantly add new features and expand their functionality to stay at the cutting edge. Even during my 4 years at university I have seen a dramatic change in my workflow as an artist and had to adapt to new ways of producing industry standard assets. It’s going to be interesting to see what advancements these programmes make over the coming years.      

References

Rebirth: Introducing photorealism in UE4. [Online] Available at: https://www.youtube.com/watch?v=9fC20NWhx4s [Accessed 10th July 2019]

 

How have PBR Applications Reshaped an Artist’s Pipeline?

Through technological innovation how have PBR Applications reshaped an Artist’s Pipeline?

Abstract.

This report will look at and discuss how technological advancements have revolutionized texturing methods and how 3D artists have had to adapt their workflow to suit the changing industry. I will explore the programmes, methods and techniques that have had the biggest impact on the industry, these include Quixel Suite, Substance Painter/ Designer, Unreal Engine and Photogrammetry. I will investigate the main features and applications within this field. My argument is supported through extensive research, data collection and studies which go into great depth about what features make PBR such a game changer. Within this report I will also include comparisons between traditional and next gen pineplines and how because of advancements assets can be created for a much lower cost and at a much higher speed.

Introduction.

The video games industry has been growing rapidly since the birth of the interactive media and with huge technological advancements within the industry the way in which they are created is ever changing, because of this, 3D artists must adapt and reshape their workflows to match that of what is required. In this report I will conduct an in-depth analysis on the practical use and application of physically based rendering and highlight key areas to showcase what is on offer and why it is so dominant within the industry.

Key Programme Features

I will be investigating key features from Substance Painter that have been the most influential and effective in changing an artists workflow and Industry standards.

This is a programme that I use myself and through observation during my time at university it is a staple for the majority of artists I have worked alongside. The programme itself is constantly changing and adapting features to not only increase the visual standard of its textures but also dramatically decrease the time it takes to complete each task.

Substance Painter Texture Available at : URL

Smart materials are a well known feature within substance and are a collection of layers similar to that of Photoshop but allow you to have much greater control whilst manipulating layers by being able to see it applied to your 3D asset in real time. Smart materials can be combined in the layering system with additional materials and masks to create much more intricate designs at a fraction of time. There are also deeper levels of modification within the materials themselves where you can edit values attaining to each individual map ( Normal, Spec, Emissive etc) as well as being able to paint materials within specific areas ( Edge wear, Rust, Crevice dirt etc), this gives the artist a huge advantage in being able to make an asset that is more visually powerful. In comparison, texturing with Photoshop would require high amounts of research and preparation including spending a much higher volume of time perfecting UV layouts in order to correctly paint wear, dust and dirt within appropriate areas.There is also a large amount of time consumed in applying your texture from photoshop to your asset for reference that is completely nullified by Substance’s ability to view your textures in either 3D or 2D so you can instantly see progress.

                                                                         Smart Materials Available at : URL

Baking Available at : URL

Another Key feature from Substance painter is an integrated renderer called Iray that renders assets to a very high quality with options to adjust lighting, higher sampling (Quality) and  environmental aspects. This makes it much easier for an artist to create renders for their portfolio or even  work in-progress snapshots to share with your colleagues or team members as they can be exported directly from substance painter.

Get Help With Your Essay
If you need assistance with writing your essay, our professional essay writing service is here to help!
Essay Writing Service

Baking in substance is done through its integrated GPU baker where it creates the required maps from the information gathered from the high poly asset. Baking is a process that is extremely effective within substance as dirt masks and weathering use this data to apply themselves accordingly to your asset. Being able to view your asset, textured in real-time you can adjust these sliders to suit your preference. This saves the artist a large amount of time being able to create multiple iterations and texture sets of an asset by simply hiding a layer and re exporting.

Substance painter gives you a huge amount of options in how to export your maps, this is something that is especially helpful when working in Unreal engine which works with combined AO, Roughness and metallic maps. What seem like simple features all aid in speeding up the process, using photoshop it would be much more difficult to obtain these maps, using either filters or separate programmes such as NDO which all consume more time.

Photogrammetry

Photogrammetry it appears is becoming a staple in the video games industry by using real world scans to create assets. This method is being used more and more by artists within the industry to create big titles, Star Wars : Battlefront , The Division 2 and The Vanishing of Ethan carter are just an example of a few.

There is a specific pipeline you must follow in order to create an asset through photogrammetry but the results are extremely impressive, leaving you with a realistic much more believable scene. An article summarizes that the cost of an individual standard Raspberry PI photogrammetry setup would total over twelve thousand Australian dollars, this would seem like a large amount of money but the article goes on to state that due to how much quicker photogrammetry is compared to older techniques that the amount of money you save from development suite subscriptions and living costs means that the studio actually saves money.

                                            Statue Photogrammetry Available at : URL

 To break that down further, using photogrammetry takes around 50% less time to complete each asset and leads to a much higher visual standard, it is completely cost effective and with additional time now created more effort can be directed towards immersive storylines and other aspects of the game.

Use in Industry 

The industry is at the forefront of using these techniques and programmes and because of this developers have improved their functionality and added new features upon request, this has meant that they are constantly maintaining a high standard and meeting the demands of the industry’s progression. With having a close connection to both industry professionals and the public these companies are highly responsive to feedback and implement new features regularly which you can check by viewing their roadmaps.

 Another benefit from having close ties is that certain companies build up huge libraries of materials with even the public using substance share to compile private libraries of their own. This is something that again lowers production time and costs by having access to thousands of materials from anywhere at any time that could be used on multiple projects within the same company.

 Allegorithmic and Quixel are a necessity to the majority of both AAA and Indie games created today. Quixel have worked closely with epic games for a number of years but more recently collaborated to showcase Megascans Icelandic Library with a short video : Rebirth, this really shows how immersive and realistic an environment can be with the aid of Qixels Photogrammetric library.

                                                                                  Rebirth : Megascans Icelandic Library. Available at : URL

A large portion of the biggest titles released such as Dead By Daylight, Ghost Recon : Wildlands and Shadow Of The Tomb Raider were textured using Substance painter/Designer. The senior technical artist at Eidos, Ken Jiang described how important Allegorithmic programmes are within the industry. “Substance is now one of the essential software tools of our art pipeline. It’s helpful because it cuts down our texturing time estimates dramatically. Its non-destructive nature makes it risk-free to iterate and optimize the final results.” (Jiang, 2018).

Both Allegorithmic and Quixel are constantly increasing the visual benchmark of games and trying to attain new heights in photo-realism. To reach these requirements visually the hardware powering the games has had to also increase to match, these programmes can both work and export in 8K textures but remain experimental features whilst the hardware needed becomes more commonplace.

Substance character texture. Available at: URL

Traditional Workflow Comparison

Traditional workflows required much more time and effort in areas that, thanks to these advancements, now demand very little and give much better results. During my own development through university I found my workflow constantly changing to keep up, and looking back I can see how these features have streamlined my own process. Older pipelines required repetitive iterations of every asset, which had to be routinely checked to see whether colours matched and were suitable; this became tedious, having to jump between programmes and re-import almost constantly. Substance has tackled this with the PBR validate feature, which checks whether the RGB values of certain maps are physically plausible for how the material behaves in reality. It does this by testing whether values sit within a specific range, and marks areas that are too dark in red on the mesh in the viewport.

PBR Validate. Available at: URL
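To make that kind of check concrete, here is a minimal sketch of a similar validation written in Python with NumPy and Pillow. It is not Substance's implementation: the 30-240 sRGB albedo range is an assumption based on commonly quoted PBR guidelines, and the filename is a placeholder.

import numpy as np
from PIL import Image

# Commonly quoted albedo guideline for non-metals (assumed, not Substance's exact thresholds)
ALBEDO_MIN = 30    # sRGB values darker than this are flagged as "too dark"
ALBEDO_MAX = 240   # values brighter than this are flagged as "too bright"

def validate_base_colour(path):
    """Flag texels in a base-colour map that fall outside a plausible albedo range."""
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.uint8)
    luminance = rgb.mean(axis=2)             # rough per-texel brightness
    too_dark = luminance < ALBEDO_MIN
    too_bright = luminance > ALBEDO_MAX
    flagged = too_dark | too_bright
    print(f"{flagged.mean() * 100:.1f}% of texels fall outside the albedo range")
    return too_dark, too_bright               # masks a tool could tint red on the mesh

too_dark, too_bright = validate_base_colour("basecolor.png")  # placeholder texture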

The Vanishing of Ethan Carter is a great example of how a traditional workflow has changed through the use of photogrammetry. Using PhotoScan from Agisoft, the art team were able to create large environments scanned from real-world locations. This method produced highly realistic environments with many intricate layers and a real sense of depth, something that is extremely hard to replicate with a traditional pipeline. The scanned environments are very high quality and contain erosion, cracks and stains that have taken hundreds of years to develop in the real world. An artist using an older workflow could not even attempt to replicate that intricacy, especially within the small timeframe designated for each assignment.

The Vanishing of Ethan Carter. Available at: URL

Being able to scan entire environments with programmes like PhotoScan means that large portions of assets and environment locations can now be completed without using a traditional pipeline at all; it has become a new pipeline in itself.

Historic centrepieces and statues can now be photographed and scanned into a game in great detail, and for this type of asset creation the traditional workflow changes completely: instead of spending hours in programmes like Maya, you now spend hours setting up cameras in unusual and interesting locations around the world. One important thing to note is that, regardless of the advancements in scanning real-world locations, the traditional pipeline is still very much in use for creating assets that do not exist in the real world, and there is a large demand for those in the market.

Photogrammetric building. Available at: URL

Conclusion

To conclude, it is evident that programmes like Substance Painter/Designer and the Quixel suite have had a hugely positive impact on artists' pipelines and substantially reshaped their workflows. For me, the most important reason is that these programmes allow an artist to create an asset that is visually appealing while maintaining real-world values for each material. Being able to bake high levels of detail into low-poly meshes and then add further detailing through smart masks, with scratches, dirt and so on, radically reduces the time I would otherwise have spent adding all of these details individually. Being able to cover multiple aspects of asset production within one programme has revolutionised the ease of creating high-quality assets, and having a real-time viewport and near-endless material sliders for base colour, crack intensity, roughness and more means you can accomplish a great deal in a short space of time.

Tileable textures created with Quixel mean that entire procedurally generated maps can be quickly textured and blended to provide realistic flooring in an incredibly short time, and any small tweaks can be added via decals or vertex painting in engine during set dressing. This is a process that was almost unthinkable during previous gaming eras, when flooring was usually made of low-resolution, near flat-colour blocks; even in the more recent past flooring lacked depth due to the absence of normal mapping, and it has only recently become a staple of AAA games.
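As a rough illustration of the kind of blending described above, the sketch below mixes two tileable textures with a greyscale mask in Python using NumPy and Pillow. The filenames and the simple linear blend are assumptions for illustration; in-engine vertex painting and Quixel's own mixing tools are considerably more sophisticated.

import numpy as np
from PIL import Image

def load(path):
    """Load a texture as floats in the 0-1 range."""
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0

# Two tileable textures and a blend mask of the same resolution (placeholder files)
grass = load("grass_tile.png")
mud = load("mud_tile.png")
mask = load("blend_mask.png")[..., :1]   # use one channel as the blend weight

# Simple linear blend: where the mask is 0 we see grass, where it is 1 we see mud
blended = grass * (1.0 - mask) + mud * mask

Image.fromarray((blended * 255).astype(np.uint8)).save("blended_floor.png")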

It is becoming near impossible to find developers within the games industry that don't use some form of Allegorithmic or Quixel tooling in their production, and it is because of this that consumers are seeing a huge improvement in graphics for next-gen games, at levels we've never seen before.

The old pipeline of using Photoshop to create textures has become outdated due to the sheer versatility and convenience of programmes like Substance: being able to combine so many libraries of materials, edit each of their specific values and, on top of that, apply whatever variation of smart masks you like means your creativity knows no bounds.

Photogrammetry in 2019 is really starting to make more of an impact on the industry as the workflow has begun to be tweaked where needed and refined in problem areas. Many developers are using photogrammetry in combination with more traditional techniques: by using photogrammetry on complex assets that exist in the real world, something an artist would spend days or weeks modelling can be completed in a fraction of that time. We are also seeing the scale of what it is possible to 3D scan increase to that of small environments, and the results are very impressive. At our current stage we can scan and replicate waterfalls, graveyards and buildings, and I can only see the scope of what is possible increasing.

The games industry is constantly evolving and the visual benchmark is always rising; it is programmes like Substance and Quixel that will have to keep adding new features and expanding their functionality to stay at the cutting edge. Even during my four years at university I have seen a dramatic change in my workflow as an artist and have had to adapt to new ways of producing industry-standard assets. It will be interesting to see what advancements these programmes make over the coming years.

References

Rebirth: Introducing photorealism in UE4. [Online] Available at: https://www.youtube.com/watch?v=9fC20NWhx4s [Accessed 10 July 2019].

 

The 3D Graphics Production Pipeline

As humans constantly exposed to high-quality images and computer graphics, we expect a high degree of realism from 3D graphics. Achieving this hyper-realism involves many stages, known collectively as the 3D graphics production pipeline.
A 3D graphics production pipeline can be seen as an assembly line of sorts, with the pieces of the final asset assembled step by step by teams of different people.
A production pipeline can be broken down into pre-production, production and post-production. Each stage involves several different steps. A production team often progresses sequentially through these stages, but a good pipeline should be flexible enough to allow for iterations and tweaks, if needed, throughout the project.

Preproduction is the planning and designing stage of the pipeline. This stage is necessary because it lays the groundwork for the entire project and enables management and artists to plan the pipeline in terms of assets and financial capability.
Preproduction is usually the longest part of a project, as it requires an understanding of the project's aim. This stage is where research into the subject matter takes place, so that the result is consistent with the real-life object, or with an object's behaviour in certain conditions (Beane, 2012).
There are 5 components to a preproduction stage: Idea/Story, Script/Screenplay, Storyboard, Animatic and Design.
1. Idea/Story
The first step in any production is to come up with the idea for the product. The idea can come from nowhere in particular and at any time. This is where you can get creative and come up with lots of different ideas, and there is a lot of trial and error at this stage. To quote Andy Beane, "You must be willing to kill your babies [ideas] at any time." This is to stop a fixation on one idea: if the idea is not good enough, you must be able to stop and bin it (Beane, 2012).
When you have settled on an idea that you feel could work, you then turn it into a story. This will be a rough outline of the narrative arc and what message you want your animation/graphic to tell. Creators must create a story, regardless of the project, that will captivate consumers.
2. Script/Screenplay
The script will detail all the on-screen movements, actions, dialogue and so on. This helps people at the various stages of the production pipeline understand where objects are going to be placed, how they move, and any sounds that will be in the animation.
3. Storyboard
A storyboard contains thumbnail sketches of a scene, typically with camera shots, lighting, camera movement and so on. This is more common for a 3D artist to use than a script, as they tend to visualise things more easily than write them down. The storyboard is an important piece of the production pipeline, as all levels of the team can see what is supposed to be happening on screen.


4. Animatic
An animatic is essentially a moving version of the storyboard with temporary dialogue and sound effects to give the production team a sense of the pace of the animation. This is where the editors will look at the shots and movements and if they do not work well together, they will be edited here. In 3D animation, cutting shots or sequences is rarely done as it is too expensive, so those decisions are made in preproduction. That is why an animatic is very important to the artists and the higher-ups (Beane, 2012).
5. Design
This is where the overall look of the project is decided. It’s at this stage that concept art for characters, objects and backgrounds is created by concept artists. A concept artist will create concept art that will display the mood that is required for the project (Beane, 2012).
The artist will create a turnaround, environmental sketch/design and these will be passed on to appropriate artists/modellers.

If the preproduction stage is carried out well, then the production stage will go more smoothly. The output from the preproduction stage is handed over to the artists in the production stage, and they work simultaneously on the different assets required.
The production stage contains various components such as Layout, Modelling, Texturing, Rigging, Animation, VFX, Lighting and Rendering.
Layout
The process in layout is essentially turning a 2D animatic into a 3D layout. This layout will give the objects/scenes perspective, depth and scale that may be missing in the 2D version. Also, camera shots can be tested in a way that isn’t possible on the 2D version. 
The layout will be a reference to every other stage in production and will be continuously updated throughout it (Beane, 2012).
Modelling
Modelling is the process of creating meshes that are a representation of physical objects. A mesh is made up of vertices, edges and faces, which are elements of polygons. These polygons together can form 3D surfaces which can be manipulated to create the desired object.
An alternative to polygons is Non-Uniform Rational B-Splines (NURBS), which are used when very accurate surfaces are needed. They are formed using control points, isoparms and surfaces, and are often used for highly engineered projects that require greater accuracy (Hix, 2016).
For a 3D animation production, the modeller will refer to the 3D Layout, concept sketches and any other assets passed from the pre-production stage. The modeller will add finer details to make it look realistic, to a certain extent.
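To illustrate the vertex/face idea in code, the sketch below builds a minimal polygonal mesh (a single quad split into two triangles) and computes a face normal. The structure is a simplified assumption for illustration; real packages such as Maya store far richer data per mesh.

import numpy as np

# Four vertices of a unit quad lying in the XY plane
vertices = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0],
    [0.0, 1.0, 0.0],
])

# Faces are tuples of vertex indices; here the quad is split into two triangles
faces = [(0, 1, 2), (0, 2, 3)]

def face_normal(face):
    """Normal of a triangular face via the cross product of two edge vectors."""
    a, b, c = (vertices[i] for i in face)
    n = np.cross(b - a, c - a)
    return n / np.linalg.norm(n)

for face in faces:
    print(face, face_normal(face))   # both normals point along +Z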
Texturing
Texturing is where a 3D artist applies colours and surface detail to a modelled asset to make it look more like the concept art. The asset arrives from the 3D modeller (or it could be the same artist) in a flat colour such as grey.
When the texture artist receives the model, they must first make a UV map of it. This is a 2D representation of the surface area of the 3D asset. The UV map is painted and shaded to make the material look like its real-life counterpart, and the 2D image is then wrapped back around the 3D asset (Adib, 2019).
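A rough sketch of the idea, assuming per-vertex UV coordinates and a nearest-texel lookup: each vertex of the quad from the modelling example is given a (u, v) coordinate in 0-1 space, and a colour is fetched from a 2D texture image at that coordinate. Real UV unwrapping and texture filtering are far more involved, and the texture filename is a placeholder.

import numpy as np
from PIL import Image

texture = np.asarray(Image.open("brick_basecolor.png").convert("RGB"))  # placeholder texture
height, width, _ = texture.shape

# One (u, v) coordinate per vertex of the quad; (0, 0) is one corner of the texture, (1, 1) the opposite
uvs = np.array([
    [0.0, 0.0],
    [1.0, 0.0],
    [1.0, 1.0],
    [0.0, 1.0],
])

def sample(u, v):
    """Nearest-texel lookup: map a UV coordinate to a pixel in the texture."""
    x = min(int(u * (width - 1)), width - 1)
    y = min(int(v * (height - 1)), height - 1)
    return texture[y, x]

for u, v in uvs:
    print((u, v), sample(u, v))   # the colour each vertex would pick up from the map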
Rigging
Rigging is the process of inserting a control rig that enables movement into a static geometric object. Every object that you would want to move will require a control system. The control rig can be simple (parent/child relationships) or complex (joints, skinning, muscles, controllers). Regardless of whether the rig is simple or complex, it needs to be done correctly and effectively to make life easier for the animator that will receive the asset next (Beane, 2012).
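As a tiny illustration of the parent/child idea, the sketch below chains three 2D joints, where each child's world position is found by applying its parent's accumulated rotation and translation. This is an assumed toy hierarchy, not a production rig; skinning, muscles and controllers are not shown.

import numpy as np

def rotation(angle_deg):
    """2D rotation matrix for a joint angle in degrees."""
    a = np.radians(angle_deg)
    return np.array([[np.cos(a), -np.sin(a)],
                     [np.sin(a),  np.cos(a)]])

# A simple arm chain: shoulder -> elbow -> wrist, each with a local offset and rotation
joints = [
    {"offset": np.array([0.0, 0.0]), "angle": 30.0},   # shoulder
    {"offset": np.array([2.0, 0.0]), "angle": 45.0},   # elbow, 2 units along the parent bone
    {"offset": np.array([1.5, 0.0]), "angle": 0.0},    # wrist
]

# Walk the chain, accumulating the parent transform at each step
world_pos = np.array([0.0, 0.0])
world_rot = np.eye(2)
for joint in joints:
    world_pos = world_pos + world_rot @ joint["offset"]
    world_rot = world_rot @ rotation(joint["angle"])
    print(world_pos)   # rotating the shoulder moves every child joint with it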
Animation
The movement of an object or character asset received from the rigger is created within the animation stage. Animators must convey the object/character in a way that is believable. To achieve this, they must follow the 12 principles of animation (Fig 1.)
Figure 1: 12 Principles of Animation (National Film and Television School, n.d.)
There are three main ways to animate within 3D animation: keyframing, motion capture and procedural animation.
1. Keyframing
The animator manipulates an object at a keyframe by altering, among other attributes, its position, rotation and scale. The animator then moves along the timeline to another keyframe and changes one or more elements. The application then computes the in-between frames and creates the movement (a minimal interpolation sketch follows this list).
2. Motion Capture
An actor's movements are captured via cameras and a special suit, and this data is applied to the control rig of an object.
3. Procedural
The object is animated by executing code written by a programmer.
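The sketch below shows the keyframing idea from point 1 in its simplest form: two keyframes storing an attribute value, with the application filling in the in-between frames by linear interpolation. Real animation software also offers eased and spline interpolation; the attribute and frame numbers here are assumptions.

# Keyframes: frame number -> value of an animated attribute (e.g. an object's X position)
keyframes = {0: 0.0, 24: 10.0}   # move 10 units over 24 frames (one second at 24 fps)

def interpolate(frame, k0, k1):
    """Linear interpolation between two (frame, value) keyframes."""
    (f0, v0), (f1, v1) = k0, k1
    t = (frame - f0) / (f1 - f0)
    return v0 + t * (v1 - v0)

k0, k1 = sorted(keyframes.items())
for frame in range(k0[0], k1[0] + 1):
    print(frame, round(interpolate(frame, k0, k1), 3))   # the in-between positions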
Animation is a very important aspect in the 3D graphics pipeline, as it can make or break a project. If the character is moving in an awkward way, that is the only thing that will be focused on.
VFX
The visual effects artist animates everything except the characters and the objects they interact with. The software a VFX artist operates uses maths and physics to mimic real-world phenomena such as gravity and air, so the artist must have a basic level of maths and physics to make educated edits. VFX artists need those logical and technical skills, but they must also have artistic flair.
The VFX artist's aim is not to produce flashy effects, but rather to enhance the scene those effects are applied to (Beane, 2012).
Lighting/Rendering
Lighting makes a big difference to the realism of an object and it can drastically affect the mood and focus within a scene. Using the principles of light from photography, a 3D artist could mimic real life situations.
Artists can also use High Dynamic Range Images (HDRIs), which are created by capturing multiple images with different exposure settings. These images are combined so that they contain lighting data from the brightest and darkest areas of the scene, providing accurate lighting and environmental reflections (Hix, 2016).
Rendering is where data from the scene is translated into images. The final render is influenced by how the other steps in production were executed. Each scene is rendered into many different layers (render passes) like backgrounds, foregrounds, highlights, shadows etc. This allows for fine tuning of individual assets.
Rendering can be completed in real time or non-real time.
Real-time rendering is when the data can be rendered fast enough to display as it is generated; this suits things like video games and other interactive media.
Non-real-time rendering is used when there is a high level of detail and more rendering time is required, which is more common for film or animation with lots of assets (Adib, 2019). Render farms are therefore sometimes used: a render farm is a group of machines dedicated solely to rendering, so there is less of a bottleneck within the production.
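To make the render-farm idea concrete, here is a small sketch that splits a frame range across several render machines. The frame count, machine count and simple round-robin split are illustrative assumptions; real farms use dedicated queue managers.

def split_frames(first_frame, last_frame, machine_count):
    """Assign each frame in the range to one of the render machines, round-robin."""
    assignments = {m: [] for m in range(machine_count)}
    for frame in range(first_frame, last_frame + 1):
        assignments[frame % machine_count].append(frame)
    return assignments

# Example: a 240-frame shot shared between 4 render nodes
for machine, frames in split_frames(1, 240, 4).items():
    print(f"machine {machine}: {len(frames)} frames, e.g. {frames[:3]}")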

This stage is when the final output assets are created. If proper care was taken in preproduction to foresee problems, then this stage should go relatively smoothly. If changes are needed, the postproduction team may try to make them here.
This stage is made up of the following components: Compositing, VFX/Motion Graphics, Colour Correction and Final Output.
1. Compositing
This is a process where the render passes from the rendering stage are assembled again. More adjustments or extra images can also be added in at this stage.
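A minimal sketch of reassembling render passes, assuming an additive combination of diffuse and specular passes placed over a background using the foreground alpha channel. Real compositing packages offer far more operations, and the filenames are placeholders.

import numpy as np
from PIL import Image

def load(path):
    """Load a render pass as floats in the 0-1 range, keeping the alpha channel."""
    return np.asarray(Image.open(path).convert("RGBA"), dtype=np.float32) / 255.0

diffuse = load("diffuse_pass.png")
specular = load("specular_pass.png")
background = load("background_pass.png")

# Recombine the lighting passes, then place the result over the background ("over" operation)
foreground_rgb = diffuse[..., :3] + specular[..., :3]
alpha = diffuse[..., 3:4]
composite = foreground_rgb * alpha + background[..., :3] * (1.0 - alpha)

Image.fromarray((np.clip(composite, 0, 1) * 255).astype(np.uint8)).save("final_frame.png")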
2. VFX/Motion Graphics
This can sometimes be merged into the compositing process, and the artist who carries out these VFX and motion graphics may be the same person. They add simple effects, such as dust, rain and camera shake, to enhance scenes, and can also create title sequences and other graphics for the production (Beane, 2012).
3. Colour Correction
Sometimes referred to as colour grading or colour timing, this process entails adjusting colour levels in each shot to ensure consistency throughout the project. This requires a high level of skill and a technical but creative mind. This is the last step of the production before exporting the files (Beane, 2012).
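As a toy illustration of the colour-level adjustments described above, the sketch below applies a per-channel gain and an overall gamma to a shot in Python. The particular gain and gamma values and the filenames are assumptions; real grading tools work with far finer controls, such as lift/gamma/gain per tonal range.

import numpy as np
from PIL import Image

def grade(path, gain=(1.0, 1.0, 1.0), gamma=1.0):
    """Apply a per-channel gain and an overall gamma adjustment to a frame."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    img = np.clip(img * np.array(gain), 0.0, 1.0)      # per-channel gain
    img = img ** (1.0 / gamma)                          # gamma > 1 brightens the midtones
    return Image.fromarray((img * 255).astype(np.uint8))

# Example: warm a shot slightly and lift the midtones so it matches neighbouring shots
grade("shot_010.png", gain=(1.05, 1.0, 0.95), gamma=1.1).save("shot_010_graded.png")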
4. Final Output
An asset can be exported into many different formats. Each format has its own advantages and disadvantages, so naturally each has its own workflow. The most common type is the digital video format, which is widely accessible because it can be played on computers and online (Beane, 2012).

The RE@CT project began in 2011, when a team including Peter Schubel, Jim Easterbrook, Graham Thomas, Oliver Grau, Alia Sheikh and Florian Schweiger aimed to revolutionise the production of realistic 3D characters.
The RE@CT project aimed to create film-quality interactive characters from 3D video capture of an actor’s performance, without the aid of tracking markers to allow for a much more natural and unhindered performance. This would create more lifelike results. This interactive feature means that the camera and 3D character animation could be controlled live by the user.
To capture this footage, a multi-camera setup is used. It involves nine full-HD cameras (1080p/25) on an evenly spaced rig approximately 2.8 metres off the ground, accompanied by four UHD cameras (2160p/50) at chest height in the four corners of the room. A Python-based capture system records synchronised video streams, and the algorithm then analyses the actor's silhouette and assigns texture coordinates to the resulting 3D mesh.
Paired with a head-mounted capture system and a 360-degree setup of seven DSLR stereo pairs, this process can capture highly detailed forms. An example of the facial detail this setup can capture is shown in Fig 2 below.
Figure 2: Head model rendered with different facial expressions (European Framework 7, 2015)
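As a toy illustration of the silhouette analysis mentioned above (and not the RE@CT algorithm itself), the sketch below extracts a rough actor silhouette from a single camera frame by differencing it against an empty background plate and thresholding the result. The filenames and threshold are assumptions.

import numpy as np
from PIL import Image

def load_grey(path):
    """Load a frame as a greyscale float image in the 0-1 range."""
    return np.asarray(Image.open(path).convert("L"), dtype=np.float32) / 255.0

background = load_grey("empty_studio.png")   # plate captured without the actor
frame = load_grey("actor_frame.png")         # the same camera with the actor present

# Pixels that differ noticeably from the empty plate are treated as the actor's silhouette
THRESHOLD = 0.1                               # assumed difference threshold
silhouette = np.abs(frame - background) > THRESHOLD

Image.fromarray((silhouette * 255).astype(np.uint8)).save("silhouette_mask.png")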
This technology was used in an augmented reality game called Les Seigneurs de Montfort, which rendered 3D characters produced with RE@CT. The animations weren't just of characters walking; they were more interactive, with characters taking part in sword fights and interacting with each other.
RE@CT also worked with the BBC on its iWonder projects, such as an interactive ballerina video (react ballet) that contains elements of the RE@CT methodology (European Framework 7, 2015).

A pipeline is a complex set of tasks, and no two pipelines will be exactly the same; changes may force you to repeat a step or add something in. A pipeline brings together groups of specialists from different areas who work together to complete a final output, and they all play a part in the making of a 3D graphic.
I would personally love to be a fly on the wall at various stages of the production, just to see how things would work in real life rather than reading about it.

Adib, P., 2019. Getting to know 3D texturing in animation production. [Online] Available at: https://dreamfarmstudios.com/blog/getting-to-know-3d-texturing-in-animation-production/ [Accessed 20 February 2020].
Adib, P., 2019. The final step in 3D animation production: 3D rendering. [Online] Available at: https://dreamfarmstudios.com/blog/the-final-step-in-3d-animation-production-3d-rendering/ [Accessed 20 February 2020].
Beane, A., 2012. 3D Animation Essentials. s.l.: Sybex, pp. 21-45.
European Framework 7, 2015. RE@CT: A new production pipeline for interactive 3D content, s.l.: s.n.
Hix, B., 2016. Making sense of the 3D production pipeline. [Online] Available at: http://www.blenderunleashed.com/tutorials/making-sense-of-the-3d-production-pipeline/ [Accessed 22 February 2020].
National Film and Television School, n.d. Principles of Animation – Explore Animation. [Online] Available at: https://www.futurelearn.com/courses/explore-animation/0/steps/12228 [Accessed 23 February 2020].