QAC020C155A Data Modelling And SQL Language

Answer:
Introduction

Database design is a crucial process that requires in-depth analysis of the business requirements. Several factors should be considered when designing a database, such as data integrity, security, speed of access, and data redundancy, among others (Sul, Yeom and Jung, 2018). The database holds crucial information that businesses use to execute their daily operations and to make vital decisions. Therefore, the database should be kept secure at all times to ensure that there is no unauthorized access or modification of any kind (Galatescu, Greceanu and Nicolau, 2011).

Conceptual Model
Entities and Attributes

Customer
- CustomerID
- CustomerName
- CustomerAddress
- PhoneNumber

Employee
- EmployeeID
- EmployeeName
- EmployeeAddress
- EmployeeJobTitle
- EmployeeQualification

Order
- OrderID
- OrderDate
- OrderQuantity
- Products

Product
- ProductID
- ProductName
- ProductPrice
- ProductQuantity

The assumptions made are that the address and name attributes have been prefixed with the entity name (for example, Customer in the customer table) to keep attribute names unique, and that the total price to be paid by the customer is calculated by multiplying the product price by the quantity ordered, as sketched below.
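The calculation can be expressed in SQL as follows. This is only a hedged sketch: it assumes Customer, Product and Order tables based on the attributes listed above, plus CustomerID and ProductID foreign-key columns in Order that are not part of the conceptual model.

-- Hypothetical query: computes the total price for each order,
-- assuming Order carries CustomerID and ProductID foreign keys.
SELECT o.OrderID,
       c.CustomerName,
       p.ProductName,
       o.OrderQuantity * p.ProductPrice AS TotalPrice
FROM "Order" o
JOIN Product p ON p.ProductID = o.ProductID
JOIN Customer c ON c.CustomerID = o.CustomerID;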

Conceptual Data Model

In the conceptual data model, an independent (strong) entity exists without reference to other entities, and its primary key does not include the key of a parent entity. A child (dependent) entity, by contrast, can only exist in the context of its parent entity.
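The distinction can be sketched in SQL. The OrderLine table below is a hypothetical child entity introduced only for illustration; it is not part of the conceptual model above.

-- Independent (strong) entity: can exist on its own.
CREATE TABLE "Order" (
    OrderID   INT PRIMARY KEY,
    OrderDate DATE
);

-- Dependent (child) entity: its primary key includes the parent key,
-- so a line cannot exist without the Order it belongs to.
CREATE TABLE OrderLine (
    OrderID INT REFERENCES "Order"(OrderID),
    LineNo  INT,
    PRIMARY KEY (OrderID, LineNo)
);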

Logical Model
Database Normalization

Normalization is a database technique for reducing data redundancy and is used when designing or redesigning a database (Leong, 2011). It is a set of guidelines and processes applied to reduce data redundancy during database design; the normal forms are the actual normalization guidelines. A database that is not normalized increases disk usage, reduces database efficiency, compromises data integrity, exposes the data to security threats, and slows query processing.

UNF

In this form, all the data is held in a single table, and the same values are repeated over and over.
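As a hedged illustration of what such an unnormalized table might look like for the Ushop system, the columns below are assumptions drawn from the entities listed earlier.

-- Hypothetical unnormalized table: customer, product and order
-- details are all mixed together and repeated on every row.
CREATE TABLE UShop_UNF (
    OrderID         INT,
    OrderDate       DATE,
    CustomerName    VARCHAR(100),
    CustomerAddress VARCHAR(200),
    PhoneNumber     VARCHAR(20),
    ProductName     VARCHAR(100),
    ProductPrice    DECIMAL(10,2),
    OrderQuantity   INT
);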

1NF

The goal of 1NF is to break the base data down into tables, or logical units. After each table is designed, it is assigned a primary key.

First normal form has been achieved by breaking the data down into logical units containing related information, as shown in figure 3 above. Each table is assigned a primary key to ensure that there is no repetition in any of the groups (Demba, 2013). We now have smaller, more manageable tables instead of one big table.
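A minimal sketch of the 1NF tables in SQL is given below. The attribute names follow the conceptual model; the CustomerID and ProductID columns in Order are assumptions added so that the tables can be linked.

CREATE TABLE Customer (
    CustomerID      INT PRIMARY KEY,
    CustomerName    VARCHAR(100),
    CustomerAddress VARCHAR(200),
    PhoneNumber     VARCHAR(20)
);

CREATE TABLE Product (
    ProductID       INT PRIMARY KEY,
    ProductName     VARCHAR(100),
    ProductPrice    DECIMAL(10,2),
    ProductQuantity INT
);

CREATE TABLE "Order" (
    OrderID       INT PRIMARY KEY,
    OrderDate     DATE,
    OrderQuantity INT,
    CustomerID    INT REFERENCES Customer(CustomerID), -- assumed link
    ProductID     INT REFERENCES Product(ProductID)    -- assumed link
);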

2NF

The goal of the second normal form is to further extract data that is partially dependent on the primary key to create another table.

Second normal form is derived from first normal form, as shown in the figure above, by further breaking the 1NF tables down into more specific logical units.

In this case, the Employee table was broken down further into two more tables, one containing the position description and the other containing payment information, since position details do not need to reside in the Employee table.
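A hedged sketch of that split is shown below. The Position and Payment table structures, and the PositionID key linking Position to Employee, are assumptions made for illustration.

-- Hypothetical 2NF split: position details are stored once per
-- position instead of being repeated for every employee.
CREATE TABLE Position (
    PositionID          INT PRIMARY KEY,
    EmployeeJobTitle    VARCHAR(100),
    PositionDescription VARCHAR(200)  -- assumed column
);

CREATE TABLE Employee (
    EmployeeID      INT PRIMARY KEY,
    EmployeeName    VARCHAR(100),
    EmployeeAddress VARCHAR(200),
    PositionID      INT REFERENCES Position(PositionID)
);

-- Payment information kept in its own table, linked to Employee.
CREATE TABLE Payment (
    PaymentID  INT PRIMARY KEY,
    EmployeeID INT REFERENCES Employee(EmployeeID),
    Salary     DECIMAL(10,2),  -- assumed column
    PayDate    DATE            -- assumed column
);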

Normalization offers many advantages to a database, including minimized data redundancy, a more flexible database design, better organization of the database, better security handling, and improved data consistency (Duggal, Srivastav and Kaur, 2014).

ER Diagram

Denormalization is a technique for optimizing a database by adding redundant data to some tables. This helps the business avoid costly join operations in a relational database. However, this does not mean that normalization is unimportant; denormalization is only required in specific cases and is normally carried out after normalization has been done. The assumption made with denormalization is that users are comfortable with some extra effort when updating the database and with a few redundancies (Hutcheson, 2012). The benefits of denormalization include faster data retrieval, because fewer joins are needed and queries become simpler. However, denormalization also has setbacks: inserts and updates are costlier and more difficult to write, data inconsistency increases, and more storage is required because of the redundant data.
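As a hedged sketch, a denormalized order table for the Ushop system might copy the product name and price into each order row so that a common read query needs no join. The table and column names below are assumptions.

-- Hypothetical denormalized table: ProductName and ProductPrice are
-- copied from Product so reads avoid a join, at the cost of redundancy
-- and extra work whenever a product's details change.
CREATE TABLE OrderDenormalized (
    OrderID       INT PRIMARY KEY,
    OrderDate     DATE,
    CustomerID    INT,
    ProductID     INT,
    ProductName   VARCHAR(100),  -- redundant copy
    ProductPrice  DECIMAL(10,2), -- redundant copy
    OrderQuantity INT
);

-- Join-free read made possible by the redundant columns.
SELECT OrderID, ProductName, OrderQuantity * ProductPrice AS TotalPrice
FROM OrderDenormalized;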

If the Ushop System requires expansion or scalability, a combination of normalized and denormalized database elements can be employed. Normalization is carried out to free the collection of relations from undesirable insertion, update, and deletion dependencies (Sharmila and Anusya, 2012). It also minimizes the need to restructure the relations as new types of data are introduced, makes the relational model more informative to users, and makes the relations neutral to the query statistics.

Conclusion

The database holds crucial information that businesses use to execute their daily operations and to make vital decisions, so it should be kept secure at all times. Normalization is a database technique for reducing data redundancy and is used when designing or redesigning a database; it is a set of guidelines and processes applied during database design. It offers many advantages, including minimized data redundancy, a more flexible database design, better organization of the database, better security handling, and improved data consistency. Denormalization is normally carried out after normalization has been done, on the assumption that users are comfortable with some extra effort when updating the database and with a few redundancies.

Reference List

Demba, M. (2013). Algorithm for Relational Database Normalization Up to 3NF. International Journal of Database Management Systems, 5(3), pp.39-51.

Duggal, K., Srivastav, A. and Kaur, S. (2014). Gamified Approach to Database Normalization. International Journal of Computer Applications, 93(4), pp.47-53.

Galatescu, A., Greceanu, T. and Nicolau, D. (2011). Using a semantically enhanced database for business service and process modelling and integration. International Journal of Intelligent Information and Database Systems, 5(5), p.468.

Hutcheson, G. (2012). Missing Data: data replacement and imputation. Journal of Modelling in Management, 7(2).

Leong, L. (2011). The Issues And Solutions Of Integrating DBMS To A Multi-DBMS. Review of Business Information Systems (RBIS), 7(2), p.73.

Sharmila, S. and Anusya, A. (2012). Database Design for Scheduling System using Normalization and SQL. Paripex – Indian Journal Of Research, 3(7), pp.1-4.

Sul, W., Yeom, H. and Jung, H. (2018). Towards Sustainable High-Performance Transaction Processing in Cloud-based DBMS. Cluster Computing.