You might be wondering how we created this database from our csv files. Most databases provide functions to import data from csv and other types of files. It is also possible to load data into the database programmatically from within R, one row at a time, using insert statements, but it is more common to load data from csv files. Note that since there is little data modeling within a csv file (the data does not have to be normalized or tidy), and no data type or value constraints can be enforced, a lot of things can go wrong. Putting data in a database is thus a great opportunity to implement QA/QC and help you keep your data clean and tidy moving forward as new data are collected.
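To make the row-at-a-time alternative concrete, here is a minimal sketch in Python using the standard library's `sqlite3` module as a stand-in for DuckDB (the table and column names here are hypothetical, simplified for illustration):

```python
import sqlite3

# In-memory database standing in for the lesson's DuckDB database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE Bird_eggs_demo (Egg_num INTEGER, Length REAL)")

# Load data one row at a time with insert statements, as described above.
rows = [(1, 42.3), (2, 41.9), (3, 43.0)]
for egg_num, length in rows:
    con.execute("INSERT INTO Bird_eggs_demo VALUES (?, ?)", (egg_num, length))
con.commit()

count = con.execute("SELECT COUNT(*) FROM Bird_eggs_demo").fetchone()[0]
print(count)  # 3
```

This works, but for thousands of rows a bulk csv import is both simpler and faster, which is why the examples that follow use a `COPY` statement instead.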
To look at one example, below is the SQL code that was used to create the `Bird_eggs` table:
```{sql eval=FALSE}
CREATE TABLE Bird_eggs (
  ...
);
COPY Bird_eggs FROM 'ASDN_Bird_eggs.csv' (header TRUE);
```
DuckDB's `COPY` SQL command reads a csv file into a database table. Had we not already created the table in the previous statement, DuckDB would have created it automatically and guessed at column names and data types. But by explicitly declaring the table, we are able to add more characterization to the data. Notable in the above:
- `NOT NULL` indicates that missing values are not allowed.
- Constraints (e.g., `Egg_num BETWEEN 1 AND 20`) express our expectations about the data.
- A `FOREIGN KEY` declares that a value must refer to an existing value in another table, i.e., it must be a reference.
- A `PRIMARY KEY` identifies a quantity that should be unique within each row, and that serves as a row identifier.
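The key declarations above are likewise enforceable, not just descriptive. A sketch using Python's built-in `sqlite3` module as a stand-in for DuckDB (the `Nest` and `Egg` tables here are hypothetical; note that SQLite, unlike DuckDB, requires opting in to foreign key checks):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")  # SQLite needs this opt-in for FK checks
con.execute("CREATE TABLE Nest (Nest_ID TEXT PRIMARY KEY)")
con.execute(
    "CREATE TABLE Egg (Nest_ID TEXT REFERENCES Nest (Nest_ID), Egg_num INTEGER)"
)

con.execute("INSERT INTO Nest VALUES ('N1')")
con.execute("INSERT INTO Egg VALUES ('N1', 1)")        # valid: nest N1 exists
rejected = False
try:
    con.execute("INSERT INTO Egg VALUES ('N999', 1)")  # invalid: no such nest
except sqlite3.IntegrityError:
    rejected = True                                    # the reference is refused
print(rejected)  # True
```

A row referring to a nonexistent nest never enters the table, so referential integrity cannot silently degrade as data are added.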
Understand that a table declaration serves as more than documentation; the database actually enforces constraints.
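As an illustration of that enforcement, here is a sketch using Python's built-in `sqlite3` module as a stand-in for DuckDB (a simplified, hypothetical version of the table, keeping only the `Egg_num` constraint discussed above):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# A simplified table carrying the kinds of constraints discussed above.
con.execute(
    "CREATE TABLE Bird_eggs_demo ("
    "  Egg_num INTEGER NOT NULL CHECK (Egg_num BETWEEN 1 AND 20),"
    "  Length REAL)"
)

con.execute("INSERT INTO Bird_eggs_demo VALUES (5, 42.3)")  # in range: accepted
errors = []
for bad_row in [(99, 42.3), (None, 42.3)]:  # out of range; missing value
    try:
        con.execute("INSERT INTO Bird_eggs_demo VALUES (?, ?)", bad_row)
    except sqlite3.IntegrityError as exc:
        errors.append(str(exc))            # both violations are rejected
print(len(errors))  # 2
```

Bad rows are rejected at load time rather than discovered later during analysis, which is exactly the QA/QC benefit described at the start of this section.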