You can use the dtypes attribute to check the data type of columns in a Pandas DataFrame. You can check either a single column or all the columns at once. The following is sample code.
Check Data Types for All Columns in Pandas
import pandas as pd
car_data = {'Brand': ['Tesla', 'Tesla', 'Tesla', 'Ford'],
            'Location': ['CA', 'CA', 'NY', 'MA'],
            'Year': [2019, 2018, 2020, 2019],
            'DateTime': [pd.Timestamp('20190310'), pd.Timestamp('20180311'),
                         pd.Timestamp('20200101'), pd.Timestamp('20190324')]}
car_data = pd.DataFrame(data=car_data)
print(car_data)
   Brand Location  Year   DateTime
0  Tesla       CA  2019 2019-03-10
1  Tesla       CA  2018 2018-03-11
2  Tesla       NY  2020 2020-01-01
3   Ford       MA  2019 2019-03-24
# Check data types of all columns
car_data.dtypes
Brand               object
Location            object
Year                 int64
DateTime    datetime64[ns]
dtype: object
Check Data Types of a Specific Column
# Check Data Types of a Specific Column
car_data.Year.dtypes
dtype('int64')
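Attribute access such as car_data.Year only works when the column name is a valid Python identifier. As a small alternative sketch, bracket indexing works for any column name, and the singular dtype attribute returns the same result for a single column:

# Equivalent check using bracket indexing and the singular dtype attribute
car_data['Year'].dtype
dtype('int64')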
# Check Data Types of a Specific Column
car_data.DateTime.dtypes
dtype('<M8[ns]')
Note that datetime64[ns] is the general dtype name, while <M8[ns] is the specific, byte-order-aware spelling of the same dtype.
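If you want to confirm that the two spellings refer to the same underlying NumPy dtype, a minimal check using NumPy directly (which Pandas builds on) is:

import numpy as np

# '<M8[ns]' is the little-endian spelling of 'datetime64[ns]';
# comparing the two NumPy dtypes shows they are the same type
np.dtype('<M8[ns]') == np.dtype('datetime64[ns]')
True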
Additional Note
Note that Pandas uses the object dtype to indicate string columns. The following is a quote from Stack Overflow on this topic. This is the link for the discussion on Stack Overflow.
Since release 0.11.1 there is an auto-conversion from dtype=str to dtype=object whenever it is seen, so it does not matter what you use, although I would advise avoiding str altogether and just use dtype=object.
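You can see this auto-conversion in a quick sketch (using a hypothetical brands Series): even when the data is created with dtype=str, Pandas reports the dtype as object.

# Passing dtype=str still produces an object column; dtype('O') means object
brands = pd.Series(['Tesla', 'Ford'], dtype=str)
brands.dtype
dtype('O')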