committing pending changes

This commit is contained in:
Gourav Kumar 2023-03-25 11:18:25 +05:30
parent 3a5ca91234
commit 6cf56ddf11
3 changed files with 81 additions and 10 deletions


@ -1,22 +1,28 @@
# PyFacts
PyFacts stands for Python library for Financial analysis and computations on time series. It is a library which makes it simple to work with time series data.
Most libraries, and languages like SQL, work with rows. Operations are performed by rows and not by dates. For instance, to calculate 1-year rolling returns in SQL, you are forced to use either a lag of 365/252 rows, leading to an approximation, or slow and cumbersome joins. PyFacts solves this by allowing you to work with dates and time intervals. Hence, to calculate 1-year returns, you specify a lag of 1 year and the library does the grunt work of finding the most appropriate observations to calculate these returns on.
## The problem
Libraries and languages usually don't allow comparison based on dates. Calculating month-on-month or year-on-year returns is always cumbersome, as users are forced to rely on row lags. However, data always has inconsistencies, especially financial data. Markets don't work on weekends, there are off days, data doesn't get released on a few days a year, and data availability is patchy when dealing with 40-year-old data. All these problems are exacerbated when you are forced to make calculations using row lags.
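The date-based lookup described above can be sketched in plain Python, independent of PyFacts: keep the dates sorted and look up the observation closest to an exact calendar offset, rather than jumping back a fixed number of rows.

```python
import bisect
import datetime

# Trading dates with gaps (weekends/holidays missing), as in real market data
dates = [
    datetime.date(2021, 3, 31),
    datetime.date(2021, 4, 1),
    datetime.date(2022, 3, 30),  # note: 2022-04-01 itself is absent
    datetime.date(2022, 4, 4),
]

def closest_on_or_before(dates, target):
    """Return the latest observation on or before `target`, or None."""
    i = bisect.bisect_right(dates, target)
    return dates[i - 1] if i else None

# A 1-year lag by date, not by row count: find the observation closest
# to exactly one year before 2022-04-01
target = datetime.date(2022, 4, 1).replace(year=2021)
print(closest_on_or_before(dates, target))  # 2021-04-01
```

A fixed 252-row lag on this data would land on a different date depending on how many holidays fell in between; the date-based lookup does not.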
## The Solution
PyFacts aims to simplify things by allowing you to:

- Compare time-series data based on dates and time-period-based lags
- Work around missing dates by taking the closest available data points
- Complete series with missing data points using forward fill and backward fill
- Use friendly dates everywhere, written as simple strings
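Forward fill and backward fill, mentioned above, can be illustrated with a minimal plain-Python sketch (a conceptual illustration only, not the PyFacts implementation):

```python
def ffill(series):
    """Fill missing (None) points with the last known value."""
    filled, last = [], None
    for date, value in series:
        if value is None:
            value = last
        last = value
        filled.append((date, value))
    return filled

def bfill(series):
    """Fill missing (None) points with the next known value."""
    # bfill is just ffill run over the series in reverse
    return list(reversed(ffill(list(reversed(series)))))

series = [("2022-01-01", 10), ("2022-01-02", None), ("2022-01-03", 12)]
print(ffill(series))  # the gap takes the previous value, 10
print(bfill(series))  # the gap takes the next value, 12
```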
## Creating a time series
Time series data can be created from a dictionary, a list of lists/tuples/dicts, or by reading a CSV file.
Example:
```
>>> import pyfacts as pft
@ -33,6 +39,7 @@ Example:
```
### Sample usage
```
>>> ts.calculate_returns(as_on='2021-04-01', return_period_unit='months', return_period_value=3, annual_compounded_returns=False)
(datetime.datetime(2021, 4, 1, 0, 0), 0.6)
@ -42,21 +49,24 @@ Example:
```
### Working with dates
With PyFacts, you never have to go through the hassle of creating datetime objects for your time series. PyFacts will parse any date passed to it as a string. The default format is the ISO format, i.e., YYYY-MM-DD. However, you can use your preferred format simply by specifying it in the options, in a datetime-library-compatible format, after importing the library. For example, to use the DD-MM-YYYY format:
```
>>> import pyfacts as pft
>>> pft.PyfactsOptions.date_format = '%d-%m-%Y'
```
Now the library will automatically parse all dates as DD-MM-YYYY.
If you happen to have a situation where you need to use a different format, all methods accept a date_format parameter to override the default.
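The format strings involved are the standard datetime format codes; as a quick plain-Python reminder of how they behave:

```python
import datetime

# '%d-%m-%Y' parses DD-MM-YYYY dates:
parsed = datetime.datetime.strptime("25-03-2023", "%d-%m-%Y")
print(parsed)  # 2023-03-25 00:00:00

# The ISO default, YYYY-MM-DD, corresponds to '%Y-%m-%d':
iso = datetime.datetime.strptime("2023-03-25", "%Y-%m-%d")
assert parsed == iso  # both strings name the same date
```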
### Working with multiple time series
While working with time series data, you will often need to perform calculations on the data. PyFacts supports all kinds of mathematical operations on time series.
Example:
```
>>> import pyfacts as pft
@ -83,6 +93,7 @@ TimeSeries([(datetime.datetime(2022, 1, 1, 0, 0), 0.1),
Mathematical operations can also be done between time series as long as they have the same dates.
Example:
```
>>> import pyfacts as pft
@ -110,6 +121,7 @@ TimeSeries([(datetime.datetime(2022, 1, 1, 0, 0), 1.0),
However, if the dates are not in sync, PyFacts provides convenience methods for synchronising dates.
Example:
```
>>> import pyfacts as pft
@ -146,6 +158,7 @@ TimeSeries([(datetime.datetime(2022, 1, 1, 0, 0), 20.0),
Even if you need to perform calculations on data with different frequencies, PyFacts will let you easily handle this with the expand and shrink methods.
Example:
```
>>> data = [
...     ("2022-01-01", 10),
@ -176,6 +189,7 @@ TimeSeries([(datetime.datetime(2022, 1, 1, 0, 0), 10.0),
If you want to shorten the timeframe of the data with an aggregation function, the transform method will help you out. Currently it supports sum and mean.
Example:
```
>>> data = [
...     ("2022-01-01", 10),
@ -208,11 +222,11 @@ TimeSeries([(datetime.datetime(2022, 1, 1, 0, 0), 12.0),
(datetime.datetime(2022, 10, 1, 0, 0), 30.0)], frequency='Q')
```
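The aggregation that transform performs can be sketched in plain Python: group observations into coarser date buckets, then apply sum or mean per bucket. This is a conceptual sketch, not the PyFacts implementation:

```python
import datetime
from collections import defaultdict
from statistics import mean

def transform(series, bucket_start, agg):
    """Aggregate (date, value) points into coarser buckets with sum or mean."""
    buckets = defaultdict(list)
    for date, value in series:
        buckets[bucket_start(date)].append(value)
    return sorted((start, agg(values)) for start, values in buckets.items())

def quarter_start(date):
    """First day of the calendar quarter containing `date`."""
    return date.replace(month=(date.month - 1) // 3 * 3 + 1, day=1)

series = [
    (datetime.date(2022, 1, 1), 10),
    (datetime.date(2022, 2, 1), 12),
    (datetime.date(2022, 3, 1), 14),
    (datetime.date(2022, 4, 1), 20),
]
print(transform(series, quarter_start, mean))  # Q1 mean is 12, Q2 mean is 20
```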
## To-do
### Core features
- [x] Add `__setitem__`
- [ ] Create empty TimeSeries object
- [x] Read from CSV
- [ ] Write to CSV
@ -220,18 +234,20 @@ TimeSeries([(datetime.datetime(2022, 1, 1, 0, 0), 12.0),
- [x] Convert to list of tuples
### pyfacts features
- [x] Sync two TimeSeries
- [x] Average rolling return
- [x] Sharpe ratio
- [x] Jensen's Alpha
- [x] Beta
- [x] Sortino ratio
- [x] Correlation & R-squared
- [ ] Treynor ratio
- [x] Max drawdown
- [ ] Moving average
### Pending implementation
- [x] Use limit parameter in ffill and bfill
- [x] Implementation of ffill and bfill may be incorrect inside expand, check and correct
- [ ] Implement interpolation in expand


@ -2,3 +2,26 @@ from .core import *
from .pyfacts import *
from .statistics import *
from .utils import *
__author__ = "Gourav Kumar"
__email__ = "gouravkr@outlook.in"
__version__ = "0.0.1"
__doc__ = """
PyFacts stands for Python library for Financial analysis and computations on time series.
It is a library which makes it simple to work with time series data.
Most libraries, and languages like SQL, work with rows. Operations are performed by rows
and not by dates. For instance, to calculate 1-year rolling returns in SQL, you are forced
to use either a lag of 365/252 rows, leading to an approximation, or slow and cumbersome
joins. PyFacts solves this by allowing you to work with dates and time intervals. Hence,
to calculate 1-year returns, you specify a lag of 1 year and the library does the grunt
work of finding the most appropriate observations to calculate these returns on.
PyFacts aims to simplify things by allowing you to:
* Compare time-series data based on dates and time-period-based lags
* Work around missing dates by taking the closest available data points
* Complete series with missing data points using forward fill and backward fill
* Use friendly dates everywhere, written as simple strings
"""


@ -7,7 +7,7 @@ from typing import Literal
from pyfacts.core import date_parser
from .pyfacts import TimeSeries, create_date_series
from .utils import _interval_to_years, _preprocess_from_to_date, covariance
from dateutil.relativedelta import relativedelta
@ -587,3 +587,35 @@ def sortino_ratio(
    sortino_ratio_value = excess_returns / sd
    return sortino_ratio_value
@date_parser(3, 4)
def moving_average(
    time_series_data: TimeSeries,
    moving_average_period_unit: Literal["years", "months", "days"],
    moving_average_period_value: int,
    from_date: str | datetime.datetime = None,
    to_date: str | datetime.datetime = None,
    as_on_match: str = "closest",
    prior_match: str = "closest",
    closest: Literal["previous", "next"] = "previous",
    date_format: str = None,
) -> TimeSeries:
    from_date, to_date = _preprocess_from_to_date(
        from_date,
        to_date,
        time_series_data,
        False,
        return_period_unit=moving_average_period_unit,
        return_period_value=moving_average_period_value,
        as_on_match=as_on_match,
        prior_match=prior_match,
        closest=closest,
    )
    dates = create_date_series(from_date, to_date, time_series_data.frequency.symbol)
    for date in dates:
        # datetime.timedelta does not accept years or months; relativedelta does
        start_date = date - relativedelta(**{moving_average_period_unit: moving_average_period_value})
        window = time_series_data[start_date:date]
        # TODO: aggregate each window (e.g. mean) into a new TimeSeries and
        # return it; the Moving average item is still open in the to-do list
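The pending moving_average above walks a date series and slices a trailing window for each date. A self-contained plain-Python sketch of the intended computation (simplified to a day-based window; the function and parameter names here are illustrative, not the PyFacts API):

```python
import datetime
from statistics import mean

def moving_average(series, window_days):
    """Trailing moving average over a date window (not a row count).

    `series` is a sorted list of (date, value) pairs; each output point
    averages all observations in the preceding `window_days` days.
    """
    result = []
    for date, _ in series:
        start = date - datetime.timedelta(days=window_days)
        window = [v for d, v in series if start < d <= date]
        result.append((date, mean(window)))
    return result

series = [
    (datetime.date(2022, 1, 1), 10),
    (datetime.date(2022, 1, 2), 12),
    (datetime.date(2022, 1, 5), 14),  # gap: Jan 3 and 4 are missing
]
print(moving_average(series, window_days=3))
```

Because the window is defined by dates, the gap on Jan 3-4 simply yields a smaller window instead of silently shifting the lookback period, which is the row-lag problem described in the README.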