Compare commits

...

18 Commits

Author SHA1 Message Date
server 7a0b198c19 first stab at some files for pip 2020-02-22 11:16:06 -07:00
ml server 2a69646882 dont sys.exit() 2020-02-21 22:13:51 -07:00
server ae9e298a8b Voila start 2020-02-21 21:42:26 -07:00
ml server 2374027930 no print so much 2020-02-21 21:32:21 -07:00
server 7f80a61a45 Voila! 2020-02-21 21:28:40 -07:00
ml server 3829dafe64 ignore jupyter checkpoints 2020-02-21 21:03:43 -07:00
ml server c5daa6a182 add jupter config script 2020-02-21 21:02:54 -07:00
server c7b69b0175 jupyter install/config 2020-02-21 21:02:33 -07:00
ml server 3f4b4ae717 jupyter working with explicit filename... data_MATCH.csv 2020-02-21 20:59:34 -07:00
ml server c2e670841a move notebook to base dir 2020-02-21 19:53:35 -07:00
ml server fd5d54122e karoo_gp as jupyter notebook 2020-02-21 19:50:27 -07:00
ml server f7720564e0 blank... 2020-02-21 19:49:48 -07:00
server c7e7817c16 typo 2020-02-21 19:38:18 -07:00
server e39d23c46c clone to jupyter file 2020-02-21 19:36:57 -07:00
server 0fc2fbda0c update to tf2 2020-02-21 19:34:03 -07:00
server ab4249ae2c deps 2020-02-21 19:02:27 -07:00
server aad6d6aace ignore vi swapfiles 2020-02-21 19:02:13 -07:00
server fbb3e36899 Tensorflow 2.1 install 2020-02-21 18:58:21 -07:00
9 changed files with 1451 additions and 12 deletions

5
.gitignore vendored
View File

@@ -1,3 +1,8 @@
.venv/
modules/__pycache__/
runs/
*.swp
.ipynb_checkpoints/
build/
dist/
karoo_gp_kstaats.egg-info/

7
README-pip.md 100644
View File

@@ -0,0 +1,7 @@
With `setup.py` in place, install the build tooling and build the source and wheel distributions:
```
python3 -m pip install --user --upgrade setuptools wheel
python3 setup.py sdist bdist_wheel
```
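To sanity-check the build, the wheel left in `dist/` can be installed locally with pip (a minimal sketch; the exact filename depends on the version that was built):
```
python3 -m pip install --user dist/*.whl
```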

146
README.md
View File

@@ -19,7 +19,7 @@ Learn more at <a href="http://kstaats.github.io/karoo_gp/">kstaats.github.io/kar
# Debian Install
To install on Debian (Buster, Stable, 10):
To install on Debian (Buster/Stable/10):
```
# Install Dependencies:
@@ -42,3 +42,147 @@ pip3 install numpy==1.16.4 sympy==1.4 tensorflow==1.13.1 scikit-learn==0.21.2
python3 karoo_gp.py
```
# Tensorflow 2 Install
Using karoo_gp with Tensorflow 2 on Debian (Buster/Stable/10).
```
# Install Dependencies:
sudo apt install python3 python3-pip python3-venv
# Clone repo (choose this one or upstream):
# git clone https://github.com/kstaats/karoo_gp
git clone https://spacecruft.org/spacecruft/karoo_gp
# Setup
cd karoo_gp
python3 -m venv .venv
source .venv/bin/activate
pip3 install --upgrade pip
pip3 install --upgrade setuptools
# Install python dependencies
pip3 install wheel
pip3 install sklearn sympy
# This will install Tensorflow 2.1:
pip3 install tensorflow
# Run application
python3 karoo_gp.py
```
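To confirm the TensorFlow 2 install worked, a quick check from inside the still-activated venv (expected output is along the lines of `2.1.0`):
```
python3 -c "import tensorflow as tf; print(tf.__version__)"
```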
# Jupyter
Jupyter is a browser-based notebook interface to Python.
## Jupyter Install
To install everything for Jupyter with `pip3 install --user`, rather than in a virtual environment:
```
# Install Dependencies:
sudo apt install python3 python3-pip
# Clone repo (choose this one or upstream):
git clone https://spacecruft.org/spacecruft/karoo_gp
# Setup
cd karoo_gp
git checkout jupyter
pip3 install --user --upgrade pip
pip3 install --user --upgrade setuptools
# Install python dependencies
pip3 install --user wheel
pip3 install --user sklearn sympy
# This will install Tensorflow 2.1:
pip3 install --user tensorflow
```
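The block above installs the Python dependencies but not JupyterLab itself; if `jupyter-lab` is not already on the system, presumably it gets installed the same way (an assumption, since that step is not shown here):
```
pip3 install --user jupyterlab
```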
## Run Jupyter
Copy the Jupyter config script to an appropriate place (for example `~/.jupyter/`), then run something similar to this:
```
jupyter-lab \
--config=/home/jebba/.jupyter/jupyter_karoo_gp_config.py \
--debug \
--notebook-dir=/home/jebba/devel/spacecruft/karoo_gp \
--app-dir=/home/jebba/.local/share/jupyter/lab \
1>>/home/jebba/log/karoo.log 2>>/home/jebba/log/karoo.err &
```
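Note the command appends stdout and stderr to files under `/home/jebba/log/`; that directory has to exist before launch or the shell redirections will fail. A small prep step, assuming the same paths:
```
mkdir -p /home/jebba/log
```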
# Voila
Voila can take Jupyter notebooks and turn them into websites.
Example site below:
* https://karoo.spacecruft.org/
## Install Voila
For more background, see similar setup docs, for example in satnogs-wut.
```
# Install Dependencies:
sudo apt install python3 python3-pip python3-venv
# Clone repo:
git clone https://spacecruft.org/spacecruft/karoo_gp
# Setup
cd karoo_gp
git checkout jupyter
pip3 install --user --upgrade pip
pip3 install --user --upgrade setuptools
# Install python dependencies
pip3 install --user wheel
pip3 install --user sklearn sympy
# This will install Tensorflow 2.1:
pip3 install --user tensorflow
```
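As with the Jupyter section, the block installs the dependencies but not Voila itself; presumably it is added the same way (an assumption, since the step is not shown here):
```
pip3 install --user voila
```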
## Voila Apache
Apache configuration snippet that reverse-proxies to Voila on port 8221 over HTTPS, using certificates from Certbot.
```
<VirtualHost karoo.spacecruft.org:443>
ServerAdmin webmaster@localhost
DocumentRoot /var/www/html
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
RewriteEngine on
#LoadModule proxy_module modules/mod_proxy.so
#LoadModule proxy_http_module modules/mod_proxy_http.so
RequestHeader set X-Forwarded-Proto 'https' env=HTTPS
RewriteCond %{HTTP:UPGRADE} ^WebSocket$ [NC]
RewriteCond %{HTTP:CONNECTION} Upgrade$ [NC]
RewriteRule /(.*) ws://127.0.0.1:8221/$1 [P]
<Location />
ProxyPass http://127.0.0.1:8221/
</Location>
ProxyVia On
ProxyPreserveHost On
Include /etc/letsencrypt/options-ssl-apache.conf
ServerName karoo.spacecruft.org
SSLCertificateFile /etc/letsencrypt/live/wut.spacecruft.org/fullchain.pem
SSLCertificateKeyFile /etc/letsencrypt/live/wut.spacecruft.org/privkey.pem
</VirtualHost>
```
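This snippet needs mod_rewrite, mod_headers, mod_proxy, mod_proxy_http, mod_proxy_wstunnel, and mod_ssl enabled. On Debian that would look roughly like the following (an assumption; some of these may already be enabled on the host):
```
sudo a2enmod rewrite headers proxy proxy_http proxy_wstunnel ssl
sudo systemctl restart apache2
```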
## Voila Start
Start `voila` with something like this:
```
voila \
--ExecutePreprocessor.timeout=600 \
--no-browser \
--port=8221 \
--theme=dark \
--autoreload=True \
--template=default \
--Voila.ip=localhost \
--VoilaConfiguration.enable_nbextensions=False \
karoo_gp.ipynb
```
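Voila then listens only on localhost port 8221, matching the Apache proxy above; a quick local check before testing through the proxy could be:
```
curl -I http://127.0.0.1:8221/
```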

0
__init__.py 100644
View File

View File

@@ -0,0 +1,964 @@
# Configuration file for jupyter-notebook.
#------------------------------------------------------------------------------
# Application(SingletonConfigurable) configuration
#------------------------------------------------------------------------------
## This is an application.
## The date format used by logging formatters for %(asctime)s
#c.Application.log_datefmt = '%Y-%m-%d %H:%M:%S'
## The Logging format template
#c.Application.log_format = '[%(name)s]%(highlevel)s %(message)s'
## Set the log level by value or name.
#c.Application.log_level = 30
#------------------------------------------------------------------------------
# JupyterApp(Application) configuration
#------------------------------------------------------------------------------
## Base class for Jupyter applications
## Answer yes to any prompts.
#c.JupyterApp.answer_yes = False
## Full path of a config file.
#c.JupyterApp.config_file = '/home/jebba/.jupyter/jupyter_notebook_config.py'
## Specify a config file to load.
#c.JupyterApp.config_file_name = '/home/jebba/.jupyter/jupyter_notebook_config.py'
## Generate default config file.
#c.JupyterApp.generate_config = False
#------------------------------------------------------------------------------
# NotebookApp(JupyterApp) configuration
#------------------------------------------------------------------------------
## Set the Access-Control-Allow-Credentials: true header
#c.NotebookApp.allow_credentials = False
## Set the Access-Control-Allow-Origin header
#
# Use '*' to allow any origin to access your server.
#
# Takes precedence over allow_origin_pat.
# XXX XXX XXX
c.NotebookApp.allow_origin = '*'
## Use a regular expression for the Access-Control-Allow-Origin header
#
# Requests from an origin matching the expression will get replies with:
#
# Access-Control-Allow-Origin: origin
#
# where `origin` is the origin of the request.
#
# Ignored if allow_origin is set.
# XXX
#c.NotebookApp.allow_origin_pat = ''
## Allow password to be changed at login for the notebook server.
#
# While logging in with a token, the notebook server UI will give the opportunity
# to the user to enter a new password at the same time that will replace the
# token login mechanism.
#
# This can be set to false to prevent changing password from the UI/API.
# XXX
#c.NotebookApp.allow_password_change = True
## Allow requests where the Host header doesn't point to a local server
#
# By default, requests get a 403 forbidden response if the 'Host' header shows
# that the browser thinks it's on a non-local domain. Setting this option to
# True disables this check.
#
# This protects against 'DNS rebinding' attacks, where a remote web server
# serves you a page and then changes its DNS to send later requests to a local
# IP, bypassing same-origin checks.
#
# Local IP addresses (such as 127.0.0.1 and ::1) are allowed as local, along
# with hostnames configured in local_hostnames.
# XXX
c.NotebookApp.allow_remote_access = True
## Whether to allow the user to run the notebook as root.
c.NotebookApp.allow_root = False
## DEPRECATED use base_url
#c.NotebookApp.base_project_url = '/'
## The base URL for the notebook server.
#
# Leading and trailing slashes can be omitted, and will automatically be added.
# XXX
c.NotebookApp.base_url = 'jupyter'
## Specify what command to use to invoke a web browser when opening the notebook.
# If not specified, the default browser will be determined by the `webbrowser`
# standard library module, which allows setting of the BROWSER environment
# variable to override it.
#c.NotebookApp.browser = ''
## The full path to an SSL/TLS certificate file.
#c.NotebookApp.certfile = ''
## The full path to a certificate authority certificate for SSL/TLS client
# authentication.
#c.NotebookApp.client_ca = ''
## The config manager class to use
#c.NotebookApp.config_manager_class = 'notebook.services.config.manager.ConfigManager'
## The notebook manager class to use.
#c.NotebookApp.contents_manager_class = 'notebook.services.contents.largefilemanager.LargeFileManager'
## Extra keyword arguments to pass to `set_secure_cookie`. See tornado's
# set_secure_cookie docs for details.
#c.NotebookApp.cookie_options = {}
## The random bytes used to secure cookies. By default this is a new random
# number every time you start the Notebook. Set it to a value in a config file
# to enable logins to persist across server sessions.
#
# Note: Cookie secrets should be kept private, do not share config files with
# cookie_secret stored in plaintext (you can read the value from a file).
#c.NotebookApp.cookie_secret = b''
## The file where the cookie secret is stored.
#c.NotebookApp.cookie_secret_file = ''
## Override URL shown to users.
#
# Replace actual URL, including protocol, address, port and base URL, with the
# given value when displaying URL to the users. Do not change the actual
# connection URL. If authentication token is enabled, the token is added to the
# custom URL automatically.
#
# This option is intended to be used when the URL to display to the user cannot
# be determined reliably by the Jupyter notebook server (proxified or
# containerized setups for example).
c.NotebookApp.custom_display_url = 'https://ml.spacecruft.org/jupyter'
## The default URL to redirect to from `/`
# XXX
#c.NotebookApp.default_url = 'jupyter'
## Disable cross-site-request-forgery protection
#
# Jupyter notebook 4.3.1 introduces protection from cross-site request
# forgeries, requiring API requests to either:
#
# - originate from pages served by this server (validated with XSRF cookie and
# token), or - authenticate with a token
#
# Some anonymous compute resources still desire the ability to run code,
# completely without authentication. These services can disable all
# authentication and security checks, with the full knowledge of what that
# implies.
# XXX
c.NotebookApp.disable_check_xsrf = False
# XXX
## Whether to enable MathJax for typesetting math/TeX
#
# MathJax is the javascript library Jupyter uses to render math/LaTeX. It is
# very large, so you may want to disable it if you have a slow internet
# connection, or for offline use of the notebook.
#
# When disabled, equations etc. will appear as their untransformed TeX source.
#c.NotebookApp.enable_mathjax = True
## extra paths to look for Javascript notebook extensions
#c.NotebookApp.extra_nbextensions_path = []
## handlers that should be loaded at higher priority than the default services
#c.NotebookApp.extra_services = []
## Extra paths to search for serving static files.
#
# This allows adding javascript/css to be available from the notebook server
# machine, or overriding individual files in the IPython
# XXX
#c.NotebookApp.extra_static_paths = []
## Extra paths to search for serving jinja templates.
#
# Can be used to override templates from notebook.templates.
#c.NotebookApp.extra_template_paths = []
##
# XXX
#c.NotebookApp.file_to_run = ''
## Extra keyword arguments to pass to `get_secure_cookie`. See tornado's
# get_secure_cookie docs for details.
#c.NotebookApp.get_secure_cookie_kwargs = {}
## Deprecated: Use minified JS file or not, mainly use during dev to avoid JS
# recompilation
#c.NotebookApp.ignore_minified_js = False
## (bytes/sec) Maximum rate at which stream output can be sent on iopub before
# they are limited.
#c.NotebookApp.iopub_data_rate_limit = 1000000
## (msgs/sec) Maximum rate at which messages can be sent on iopub before they are
# limited.
#c.NotebookApp.iopub_msg_rate_limit = 1000
## The IP address the notebook server will listen on.
c.NotebookApp.ip = '127.0.0.1'
## Supply extra arguments that will be passed to Jinja environment.
#c.NotebookApp.jinja_environment_options = {}
## Extra variables to supply to jinja templates when rendering.
#c.NotebookApp.jinja_template_vars = {}
## The kernel manager class to use.
#c.NotebookApp.kernel_manager_class = 'notebook.services.kernels.kernelmanager.MappingKernelManager'
## The kernel spec manager class to use. Should be a subclass of
# `jupyter_client.kernelspec.KernelSpecManager`.
#
# The Api of KernelSpecManager is provisional and might change without warning
# between this version of Jupyter and the next stable one.
#c.NotebookApp.kernel_spec_manager_class = 'jupyter_client.kernelspec.KernelSpecManager'
## The full path to a private key file for usage with SSL/TLS.
#c.NotebookApp.keyfile = ''
## Hostnames to allow as local when allow_remote_access is False.
#
# Local IP addresses (such as 127.0.0.1 and ::1) are automatically accepted as
# local as well.
#c.NotebookApp.local_hostnames = ['localhost']
c.NotebookApp.local_hostnames = ['127.0.0.1', '185.165.171.164', 'ml.spacecruft.org']
## The login handler class to use.
#c.NotebookApp.login_handler_class = 'notebook.auth.login.LoginHandler'
## The logout handler class to use.
#c.NotebookApp.logout_handler_class = 'notebook.auth.logout.LogoutHandler'
## The MathJax.js configuration file that is to be used.
#c.NotebookApp.mathjax_config = 'TeX-AMS-MML_HTMLorMML-full,Safe'
## A custom url for MathJax.js. Should be in the form of a case-sensitive url to
# MathJax, for example: /static/components/MathJax/MathJax.js
#c.NotebookApp.mathjax_url = ''
## Sets the maximum allowed size of the client request body, specified in the
# Content-Length request header field. If the size in a request exceeds the
# configured value, a malformed HTTP message is returned to the client.
#
# Note: max_body_size is applied even in streaming mode.
#c.NotebookApp.max_body_size = 536870912
# XXX Default is 512 Megabytes: 536870912 bytes
# Perhaps do 4x more?
## Gets or sets the maximum amount of memory, in bytes, that is allocated for
# use by the buffer manager.
#c.NotebookApp.max_buffer_size = 536870912
c.NotebookApp.max_buffer_size = 2147483648
# XXX Number of open files.
## Gets or sets a lower bound on the open file handles process resource limit.
# This may need to be increased if you run into an OSError: [Errno 24] Too many
# open files. This is not applicable when running on Windows.
#c.NotebookApp.min_open_files_limit = 4096
c.NotebookApp.min_open_files_limit = 16384
## Dict of Python modules to load as notebook server extensions.Entry values can
# be used to enable and disable the loading of the extensions. The extensions
# will be loaded in alphabetical order.
#c.NotebookApp.nbserver_extensions = {}
## The directory to use for notebooks and kernels.
# XXX
#c.NotebookApp.notebook_dir = ''
## Whether to open in a browser after starting. The specific browser used is
# platform dependent and determined by the python standard library `webbrowser`
# module, unless it is overridden using the --browser (NotebookApp.browser)
# configuration option.
c.NotebookApp.open_browser = False
## Hashed password to use for web authentication.
#
# To generate, type in a python/IPython shell:
#
# from notebook.auth import passwd; passwd()
#
# The string should be of the form type:salt:hashed-password.
# XXX
#c.NotebookApp.password = ''
## Forces users to use a password for the Notebook server. This is useful in a
# multi user environment, for instance when everybody in the LAN can access each
# other's machine through ssh.
#
# In such a case, serving the notebook server on localhost is not secure since
# any user can connect to the notebook server via ssh.
# XXX
#c.NotebookApp.password_required = False
## The port the notebook server will listen on.
c.NotebookApp.port = 9999
## The number of additional ports to try if the specified port is not available.
c.NotebookApp.port_retries = 0
## DISABLED: use %pylab or %matplotlib in the notebook to enable matplotlib.
#c.NotebookApp.pylab = 'disabled'
## If True, display a button in the dashboard to quit (shutdown the notebook
# server).
#c.NotebookApp.quit_button = True
## (sec) Time window used to check the message and data rate limits.
#c.NotebookApp.rate_limit_window = 3
## Reraise exceptions encountered loading server extensions?
#c.NotebookApp.reraise_server_extension_failures = False
## DEPRECATED use the nbserver_extensions dict instead
#c.NotebookApp.server_extensions = []
## The session manager class to use.
#c.NotebookApp.session_manager_class = 'notebook.services.sessions.sessionmanager.SessionManager'
## Shut down the server after N seconds with no kernels or terminals running and
# no activity. This can be used together with culling idle kernels
# (MappingKernelManager.cull_idle_timeout) to shutdown the notebook server when
# it's not in use. This is not precisely timed: it may shut down up to a minute
# later. 0 (the default) disables this automatic shutdown.
#c.NotebookApp.shutdown_no_activity_timeout = 0
## Supply SSL options for the tornado HTTPServer. See the tornado docs for
# details.
#c.NotebookApp.ssl_options = {}
## Supply overrides for terminado. Currently only supports "shell_command".
#c.NotebookApp.terminado_settings = {}
## Set to False to disable terminals.
#
# This does *not* make the notebook server more secure by itself. Anything the
# user can do in a terminal, they can also do in a notebook.
#
# Terminals may also be automatically disabled if the terminado package is not
# available.
#c.NotebookApp.terminals_enabled = True
## Token used for authenticating first-time connections to the server.
#
# When no password is enabled, the default is to generate a new, random token.
#
# Setting to an empty string disables authentication altogether, which is NOT
# RECOMMENDED.
#c.NotebookApp.token = '<generated>'
## Supply overrides for the tornado.web.Application that the Jupyter notebook
# uses.
#c.NotebookApp.tornado_settings = {}
## Whether to trust or not X-Scheme/X-Forwarded-Proto and X-Real-Ip/X-Forwarded-
# For headers sent by the upstream reverse proxy. Necessary if the proxy handles
# SSL
c.NotebookApp.trust_xheaders = True
## Disable launching browser by redirect file
#
# For versions of notebook > 5.7.2, a security feature measure was added that
# prevented the authentication token used to launch the browser from being
# visible. This feature makes it difficult for other users on a multi-user
# system from running code in your Jupyter session as you.
#
# However, some environments (like Windows Subsystem for Linux (WSL) and
# Chromebooks), launching a browser using a redirect file can lead the browser
# failing to load. This is because of the difference in file structures/paths
# between the runtime and the browser.
#
# Disabling this setting to False will disable this behavior, allowing the
# browser to launch by using a URL and visible token (as before).
#c.NotebookApp.use_redirect_file = True
## DEPRECATED, use tornado_settings
#c.NotebookApp.webapp_settings = {}
## Specify Where to open the notebook on startup. This is the `new` argument
# passed to the standard library method `webbrowser.open`. The behaviour is not
# guaranteed, but depends on browser support. Valid values are:
#
# - 2 opens a new tab,
# - 1 opens a new window,
# - 0 opens in an existing window.
#
# See the `webbrowser.open` documentation for details.
#c.NotebookApp.webbrowser_open_new = 2
## Set the tornado compression options for websocket connections.
#
# This value will be returned from
# :meth:`WebSocketHandler.get_compression_options`. None (default) will disable
# compression. A dict (even an empty one) will enable compression.
#
# See the tornado docs for WebSocketHandler.get_compression_options for details.
#c.NotebookApp.websocket_compression_options = None
## The base URL for websockets, if it differs from the HTTP server (hint: it
# almost certainly doesn't).
#
# Should be in the form of an HTTP origin: ws[s]://hostname[:port]
# XXX
#c.NotebookApp.websocket_url = ''
#------------------------------------------------------------------------------
# LabApp(NotebookApp) configuration
#------------------------------------------------------------------------------
## The app directory to launch JupyterLab from.
c.LabApp.app_dir = '/home/jebba/.local/share/jupyter/lab'
## Whether to start the app in core mode. In this mode, JupyterLab will run using
# the JavaScript assets that are within the installed JupyterLab Python package.
# In core mode, third party extensions are disabled. The `--dev-mode` flag is an
# alias to this to be used when the Python package itself is installed in
# development mode (`pip install -e .`).
#c.LabApp.core_mode = False
## The default URL to redirect to from `/`
# XXX XXX
c.LabApp.default_url = '/jupyter/lab'
## Whether to start the app in dev mode. Uses the unpublished local JavaScript
# packages in the `dev_mode` folder. In this case JupyterLab will show a red
# stripe at the top of the page. It can only be used if JupyterLab is installed
# as `pip install -e .`.
#c.LabApp.dev_mode = False
## The override url for static lab assets, typically a CDN.
#c.LabApp.override_static_url = ''
# XXX
#c.LabApp.override_static_url = '/static'
## The override url for static lab theme assets, typically a CDN.
# XXX
#c.LabApp.override_theme_url = 'https://ml.spacecruft.org/jupyter'
## The directory for user settings.
# XXX
#c.LabApp.user_settings_dir = '/home/jebba/.jupyter/lab/user-settings'
## Whether to serve the app in watch mode
# XXX
#c.LabApp.watch = False
## The directory for workspaces
# XXX
#c.LabApp.workspaces_dir = '/home/jebba/.jupyter/lab/workspaces'
#------------------------------------------------------------------------------
# ConnectionFileMixin(LoggingConfigurable) configuration
#------------------------------------------------------------------------------
## Mixin for configurable classes that work with connection files
## JSON file in which to store connection info [default: kernel-<pid>.json]
#
# This file will contain the IP, ports, and authentication key needed to connect
# clients to this kernel. By default, this file will be created in the security
# dir of the current profile, but can be specified by absolute path.
#c.ConnectionFileMixin.connection_file = ''
## set the control (ROUTER) port [default: random]
#c.ConnectionFileMixin.control_port = 0
## set the heartbeat port [default: random]
#c.ConnectionFileMixin.hb_port = 0
## set the iopub (PUB) port [default: random]
#c.ConnectionFileMixin.iopub_port = 0
## Set the kernel's IP address [default localhost]. If the IP address is
# something other than localhost, then Consoles on other machines will be able
# to connect to the Kernel, so be careful!
c.ConnectionFileMixin.ip = '127.0.0.1'
## set the shell (ROUTER) port [default: random]
#c.ConnectionFileMixin.shell_port = 0
## set the stdin (ROUTER) port [default: random]
#c.ConnectionFileMixin.stdin_port = 0
##
#c.ConnectionFileMixin.transport = 'tcp'
#------------------------------------------------------------------------------
# KernelManager(ConnectionFileMixin) configuration
#------------------------------------------------------------------------------
## Manages a single kernel in a subprocess on this host.
#
# This version starts kernels with Popen.
# XXX
## Should we autorestart the kernel if it dies.
#c.KernelManager.autorestart = True
c.KernelManager.autorestart = False
## DEPRECATED: Use kernel_name instead.
#
# The Popen Command to launch the kernel. Override this if you have a custom
# kernel. If kernel_cmd is specified in a configuration file, Jupyter does not
# pass any arguments to the kernel, because it cannot make any assumptions about
# the arguments that the kernel understands. In particular, this means that the
# kernel does not receive the option --debug if it given on the Jupyter command
# line.
#c.KernelManager.kernel_cmd = []
# XXX
## Time to wait for a kernel to terminate before killing it, in seconds.
#c.KernelManager.shutdown_wait_time = 5.0
#------------------------------------------------------------------------------
# Session(Configurable) configuration
#------------------------------------------------------------------------------
## Object for handling serialization and sending of messages.
#
# The Session object handles building messages and sending them with ZMQ sockets
# or ZMQStream objects. Objects can communicate with each other over the
# network via Session objects, and only need to work with the dict-based IPython
# message spec. The Session will handle serialization/deserialization, security,
# and metadata.
#
# Sessions support configurable serialization via packer/unpacker traits, and
# signing with HMAC digests via the key/keyfile traits.
#
# Parameters ----------
#
# debug : bool
# whether to trigger extra debugging statements
# packer/unpacker : str : 'json', 'pickle' or import_string
# importstrings for methods to serialize message parts. If just
# 'json' or 'pickle', predefined JSON and pickle packers will be used.
# Otherwise, the entire importstring must be used.
#
# The functions must accept at least valid JSON input, and output *bytes*.
#
# For example, to use msgpack:
# packer = 'msgpack.packb', unpacker='msgpack.unpackb'
# pack/unpack : callables
# You can also set the pack/unpack callables for serialization directly.
# session : bytes
# the ID of this Session object. The default is to generate a new UUID.
# username : unicode
# username added to message headers. The default is to ask the OS.
# key : bytes
# The key used to initialize an HMAC signature. If unset, messages
# will not be signed or checked.
# keyfile : filepath
# The file containing a key. If this is set, `key` will be initialized
# to the contents of the file.
## Threshold (in bytes) beyond which an object's buffer should be extracted to
# avoid pickling.
#c.Session.buffer_threshold = 1024
## Whether to check PID to protect against calls after fork.
#
# This check can be disabled if fork-safety is handled elsewhere.
#c.Session.check_pid = True
# XXX
## Threshold (in bytes) beyond which a buffer should be sent without copying.
#c.Session.copy_threshold = 65536
## Debug output in the Session
#c.Session.debug = False
## The maximum number of digests to remember.
#
# The digest history will be culled when it exceeds this value.
#c.Session.digest_history_size = 65536
## The maximum number of items for a container to be introspected for custom
# serialization. Containers larger than this are pickled outright.
#c.Session.item_threshold = 64
## execution key, for signing messages.
#c.Session.key = b''
## path to file containing execution key.
#c.Session.keyfile = ''
## Metadata dictionary, which serves as the default top-level metadata dict for
# each message.
#c.Session.metadata = {}
## The name of the packer for serializing messages. Should be one of 'json',
# 'pickle', or an import name for a custom callable serializer.
#c.Session.packer = 'json'
## The UUID identifying this session.
#c.Session.session = ''
## The digest scheme used to construct the message signatures. Must have the form
# 'hmac-HASH'.
#c.Session.signature_scheme = 'hmac-sha256'
## The name of the unpacker for unserializing messages. Only used with custom
# functions for `packer`.
#c.Session.unpacker = 'json'
## Username for the Session. Default is your system username.
# XXX
#c.Session.username = 'jebba'
#------------------------------------------------------------------------------
# MultiKernelManager(LoggingConfigurable) configuration
#------------------------------------------------------------------------------
## A class for managing multiple kernels.
## The name of the default kernel to start
#c.MultiKernelManager.default_kernel_name = 'python3'
## The kernel manager class. This is configurable to allow subclassing of the
# KernelManager for customized behavior.
#c.MultiKernelManager.kernel_manager_class = 'jupyter_client.ioloop.IOLoopKernelManager'
#------------------------------------------------------------------------------
# MappingKernelManager(MultiKernelManager) configuration
#------------------------------------------------------------------------------
## A KernelManager that handles notebook mapping and HTTP error handling
## White list of allowed kernel message types. When the list is empty, all
# message types are allowed.
#c.MappingKernelManager.allowed_message_types = []
## Whether messages from kernels whose frontends have disconnected should be
# buffered in-memory.
#
# When True (default), messages are buffered and replayed on reconnect, avoiding
# lost messages due to interrupted connectivity.
#
# XXX
# Disable if long-running kernels will produce too much output while no
# frontends are connected.
#c.MappingKernelManager.buffer_offline_messages = True
# XXX
## Whether to consider culling kernels which are busy. Only effective if
# cull_idle_timeout > 0.
#c.MappingKernelManager.cull_busy = False
# XXX
## Whether to consider culling kernels which have one or more connections. Only
# effective if cull_idle_timeout > 0.
#c.MappingKernelManager.cull_connected = False
## Timeout (in seconds) after which a kernel is considered idle and ready to be
# culled. Values of 0 or lower disable culling. Very short timeouts may result
# in kernels being culled for users with poor network connections.
#c.MappingKernelManager.cull_idle_timeout = 0
# XXX
## The interval (in seconds) on which to check for idle kernels exceeding the
# cull timeout value.
#c.MappingKernelManager.cull_interval = 300
# XXX
## Timeout for giving up on a kernel (in seconds).
#
# On starting and restarting kernels, we check whether the kernel is running and
# responsive by sending kernel_info_requests. This sets the timeout in seconds
# for how long the kernel can take before being presumed dead. This affects the
# MappingKernelManager (which handles kernel restarts) and the
# ZMQChannelsHandler (which handles the startup).
#c.MappingKernelManager.kernel_info_timeout = 60
c.MappingKernelManager.kernel_info_timeout = 480
##
#c.MappingKernelManager.root_dir = ''
#------------------------------------------------------------------------------
# KernelSpecManager(LoggingConfigurable) configuration
#------------------------------------------------------------------------------
## If there is no Python kernelspec registered and the IPython kernel is
# available, ensure it is added to the spec list.
#c.KernelSpecManager.ensure_native_kernel = True
## The kernel spec class. This is configurable to allow subclassing of the
# KernelSpecManager for customized behavior.
#c.KernelSpecManager.kernel_spec_class = 'jupyter_client.kernelspec.KernelSpec'
## Whitelist of allowed kernel names.
#
# By default, all installed kernels are allowed.
#c.KernelSpecManager.whitelist = set()
#------------------------------------------------------------------------------
# ContentsManager(LoggingConfigurable) configuration
#------------------------------------------------------------------------------
## Base class for serving files and directories.
#
# This serves any text or binary file, as well as directories, with special
# handling for JSON notebook documents.
#
# Most APIs take a path argument, which is always an API-style unicode path, and
# always refers to a directory.
#
# - unicode, not url-escaped
# - '/'-separated
# - leading and trailing '/' will be stripped
# - if unspecified, path defaults to '',
# indicating the root path.
## Allow access to hidden files
c.ContentsManager.allow_hidden = False
##
#c.ContentsManager.checkpoints = None
##
#c.ContentsManager.checkpoints_class = 'notebook.services.contents.checkpoints.Checkpoints'
##
#c.ContentsManager.checkpoints_kwargs = {}
## handler class to use when serving raw file requests.
#
# Default is a fallback that talks to the ContentsManager API, which may be
# inefficient, especially for large files.
#
# Local files-based ContentsManagers can use a StaticFileHandler subclass, which
# will be much more efficient.
#
# Access to these files should be Authenticated.
#c.ContentsManager.files_handler_class = 'notebook.files.handlers.FilesHandler'
## Extra parameters to pass to files_handler_class.
#
# For example, StaticFileHandlers generally expect a `path` argument specifying
# the root directory from which to serve files.
#c.ContentsManager.files_handler_params = {}
## Glob patterns to hide in file and directory listings.
#c.ContentsManager.hide_globs = ['__pycache__', '*.pyc', '*.pyo', '.DS_Store', '*.so', '*.dylib', '*~']
## Python callable or importstring thereof
#
# To be called on a contents model prior to save.
#
# This can be used to process the structure, such as removing notebook outputs
# or other side effects that should not be saved.
#
# It will be called as (all arguments passed by keyword)::
#
# hook(path=path, model=model, contents_manager=self)
#
# - model: the model to be saved. Includes file contents.
# Modifying this dict will affect the file that is stored.
# - path: the API path of the save destination
# - contents_manager: this ContentsManager instance
#c.ContentsManager.pre_save_hook = None
##
# XXX
#c.ContentsManager.root_dir = '/'
## The base name used when creating untitled directories.
c.ContentsManager.untitled_directory = 'untitled_folder'
## The base name used when creating untitled files.
c.ContentsManager.untitled_file = 'untitled'
## The base name used when creating untitled notebooks.
c.ContentsManager.untitled_notebook = 'notebook'
#------------------------------------------------------------------------------
# FileManagerMixin(Configurable) configuration
#------------------------------------------------------------------------------
## Mixin for ContentsAPI classes that interact with the filesystem.
#
# Provides facilities for reading, writing, and copying both notebooks and
# generic files.
#
# Shared by FileContentsManager and FileCheckpoints.
#
# Note ---- Classes using this mixin must provide the following attributes:
#
# root_dir : unicode
# A directory against which API-style paths are to be resolved.
#
# log : logging.Logger
## By default notebooks are saved on disk on a temporary file and then if
# successfully written, it replaces the old ones. This procedure, namely
# 'atomic_writing', causes some bugs on file system without operation order
# enforcement (like some networked fs). If set to False, the new notebook is
# written directly on the old one which could fail (eg: full filesystem or quota
# )
#c.FileManagerMixin.use_atomic_writing = True
#------------------------------------------------------------------------------
# FileContentsManager(FileManagerMixin,ContentsManager) configuration
#------------------------------------------------------------------------------
## If True (default), deleting files will send them to the platform's
# trash/recycle bin, where they can be recovered. If False, deleting files
# really deletes them.
c.FileContentsManager.delete_to_trash = False
## Python callable or importstring thereof
#
# to be called on the path of a file just saved.
#
# This can be used to process the file on disk, such as converting the notebook
# to a script or HTML via nbconvert.
#
# It will be called as (all arguments passed by keyword)::
#
# hook(os_path=os_path, model=model, contents_manager=instance)
#
# - path: the filesystem path to the file just written - model: the model
# representing the file - contents_manager: this ContentsManager instance
#c.FileContentsManager.post_save_hook = None
##
# XXX
#c.FileContentsManager.root_dir = ''
## DEPRECATED, use post_save_hook. Will be removed in Notebook 5.0
#c.FileContentsManager.save_script = False
#------------------------------------------------------------------------------
# NotebookNotary(LoggingConfigurable) configuration
#------------------------------------------------------------------------------
## A class for computing and verifying notebook signatures.
## The hashing algorithm used to sign notebooks.
#c.NotebookNotary.algorithm = 'sha256'
c.NotebookNotary.algorithm = 'sha512'
## The sqlite file in which to store notebook signatures. By default, this will
# be in your Jupyter data directory. You can set it to ':memory:' to disable
# sqlite writing to the filesystem.
# XXX
#c.NotebookNotary.db_file = ''
## The secret key with which notebooks are signed.
# XXX
#c.NotebookNotary.secret = b''
## The file where the secret key is stored.
# XXX
#c.NotebookNotary.secret_file = ''
## A callable returning the storage backend for notebook signatures. The default
# uses an SQLite database.
# XXX
#c.NotebookNotary.store_factory = traitlets.Undefined
#------------------------------------------------------------------------------
# GatewayKernelManager(MappingKernelManager) configuration
#------------------------------------------------------------------------------
## Kernel manager that supports remote kernels hosted by Jupyter Kernel or
# Enterprise Gateway.
#------------------------------------------------------------------------------
# GatewayKernelSpecManager(KernelSpecManager) configuration
#------------------------------------------------------------------------------
#------------------------------------------------------------------------------
# GatewayClient(SingletonConfigurable) configuration
#------------------------------------------------------------------------------
## This class manages the configuration. It's its own singleton class so that we
# can share these values across all objects. It also contains some helper methods
# to build request arguments out of the various config options.
## The authorization token used in the HTTP headers. (JUPYTER_GATEWAY_AUTH_TOKEN
# env var)
#c.GatewayClient.auth_token = None
## The filename of CA certificates or None to use defaults.
# (JUPYTER_GATEWAY_CA_CERTS env var)
#c.GatewayClient.ca_certs = None
## The filename for client SSL certificate, if any. (JUPYTER_GATEWAY_CLIENT_CERT
# env var)
#c.GatewayClient.client_cert = None
## The filename for client SSL key, if any. (JUPYTER_GATEWAY_CLIENT_KEY env var)
#c.GatewayClient.client_key = None
## The time allowed for HTTP connection establishment with the Gateway server.
# (JUPYTER_GATEWAY_CONNECT_TIMEOUT env var)
#c.GatewayClient.connect_timeout = 60.0
## A comma-separated list of environment variable names that will be included,
# along with their values, in the kernel startup request. The corresponding
# `env_whitelist` configuration value must also be set on the Gateway server -
# since that configuration value indicates which environmental values to make
# available to the kernel. (JUPYTER_GATEWAY_ENV_WHITELIST env var)
#c.GatewayClient.env_whitelist = ''
## Additional HTTP headers to pass on the request. This value will be converted
# to a dict. (JUPYTER_GATEWAY_HEADERS env var)
#c.GatewayClient.headers = '{}'
## The password for HTTP authentication. (JUPYTER_GATEWAY_HTTP_PWD env var)
#c.GatewayClient.http_pwd = None
## The username for HTTP authentication. (JUPYTER_GATEWAY_HTTP_USER env var)
#c.GatewayClient.http_user = None
## The gateway API endpoint for accessing kernel resources
# (JUPYTER_GATEWAY_KERNELS_ENDPOINT env var)
# XXX
#c.GatewayClient.kernels_endpoint = '/jupyter/api/kernels'
## The gateway API endpoint for accessing kernelspecs
# (JUPYTER_GATEWAY_KERNELSPECS_ENDPOINT env var)
# XXX
#c.GatewayClient.kernelspecs_endpoint = '/jupyter/api/kernelspecs'
## The gateway endpoint for accessing kernelspecs resources
# (JUPYTER_GATEWAY_KERNELSPECS_RESOURCE_ENDPOINT env var)
# XXX
#c.GatewayClient.kernelspecs_resource_endpoint = '/jupyter/kernelspecs'
## The time allowed for HTTP request completion. (JUPYTER_GATEWAY_REQUEST_TIMEOUT
# env var)
#c.GatewayClient.request_timeout = 60.0
## The url of the Kernel or Enterprise Gateway server where kernel specifications
# are defined and kernel management takes place. If defined, this Notebook
# server acts as a proxy for all kernel management and kernel specification
# retrieval. (JUPYTER_GATEWAY_URL env var)
# XXX
#c.GatewayClient.url = None
## For HTTPS requests, determines if server's certificate should be validated or
# not. (JUPYTER_GATEWAY_VALIDATE_CERT env var)
#c.GatewayClient.validate_cert = True
## The websocket url of the Kernel or Enterprise Gateway server. If not
# provided, this value will correspond to the value of the Gateway url with 'ws'
# in place of 'http'. (JUPYTER_GATEWAY_WS_URL env var)
# XXX
#c.GatewayClient.ws_url = None

291
karoo_gp.ipynb 100644
View File

@@ -0,0 +1,291 @@
{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Karoo GP (desktop + server combined)\n",
"# Use Genetic Programming for Classification and Symbolic Regression\n",
"# by Kai Staats, MSc with TensorFlow support provided by Iurii Milovanov; see LICENSE.md\n",
"# version 2.3 for Python 3.6\n",
"\n",
"'''\n",
"A word to the newbie, expert, and brave--\n",
"Even if you are highly experienced in Genetic Programming, it is recommended that you review the 'Karoo User Guide' \n",
"before running this application. While your computer will not burst into flames nor will the sun collapse into a black \n",
"hole if you do not, you will likely find more enjoyment of this particular flavour of GP with a little understanding \n",
"of its intent and design.\n",
"\n",
"Without any command line arguments, Karoo GP relies upon user settings and the datasets located in karoo_gp/files/.\n",
"\n",
"\t$ python karoo_gp_main.py\n",
"\t\n",
"\n",
"If you include the path to an external dataset, it will auto-load at launch:\n",
"\n",
"\t$ python karoo_gp_main.py /[path]/[to_your]/[filename].csv\n",
"\t\n",
"\n",
"If you include one or more additional arguments, they will override the default values, as follows:\n",
"\n",
"\t-ker [r,c,m]\t\t\tfitness function: (r)egression, (c)lassification, or (m)atching\n",
"\t-typ [f,g,r]\t\t\tTree type: (f)ull, (g)row, or (r)amped half/half\n",
"\t-bas [3...10]\t\t\tmaximum Tree depth for initial population\n",
"\t-max [3...10]\t\t\tmaximum Tree depth for entire run\n",
"\t-min [3 to 2^(bas +1) - 1]\tminimum number of nodes\n",
"\t-pop [10...1000]\t\tnumber of trees in each generational population\n",
"\t-gen [1...100]\t\t\tnumber of generations\n",
"\t-tor [7 per 100]\t\tnumber of trees selected for tournament\n",
"\t-evr [0.0...1.0] \t\tdecimal percent of pop generated through Reproduction\n",
"\t-evp [0.0...1.0] \t\tdecimal percent of pop generated through Point Mutation\n",
"\t-evb [0.0...1.0] \t\tdecimal percent of pop generated through Branch Mutation\n",
"\t-evc [0.0...1.0] \t\tdecimal percent of pop generated through Crossover\n",
"\t\n",
"If you include any of the above flags, then you *must* also include a flag to load an external dataset.\n",
"\n",
"\t-fil [path]/[to]/[data].csv\tan external dataset\n",
"\n",
"\n",
"An example is given, as follows:\n",
"\n",
"\t$ python karoo_gp_server.py -ker c -typ r -bas 4 -fil [path]/[to]/[data].csv\n",
"\n",
"'''\n",
"\n",
"import os\n",
"import sys; sys.path.append('modules/') # add directory 'modules' to the current path\n",
"import argparse\n",
"import karoo_gp_base_class; gp = karoo_gp_base_class.Base_GP()\n",
"\n",
"os.system('clear')\n",
"print ('\\n\\033[36m\\033[1m')\n",
"print ('\\t ** ** ****** ***** ****** ****** ****** ******')\n",
"print ('\\t ** ** ** ** ** ** ** ** ** ** ** ** **')\n",
"print ('\\t ** ** ** ** ** ** ** ** ** ** ** ** **')\n",
"print ('\\t **** ******** ****** ** ** ** ** ** *** *******')\n",
"print ('\\t ** ** ** ** ** ** ** ** ** ** ** ** **')\n",
"print ('\\t ** ** ** ** ** ** ** ** ** ** ** ** **')\n",
"print ('\\t ** ** ** ** ** ** ** ** ** ** ** ** **')\n",
"print ('\\t ** ** ** ** ** ** ****** ****** ****** **')\n",
"print ('\\033[0;0m')\n",
"print ('\\t\\033[36m Genetic Programming in Python with TensorFlow - by Kai Staats, version 2.3\\033[0;0m')\n",
"print ('')\n",
"\n",
"\n",
"#++++++++++++++++++++++++++++++++++++++++++\n",
"# User Interface for Configuation |\n",
"#++++++++++++++++++++++++++++++++++++++++++\n",
"\n",
"if len(sys.argv) < 3: # either no command line argument, or only a filename is provided\n",
"\n",
"\twhile True:\n",
"\t\ttry:\n",
"\t\t\tquery = input('\\t Select (c)lassification, (r)egression, (m)atching, or (p)lay (default m): ')\n",
"\t\t\tif query in ['c','r','m','p','']: kernel = query or 'm'; break\n",
"\t\t\telse: raise ValueError()\n",
"\t\texcept ValueError: print ('\\t\\033[32m Select from the options given. Try again ...\\n\\033[0;0m')\n",
"\t\texcept KeyboardInterrupt: sys.exit()\n",
"\t\t\n",
"\tif kernel == 'p': # play mode\n",
"\t\twhile True:\n",
"\t\t\ttry:\n",
"\t\t\t\tquery = input('\\t Select (f)ull or (g)row (default g): ')\n",
"\t\t\t\tif query in ['f','g','']: tree_type = query or 'f'; break\n",
"\t\t\t\telse: raise ValueError()\n",
"\t\t\texcept ValueError: print ('\\t\\033[32m Select from the options given. Try again ...\\n\\033[0;0m')\n",
"\t\t\texcept KeyboardInterrupt: sys.exit()\n",
"\t\t\t\n",
"\t\twhile True:\n",
"\t\t\ttry:\n",
"\t\t\t\tquery = input('\\t Enter the depth of the Tree (default 1): ')\n",
"\t\t\t\tif query == '': tree_depth_base = 1; break\n",
"\t\t\t\telif int(query) in list(range(1,11)): tree_depth_base = int(query); break\n",
"\t\t\t\telse: raise ValueError()\n",
"\t\t\texcept ValueError: print ('\\t\\033[32m Enter a number from 1 including 10. Try again ...\\n\\033[0;0m')\n",
"\t\t\texcept KeyboardInterrupt: sys.exit()\n",
"\t\t\t\n",
"\t\ttree_depth_max = tree_depth_base\n",
"\t\ttree_depth_min = 3\n",
"\t\ttree_pop_max = 1\n",
"\t\tgen_max = 1\n",
"\t\ttourn_size = 0\n",
"\t\tdisplay = 'm'\n",
"\t\t#\tevolve_repro, evolve_point, evolve_branch, evolve_cross, tourn_size, precision, filename are not required\n",
"\t\n",
"\telse: # if any other kernel is selected\n",
"\t\t\n",
"\t\twhile True:\n",
"\t\t\ttry:\n",
"\t\t\t\tquery = input('\\t Select (f)ull, (g)row, or (r)amped 50/50 method (default r): ')\n",
"\t\t\t\tif query in ['f','g','r','']: tree_type = query or 'r'; break\n",
"\t\t\t\telse: raise ValueError()\n",
"\t\t\texcept ValueError: print ('\\t\\033[32m Select from the options given. Try again ...\\n\\033[0;0m')\n",
"\t\t\texcept KeyboardInterrupt: sys.exit()\n",
"\t\t\t\n",
"\t\twhile True:\n",
"\t\t\ttry:\n",
"\t\t\t\tquery = input('\\t Enter depth of the \\033[3minitial\\033[0;0m population of Trees (default 3): ')\n",
"\t\t\t\tif query == '': tree_depth_base = 3; break\n",
"\t\t\t\telif int(query) in list(range(1,11)): tree_depth_base = int(query); break\n",
"\t\t\t\telse: raise ValueError()\n",
"\t\t\texcept ValueError: print ('\\t\\033[32m Enter a number from 1 including 10. Try again ...\\n\\033[0;0m')\n",
"\t\t\texcept KeyboardInterrupt: sys.exit()\n",
"\t\t\t\n",
"\t\twhile True:\n",
"\t\t\ttry:\n",
"\t\t\t\tquery = input('\\t Enter maximum Tree depth (default %s): ' %str(tree_depth_base))\n",
"\t\t\t\tif query == '': tree_depth_max = tree_depth_base; break\n",
"\t\t\t\telif int(query) in list(range(tree_depth_base,11)): tree_depth_max = int(query); break\n",
"\t\t\t\telse: raise ValueError()\n",
"\t\t\texcept ValueError: print ('\\t\\033[32m Enter a number from %s including 10. Try again ...\\n\\033[0;0m' %str(tree_depth_base))\n",
"\t\t\texcept KeyboardInterrupt: sys.exit()\n",
"\t\t\t\n",
"\t\tmax_nodes = 2**(tree_depth_base+1)-1 # calc the max number of nodes for the given depth\n",
"\t\t\n",
"\t\twhile True:\n",
"\t\t\ttry:\n",
"\t\t\t\tquery = input('\\t Enter minimum number of nodes for any given Tree (default 3; max %s): ' %str(max_nodes))\n",
"\t\t\t\tif query == '': tree_depth_min = 3; break\n",
"\t\t\t\telif int(query) in list(range(3,max_nodes + 1)): tree_depth_min = int(query); break\n",
"\t\t\t\telse: raise ValueError()\n",
"\t\t\texcept ValueError: print ('\\t\\033[32m Enter a number from 3 including %s. Try again ...\\n\\033[0;0m' %str(max_nodes))\n",
"\t\t\texcept KeyboardInterrupt: sys.exit()\n",
"\t\t\t\n",
"\t\t#while True:\n",
"\t\t\t#try:\n",
"\t\t\t\t#query = input('\\t Select (p)artial or (f)ull operator inclusion (default p): ')\n",
"\t\t\t\t#if query == '': swim = 'p'; break\n",
"\t\t\t\t#elif query in ['p','f']: swim = query; break\n",
"\t\t\t\t#else: raise ValueError()\n",
"\t\t\t#except ValueError: print ('\\t\\033[32m Select from the options given. Try again ...\\n\\033[0;0m')\n",
"\t\t\t#except KeyboardInterrupt: sys.exit()\n",
"\t\t\t\n",
"\t\twhile True:\n",
"\t\t\ttry:\n",
"\t\t\t\tquery = input('\\t Enter number of Trees in each population (default 100): ')\n",
"\t\t\t\tif query == '': tree_pop_max = 100; break\n",
"\t\t\t\telif int(query) in list(range(1,1001)): tree_pop_max = int(query); break\n",
"\t\t\t\telse: raise ValueError()\n",
"\t\t\texcept ValueError: print ('\\t\\033[32m Enter a number from 1 including 1000. Try again ...\\n\\033[0;0m')\n",
"\t\t\texcept KeyboardInterrupt: sys.exit()\n",
"\t\t\t\n",
"\t\t# calculate the tournament size\n",
"\t\ttourn_size = int(tree_pop_max * 0.07) # default 7% can be changed by selecting (g)eneration and then 'ts'\n",
"\t\tif tourn_size < 2: tourn_size = 2 # forces some diversity for small populations\n",
"\t\tif tree_pop_max == 1: tourn_size = 1 # in theory, supports the evolution of a single Tree - NEED TO FIX 2018 04/19\n",
"\t\t\n",
"\t\twhile True:\n",
"\t\t\ttry:\n",
"\t\t\t\tquery = input('\\t Enter max number of generations (default 10): ')\n",
"\t\t\t\tif query == '': gen_max = 10; break\n",
"\t\t\t\telif int(query) in list(range(1,101)): gen_max = int(query); break\n",
"\t\t\t\telse: raise ValueError()\n",
"\t\t\texcept ValueError: print ('\\t\\033[32m Enter a number from 1 including 100. Try again ...\\n\\033[0;0m')\n",
"\t\t\texcept KeyboardInterrupt: sys.exit()\n",
"\t\t\t\n",
"\t\tif gen_max > 1:\n",
"\t\t\twhile True:\n",
"\t\t\t\ttry:\n",
"\t\t\t\t\tquery = input('\\t Display (i)nteractive, (g)eneration, (m)iminal, (s)ilent, or (d)e(b)ug (default m): ')\n",
"\t\t\t\t\tif query in ['i','g','m','s','db','']: display = query or 'm'; break\n",
"\t\t\t\t\telse: raise ValueError()\n",
"\t\t\t\texcept ValueError: print ('\\t\\033[32m Select from the options given. Try again ...\\n\\033[0;0m')\n",
"\t\t\t\texcept KeyboardInterrupt: sys.exit()\n",
"\t\t\t\t\n",
"\t\telse: display = 's' # display mode is not used, but a value must be passed\n",
"\t\t\t\t\n",
"\t### additional configuration parameters ###\n",
"\t\n",
"\tevolve_repro = int(0.1 * tree_pop_max) # quantity of a population generated through Reproduction\n",
"\tevolve_point = int(0.0 * tree_pop_max) # quantity of a population generated through Point Mutation\n",
"\tevolve_branch = int(0.2 * tree_pop_max) # quantity of a population generated through Branch Mutation\n",
"\tevolve_cross = int(0.7 * tree_pop_max) # quantity of a population generated through Crossover\n",
"\tfilename = '' # not required unless an external file is referenced\n",
"\tprecision = 6 # number of floating points for the round function in 'fx_fitness_eval'\n",
"\tswim = 'p' # require (p)artial or (f)ull set of features (operators) for each Tree entering the gene_pool\n",
"\tmode = 'd' # pause at the (d)esktop when complete, awaiting further user interaction; or terminate in (s)erver mode\n",
"\t\n",
"\n",
"#++++++++++++++++++++++++++++++++++++++++++\n",
"# Command Line for Configuation |\n",
"#++++++++++++++++++++++++++++++++++++++++++\n",
"\n",
"else: # 2 or more command line arguments are provided\n",
"\n",
"\tap = argparse.ArgumentParser(description = 'Karoo GP Server')\n",
"\tap.add_argument('-ker', action = 'store', dest = 'kernel', default = 'c', help = '[c,r,m] fitness function: (r)egression, (c)lassification, or (m)atching')\n",
"\tap.add_argument('-typ', action = 'store', dest = 'type', default = 'r', help = '[f,g,r] Tree type: (f)ull, (g)row, or (r)amped half/half')\n",
"\tap.add_argument('-bas', action = 'store', dest = 'depth_base', default = 4, help = '[3...10] maximum Tree depth for the initial population')\n",
"\tap.add_argument('-max', action = 'store', dest = 'depth_max', default = 4, help = '[3...10] maximum Tree depth for the entire run')\n",
"\tap.add_argument('-min', action = 'store', dest = 'depth_min', default = 3, help = 'minimum nodes, from 3 to 2^(base_depth +1) - 1')\n",
"\tap.add_argument('-pop', action = 'store', dest = 'pop_max', default = 100, help = '[10...1000] number of trees per generation')\n",
"\tap.add_argument('-gen', action = 'store', dest = 'gen_max', default = 10, help = '[1...100] number of generations')\n",
"\tap.add_argument('-tor', action = 'store', dest = 'tor_size', default = 7, help = '[7 for each 100] recommended tournament size')\n",
"\tap.add_argument('-evr', action = 'store', dest = 'evo_r', default = 0.1, help = '[0.0-1.0] decimal percent of pop generated through Reproduction')\n",
"\tap.add_argument('-evp', action = 'store', dest = 'evo_p', default = 0.0, help = '[0.0-1.0] decimal percent of pop generated through Point Mutation')\n",
"\tap.add_argument('-evb', action = 'store', dest = 'evo_b', default = 0.2, help = '[0.0-1.0] decimal percent of pop generated through Branch Mutation')\n",
"\tap.add_argument('-evc', action = 'store', dest = 'evo_c', default = 0.7, help = '[0.0-1.0] decimal percent of pop generated through Crossover')\n",
"\tap.add_argument('-fil', action = 'store', dest = 'filename', default = '', help = '/path/to_your/[data].csv')\n",
"\t\n",
"\targs = ap.parse_args()\n",
"\n",
"\t# pass the argparse defaults and/or user inputs to the required variables\n",
"\tkernel = str(args.kernel)\n",
"\ttree_type = str(args.type)\n",
"\ttree_depth_base = int(args.depth_base)\n",
"\ttree_depth_max = int(args.depth_max)\n",
"\ttree_depth_min = int(args.depth_min)\n",
"\ttree_pop_max = int(args.pop_max)\n",
"\tgen_max = int(args.gen_max)\n",
"\ttourn_size = int(args.tor_size)\n",
"\tevolve_repro = int(float(args.evo_r) * tree_pop_max)\n",
"\tevolve_point = int(float(args.evo_p) * tree_pop_max)\n",
"\tevolve_branch = int(float(args.evo_b) * tree_pop_max)\n",
"\tevolve_cross = int(float(args.evo_c) * tree_pop_max)\n",
"\t#filename = str(args.filename)\n",
"\tfilename='files/data_MATCH.csv'\n",
"\tdisplay = 's' # display mode is set to (s)ilent\n",
"\tprecision = 6 # number of floating points for the round function in 'fx_fitness_eval'\n",
"\tswim = 'p' # require (p)artial or (f)ull set of features (operators) for each Tree entering the gene_pool\n",
"\tmode = 's' # pause at the (d)esktop when complete, awaiting further user interaction; or terminate in (s)erver mode\n",
"\t"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"#++++++++++++++++++++++++++++++++++++++++++\n",
"# Conduct the GP run |\n",
"#++++++++++++++++++++++++++++++++++++++++++\n",
"\n",
"gp.fx_karoo_gp(kernel, tree_type, tree_depth_base, tree_depth_max, tree_depth_min, tree_pop_max, gen_max, tourn_size, filename, evolve_repro, evolve_point, evolve_branch, evolve_cross, display, precision, swim, mode)"
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.3"
}
},
"nbformat": 4,
"nbformat_minor": 4
}

View File

@@ -56,8 +56,8 @@ operators = {ast.Add: tf.add, # e.g., a + b
'square': tf.square, # e.g., square(a)
'sqrt': tf.sqrt, # e.g., sqrt(a)
'pow': tf.pow, # e.g., pow(a, b)
'log': tf.log, # e.g., log(a)
'log1p': tf.log1p, # e.g., log1p(a)
'log': tf.math.log, # e.g., log(a)
'log1p': tf.math.log1p, # e.g., log1p(a)
'cos': tf.cos, # e.g., cos(a)
'sin': tf.sin, # e.g., sin(a)
'tan': tf.tan, # e.g., tan(a)
@@ -174,12 +174,12 @@ class Base_GP(object):
if self.kernel == 'p': # terminate here for Play mode
self.fx_display_tree(self.tree) # print the current Tree
self.fx_data_tree_write(self.population_a, 'a') # save this one Tree to disk
sys.exit()
#sys.exit()
elif self.gen_max == 1: # terminate here if constructing just one generation
self.fx_data_tree_write(self.population_a, 'a') # save this single population to disk
print ('\n We have constructed a single, stochastic population of', self.tree_pop_max,'Trees, and saved to disk')
sys.exit()
#sys.exit()
else: print ('\n We have constructed the first, stochastic population of', self.tree_pop_max,'Trees')
@@ -342,7 +342,7 @@ class Base_GP(object):
print ('\n\033[3m "It is not the strongest of the species that survive, nor the most intelligent,\033[0;0m')
print ('\033[3m but the one most responsive to change."\033[0;0m --Charles Darwin\n')
print ('\033[3m Congrats!\033[0;0m Your Karoo GP run is complete.\n')
sys.exit()
#sys.exit()
return
@@ -1228,11 +1228,11 @@ class Base_GP(object):
'''
# Initialize TensorFlow session
tf.reset_default_graph() # Reset TF internal state and cache (after previous processing)
config = tf.ConfigProto(log_device_placement=self.tf_device_log, allow_soft_placement=True)
tf.compat.v1.reset_default_graph() # Reset TF internal state and cache (after previous processing)
config = tf.compat.v1.ConfigProto(log_device_placement=self.tf_device_log, allow_soft_placement=True)
config.gpu_options.allow_growth = True
with tf.Session(config=config) as sess:
with tf.compat.v1.Session(config=config) as sess:
with sess.graph.device(self.tf_device):
# 1 - Load data into TF vectors
@@ -1314,7 +1314,7 @@ class Base_GP(object):
else: raise Exception('Kernel type is wrong or missing. You entered {}'.format(self.kernel))
fitness = tf.reduce_sum(pairwise_fitness)
fitness = tf.reduce_sum(input_tensor=pairwise_fitness)
# Process TF graph and collect the results
result, pred_labels, solution, fitness, pairwise_fitness = sess.run([result, pred_labels, solution, fitness, pairwise_fitness])
@@ -1433,9 +1433,9 @@ class Base_GP(object):
for class_label in range(self.class_labels - 2, 0, -1):
cond = (class_label - 1 - skew < result) & (result <= class_label - skew)
label_rules[class_label] = tf.cond(cond, lambda: (tf.constant(class_label), tf.constant(' <= {}'.format(class_label - skew))), lambda: label_rules[class_label + 1])
label_rules[class_label] = tf.cond(pred=cond, true_fn=lambda: (tf.constant(class_label), tf.constant(' <= {}'.format(class_label - skew))), false_fn=lambda: label_rules[class_label + 1])
pred_label = tf.cond(result <= 0 - skew, lambda: (tf.constant(0), tf.constant(' <= {}'.format(0 - skew))), lambda: label_rules[1])
pred_label = tf.cond(pred=result <= 0 - skew, true_fn=lambda: (tf.constant(0), tf.constant(' <= {}'.format(0 - skew))), false_fn=lambda: label_rules[1])
return pred_label

4
requirements.txt 100644
View File

@@ -0,0 +1,4 @@
wheel
sklearn
sympy
tensorflow
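These are the same packages the README installs one by one; they can also be pulled in with a single command against this file (run inside the venv, or with `--user` as appropriate):
```
pip3 install -r requirements.txt
```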

24
setup.py 100644
View File

@@ -0,0 +1,24 @@
import setuptools
with open("README.md", "r") as fh:
long_description = fh.read()
setuptools.setup(
name="karoo_gp-kstaats", # Replace with your own username
version="2.3",
author="Kai Staats",
author_email="pip@overthesun.com",
description="evolutionary algorithm, genetic programming suite",
long_description=long_description,
long_description_content_type="text/markdown",
url="https://github.com/kstaats/karoo_gp",
packages={''},
classifiers=[
"Programming Language :: Python :: 3",
"License :: OSI Approved :: MIT License",
"Operating System :: OS Independent",
],
python_requires='>=3.6',
)
# packages=setuptools.find_packages(),