Saturday, February 4, 2023

Remote connection to a Jupyter notebook

This is explained here:

https://docs.anaconda.com/anaconda/user-guide/tasks/remote-jupyter-notebook/


Follow these steps to use a Jupyter Notebook launched from a remote server.


1. Launch Jupyter Notebook on the remote server, choosing a port number for <PORT>:


# Replace <PORT> with your selected port number

jupyter notebook --no-browser --port=<PORT>

For example, if you want to use port number 8080, you would run the following:


jupyter notebook --no-browser --port=8080

Or run the following command to launch with the default port:


jupyter notebook --no-browser

Please note the port setting. You will need it in the next step.


2. You can access the notebook running on the remote server from your local machine by setting up an SSH tunnel. Run the following command from your local machine:


# Replace <PORT> with the port number you selected in the above step

# Replace <REMOTE_USER> with the remote server username

# Replace <REMOTE_HOST> with your remote server address

ssh -L 8080:localhost:<PORT> <REMOTE_USER>@<REMOTE_HOST>

The above command opens up a new SSH session in the terminal.
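
For example, with port 8080 from step 1 and placeholder values for the user and host, the command would be:

ssh -L 8080:localhost:8080 <REMOTE_USER>@<REMOTE_HOST>

If you only want the port forwarding and no interactive shell on the remote machine, you can additionally pass the -N flag to ssh.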


3. Open a browser on your local machine and navigate to http://localhost:8080/, the Jupyter Notebook web interface. Replace 8080 if you chose a different local port for the tunnel in step 2.
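
Note that, depending on your Jupyter version and configuration, the page may ask for a token or password. In that case, copy the full login URL that Jupyter printed when you launched it in step 1, adjusting the port to your local tunnel port if needed; it looks something like this (the token value is a placeholder):

http://localhost:8080/?token=<TOKEN>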


---------------------------------------------------

If you want to use JupyterLab instead of Jupyter Notebook and connect remotely from Visual Studio Code, you can follow these steps.


1. First, start a JupyterLab server on the remote server:

jupyter lab --no-browser --NotebookApp.allow_origin='*' --port=8080


2. Now connect to this server from Visual Studio Code using the Jupyter extension:

https://stackoverflow.com/questions/72121390/how-to-use-jupyterlab-in-visual-studio-code

You might have to install the extension first.
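
As in the notebook case above, VS Code on your local machine needs to be able to reach the server, so you will typically also want an SSH tunnel, and you will need the server URL including its access token. A minimal sketch, assuming port 8080 and placeholder user/host:

# From your local machine: forward local port 8080 to the remote JupyterLab
ssh -N -L 8080:localhost:8080 <REMOTE_USER>@<REMOTE_HOST>

# On the remote server: list running servers and their token URLs
# (use jupyter notebook list on older installations)
jupyter server list

Then, in the Jupyter extension, pick the option for connecting to an existing Jupyter server (the exact menu wording depends on the extension version) and paste the http://localhost:8080/?token=... URL.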


Monday, April 3, 2017

Language Modelling Datasets and Tools

An introduction to language modelling (LM) is here.
For RNN LMs, see Mikolov's slides here.
Mikolov's RNN LM toolkit is here.
SRI International's language modelling toolkit (SRILM) is here.
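
As a concrete n-gram example with SRILM, the following sketch trains a trigram model and evaluates perplexity on a test set; the file names are placeholders and the exact smoothing flags may vary with your SRILM build:

# Train a trigram LM with modified Kneser-Ney smoothing
ngram-count -order 3 -kndiscount -interpolate -text train.txt -lm model.lm

# Compute perplexity on held-out/test data
ngram -order 3 -lm model.lm -ppl test.txt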

Language modelling can be slow with RNNs. This is a faster implementation that uses Eigen.

There is a 1B-token benchmark dataset released by Google for evaluating language models. It can be obtained here. It has tokenised and split held-out and train portions. Shard 0000 is used for reporting test results.


Thursday, May 5, 2016

MongoDB cheat sheet

To run MongoDB do
>mongod
This will start the daemon.

Then to query the DB start the client by
>mongo

To use (switch to) a database do
use <database_name>

To see the databases do
show databases

You can see the collections in a database after opening that database by
show collections
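
For example, with a hypothetical database called mydb:

use mydb
show collections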

Collections correspond to tables in MySQL.

db.collection.find({key:value})

can be used to query the DB. If you want to perform "like" queries do
db.collection.find({"my key": /term/})

Do not use quotes around term in the previous example when performing "like" queries; if you do, it will be an exact match.

This will not work within pymongo; you need to use the $regex operator instead:
db["mycollection"].find({"key":{"$regex":"%s" % query}})

You can use count to count the number of matching records:
db.collection.count({key:value})
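
Putting the pymongo pieces together, here is a minimal sketch; the database, collection, and field names are hypothetical, and note that recent pymongo versions use count_documents() instead of the old count():

from pymongo import MongoClient

# Connect to a local mongod (default port 27017)
client = MongoClient("localhost", 27017)
collection = client["mydb"]["mycollection"]

# Exact match
print(list(collection.find({"my key": "term"})))

# "like" query via $regex
print(list(collection.find({"my key": {"$regex": "term"}})))

# Count the matching records
print(collection.count_documents({"my key": {"$regex": "term"}}))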



Friday, March 4, 2016

Bibdesk annotations in TeX preview

If you would like to display the annotations you make in BibDesk in the TeX preview, you can do so as follows.

The annotations you make are kept in a BibTeX field called "Annote".

We need a BibTeX style file (bst) that supports this Annote field. You can copy and paste the plainannotate.bst file at the end of this post.

Next, create the following directory
~/Library/texmf/bibtex/bst
and put the style file there.
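
A minimal sketch of those two steps from the terminal, assuming the downloaded plainannotate.bst is in your current directory:

# Create the personal texmf directory for bst files
mkdir -p ~/Library/texmf/bibtex/bst

# Put the annotation-aware style file there
cp plainannotate.bst ~/Library/texmf/bibtex/bst/

In your own LaTeX documents you can then select the style with \bibliographystyle{plainannotate}.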

Now, from BibDesk's preferences, set the BibTeX style used for the TeX preview to plainannotate.

[Screenshot of the BibDesk preferences pane]

That is it.



Tuesday, March 24, 2015

TeXShop Japanese input

If you use a black background colour in TeXShop, there is a problem where the characters you are typing become invisible while Japanese input is still unconverted (not yet committed). To fix this, you can use the following command:

defaults write TeXShop ResetSourceTextColorEachTime YES

However, this reportedly can introduce bugs, so be careful.

Reference:
http://pages.uoregon.edu/koch/texshop/version.html

Continuously monitor GPU usage

For NVIDIA GPUs, do the following:

nvidia-smi -l 1

The -l 1 option makes nvidia-smi refresh its output every second.
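
An alternative, if you prefer it, is to wrap nvidia-smi in watch, which reruns it every second:

watch -n 1 nvidia-smi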