Emacs haskell-mode

I set about installing Emacs and its haskell-mode by following emacs-haskell-tutorial, but this is what worked for me. This covers only part of the process, and I will add any new information I come across.

Eval: (find-file user-init-file)

Press Enter. This opens my .emacs file if it is there; if it isn't, create one and save it.


This section should be enough if there is no proxy.

(require 'package)
(add-to-list 'package-archives
             '("melpa-stable" . "http://stable.melpa.org/packages/") t)
(package-initialize)

Corporate proxy

If there is a proxy, add this section too.

(setq url-proxy-services
      '(("no_proxy" . "^\\(localhost\\|10.*\\)")
        ("http" . "proxy.cognizant.com:6050")
        ("https" . "proxy.cognizant.com:6050")))

(setq url-http-proxy-basic-auth-storage
      (list (list "proxy.cognizant.com:6050"
                  (cons "Credentials !"
                        (base64-encode-string "user:password")))))

cabal

Cabal seems to be the package manager for Haskell libraries.

I had cabal.exe sitting in my Downloads folder and realized that running it from there is not the correct way.

D:\Frege>..\Downloads\cabal update
Downloading the latest package list from hackage.haskell.org
Note: there is a new version of cabal-install available.
To upgrade, run: cabal install cabal-install

D:\Frege>ghc-pkg list Cabal
D:/Haskell Platform/7.10.3\lib\package.conf.d:
Cabal-1.22.5.0

D:\Frege>..\Downloads\cabal install happy
cabal: The program ghc version >=6.4 is required but it could not be found.

So I installed the Haskell Platform and added its bin folder to the PATH.

Everything worked after that.

D:\Frege>cabal install cabal-install
Resolving dependencies…
Downloading cabal-install-1.22.9.0…
Configuring cabal-install-1.22.9.0…
Building cabal-install-1.22.9.0…
Linking dist\build\cabal\cabal.exe …
Installing executable(s) in C:\Users\476458\AppData\Roaming\cabal\bin
Installed cabal-install-1.22.9.0

D:\Frege>cabal
cabal: no command given (try --help)

D:\Frege>cabal --version
cabal-install version 1.22.6.0
using version 1.22.5.0 of the Cabal library

D:\Frege>ls
HaskellPlatform-7.10.3-x86_64-setup.exe InstallCert.java
InstallCert$SavingTrustManager.class emacs-24.5-bin-i686-mingw32
InstallCert.class emacs-24.5-bin-i686-mingw32.zip

D:\Frege>cabal update
Downloading the latest package list from hackage.haskell.org

D:\Frege>cabal install happy
Resolving dependencies…
Downloading happy-1.19.5…
[1 of 1] Compiling Main ( C:\Users\476458\AppData\Local\Temp\cabal-t
Linking dist\build\happy\happy.exe …
Installing executable(s) in C:\Users\476458\AppData\Roaming\cabal\bin
Installed happy-1.19.5

Haskell-mode

Not yet sure if the mode is effective. I don’t see any syntax highlighting yet.

[Screenshot: haskell-mode]

Update: now it looks better.

[Screenshot: haskell-mode with proper syntax highlighting]

Practicing Predictive Analytics using “R”

I spent a Sunday on this code to answer some questions for a Coursera course. At this time this kind of code is the norm in more than one such course, so I am just building muscle memory: I type the code, look at the result, and relearn what I learnt earlier.

If I don’t remember how to solve something I search for it, but the point is that I have to be constantly in touch with “R” as well as the fundamentals. My day job doesn’t let me do this. The other option is a book on Machine Learning like the one by Tom Mitchell, but that takes forever.

setwd("~/Documents/PredictiveAnalytics")

library(dplyr)  
library(ggplot2)
library(rpart)
library(tree)
library(randomForest)
library(e1071)
library(caret)


# Load the SeaFlow data and count the "synecho" observations
seaflow <- read.csv(file = "seaflow_21min.csv", header = TRUE)
final <- filter(seaflow, pop == "synecho")
print(nrow(final))
print(summary(seaflow))


print ( nrow(seaflow))

print( head(seaflow))

# Split the data 50/50 into training and test sets (partition balanced on file_id)
set.seed(555)
trainIndex <- createDataPartition(seaflow$file_id, p = 0.5, list = FALSE, times = 1)
train <- seaflow[trainIndex, ]
test <- seaflow[-trainIndex, ]



print(mean(train$time))

# Scatter plot of pe against chl_small, coloured by population
p <- ggplot(seaflow, aes(pe, chl_small, color = pop)) + geom_point()
dev.new(width = 15, height = 14)
print(p)
ggsave("~/predictiveanalytics.png", width = 4, height = 4, dpi = 100)

# Decision tree predicting pop from the measurement columns
fol <- formula(pop ~ fsc_small + fsc_perp + fsc_big + pe + chl_big + chl_small)
model <- rpart(fol, method = "class", data = train)
print(model)
#plot(model)
#text(model, use.n = TRUE, all=TRUE, cex=0.9)

testprediction <- predict( model, newdata=test, type="class")
comparisonofpredictions <- testprediction == test$pop
accuracy <- sum(comparisonofpredictions) / length(comparisonofpredictions)

print( accuracy )

# Random forest with the same formula
randomforestmodel <- randomForest(fol, data = train)
print(randomforestmodel)

testpredictionusingrandomforest <- predict( randomforestmodel, newdata=test, type="class")
comparisonofpredictions <- testpredictionusingrandomforest == test$pop
accuracy <- sum(comparisonofpredictions) / length(comparisonofpredictions)
print( accuracy )

print(importance(randomforestmodel))

# Support vector machine with the same formula
svmmodel <- svm(fol, data = train)

testpredictionusingsvm <- predict( svmmodel, newdata=test, type="class")
comparisonofpredictions <- testpredictionusingsvm == test$pop
accuracy <- sum(comparisonofpredictions) / length(comparisonofpredictions)
print( accuracy )

[Plot: predictiveanalytics.png]

StatET for R

I have probably done this a hundred times, but recording these steps is still useful.
Apart from installing R, Eclipse, and the StatET plugin, these are the other steps needed to use R in Eclipse.

PATH
C:\Program Files\Java\jdk1.7.0_75\jre\bin

JAVA_HOME
C:\Program Files\Java\jdk1.7.0_75

> install.packages(c("rj", "rj.gd"), repos="http://download.walware.de/rj-2.0")
trying URL 'http://download.walware.de/rj-2.0/bin/windows/contrib/3.2/rj_2.0.4-2.zip'
Content type 'application/zip' length 378433 bytes (369 KB)
downloaded 369 KB

trying URL 'http://download.walware.de/rj-2.0/bin/windows/contrib/3.2/rj.gd_2.0.0-1.zip'
Content type 'application/zip' length 93519 bytes (91 KB)
downloaded 91 KB

package 'rj' successfully unpacked and MD5 sums checked
package 'rj.gd' successfully unpacked and MD5 sums checked

The downloaded binary packages are in
C:\Users\476458\AppData\Local\Temp\RtmpCsk978\downloaded_packages

Cubie Board 4 unboxing

As part of my effort to code OCaml and create a unikernel I was introduced to https://mirage.io/, and that in turn led me to http://cubieboard.org/. I still don’t know if the Cubie Board 4 can be used to do what they have done with earlier versions of the hardware. I have to try.

Microsoft: Data Science and Machine Learning Essentials

After completing this edX course successfully, I identified these questions which I answered wrongly. In some cases I selected more than the required number of options due to oversight.

I have marked the likely answers.

I need a longer article to explain what I learnt, and I plan to write it soon.

You have amassed a large volume of customer data, and want to determine if it is possible to identify distinct categories of customer based on similar characteristics.

What kind of predictive model should you create?

    1. Regression
    2. Clustering
    3. Recommender
    4. Classification

You discover that there are missing values for an unordered numeric column in your data.
Which three approaches can you consider using to treat the missing values?

    1. Substitute the text “None”.
    2. Forward fill or back fill the value.
    3. Remove rows in which the value is missing.
    4. Interpolate a value to replace the missing value.
    5. Substitute the numeral 0

When assessing the residuals of a regression model you observe the following:

Residuals exhibit a persistent structure and are not randomly distributed with respect to values of the label or the features.
The Q-Q normal plots of the residuals show significant curvature and the presence of outliers.
Given these results, which two of the following things should you try to improve the model?

    1. Cross validate the model to ensure that it will generalize properly.
    2. Try a different class of regression model that might better fit the problem.
    3. Create some engineered features with behaviors more closely tracking the values of the label.
    4. Add a Sweep Parameters module with the Metric for measuring performance for classification property set to Accuracy.

You create an experiment that uses a Train Matchbox Recommender module to train a recommendation model, and add a Score Matchbox Recommender module to generate a prediction. You want to use the model in a music streaming service to recommend songs for the currently logged-in user. Which recommender prediction kind should you configure the Score Matchbox Recommender module to use?

    1. Item Recommendation
    2. Related Items
    3. Rating Prediction
    4. Related Users

While exploring a dataset you discover a nonlinear relationship between certain features and the label. Which two of the following feature engineering steps should you try before training a supervised machine learning model?

    1. Ensure the features are linearly independent.
    2. Compute new features based on polynomial values of the original features.
    3. Compute mathematical combinations of the label and other features.
    4. Compute new features based on logarithms or exponentiation of these original features.

Which two of the following approaches can you use to determine which features to prune in an Azure ML experiment?

    1. Use the Permutation Feature Importance model to identify features of near-zero importance.
    2. Use the Cross Validation module to identify folds which indicate the model does not generalize well.
    3. Prune features one at a time to find features which reduce model performance or have no impact on model performance as measured with the Evaluate Model module.
    4. Use the Split module to create training, test and evaluation data sub-sets to evaluate model performance.

Gradient Descent

I ported the Gradient Descent code from Octave to Python. The base Octave code is the one from Andrew Ng’s Machine Learning MOOC.

I mistakenly believed that the Octave code for matrix multiplication would translate directly to Python.

The matrices are these.
[Screenshot: the matrices]

But the Octave code is this

Octave code

  theta = theta - ( (  alpha * ( (( theta' * X' )' - y)' * X ))/length(y) )'

and the Python code is this.

Python

import numpy as np

def gradientDescent( X,
                     y,
                     theta,
                     alpha = 0.01,
                     num_iters = 1500):

    r, c = X.shape   # r = number of training examples

    # Note: range(1, num_iters) performs num_iters - 1 updates,
    # one fewer than Octave's "for iter = 1:num_iters".
    for iter in range( 1, num_iters ):
        theta = theta - ( ( alpha * np.dot( X.T, ( np.dot( X , theta ).T - np.asarray(y) ).T ) ) / r )
    return theta

This line is not a direct translation.

        theta = theta - ( ( alpha * np.dot( X.T, ( np.dot( X , theta ).T - np.asarray(y) ).T ) ) / r )

But only the above Python code gives me the correct theta that matches the value given by the Octave code.
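
For comparison, here is a minimal sketch of the same vectorized update with the transposes removed. It assumes X is an m-by-n NumPy array, theta an n-by-1 column vector, and y reshaped into an m-by-1 column up front; gradient_descent_simple is a hypothetical name, not something from the course code.

import numpy as np

# Hypothetical sketch, not the original implementation: the same update,
# theta := theta - (alpha/m) * X' * (X*theta - y), with y forced into a column.
def gradient_descent_simple(X, y, theta, alpha=0.01, num_iters=1500):
    m = X.shape[0]                      # number of training examples
    y = np.asarray(y).reshape(m, 1)     # (m, 1) column vector of targets
    for _ in range(num_iters):
        error = X.dot(theta) - y                       # (m, 1) prediction errors
        theta = theta - (alpha / m) * X.T.dot(error)   # (n, 1) update
    return theta

The extra transposes in my version appear to compensate for y arriving as a row vector; reshaping y into a column before the loop makes them unnecessary.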

[Screenshot: the matching theta values]

[Plot: Linear Regression (gradientdescent)]

But the gradient descent also does not give me the correct value after a certain number of iterations, although the cost values are similar.

Gradient Descent from Octave Code that converges

[Plot: Octave-Contour]

Minimization of cost

Initial cost is 640.125590
J = 656.25
Initial cost is 656.250475
J = 672.58
Initial cost is 672.583001
J = 689.12
Initial cost is 689.123170
J = 705.87
Initial cost is 705.870980
J = 722.83
Initial cost is 722.826433
J = 739.99
Initial cost is 739.989527

Gradient Descent from my Python Code that does not converge to the optimal value

[Plot: gradientdescent1]

Minimization of cost

635.81837438
651.963633303
668.316534159
684.877076945
701.645261664
718.621088313
735.804556895
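
For reference, the cost being minimized is the course's least-squares cost J(theta) = (1/(2m)) * sum((X*theta - y)^2). Here is a minimal Python sketch under the same shape assumptions as above; compute_cost is my name for the helper, not something from the original code.

import numpy as np

# Hypothetical helper: least-squares cost J(theta) = (1/2m) * sum((X*theta - y)^2),
# assuming X is (m, n), y is (m, 1) and theta is (n, 1).
def compute_cost(X, y, theta):
    m = X.shape[0]
    error = X.dot(theta) - y        # (m, 1) residuals
    return np.sum(error ** 2) / (2.0 * m)

Printing this value at every iteration in both implementations makes it easier to see where the two J sequences start to drift apart.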

Azure Machine Learning

The Azure ML Studio user interface is slick and very responsive, and it adopts a workflow that supports both R and Python scripts. There is a free account available with the caveat below, but that did not hamper my efforts to test some simple flows.

Note: Your free-tier Azure ML account allows you unlimited access, with some reduced capabilities compared to a full Microsoft Azure subscription. Your experiments will only run at low priority on a single processor core. As a result, you will experience some longer wait times. However, you have full access to all features of Azure ML.

The graph visualizations are very spiffy too. I am yet to finish the data cleansing aspects and use the really interesting ML algorithms.

[Screenshot: Azure ML Studio]