---
tags:
- tabular-classification
- sklearn
dataset:
- titanic
widget:
  structuredData:
    PassengerId:
      - 1191
    Pclass:
      - 1
    Name:
      - Sherlock Holmes
    Sex:
      - male
    SibSp:
      - 0
    Parch:
      - 0
    Ticket:
      - C.A.29395
    Fare:
      - 12
    Cabin:
      - F44
    Embarked:
      - S
---

## Titanic (Survived/Not Survived) - Binary Classification

This repository provides a scikit-learn preprocessing pipeline and a Keras neural network for predicting whether a Titanic passenger survived.

### How to use

```python
from huggingface_hub import hf_hub_download
import joblib
import pandas as pd
import numpy as np
from tensorflow.keras.models import load_model

REPO_ID = 'danupurnomo/dummy-titanic'
PIPELINE_FILENAME = 'final_pipeline.pkl'
TF_FILENAME = 'titanic_model.h5'

# Download and load the scikit-learn preprocessing pipeline.
model_pipeline = joblib.load(
    hf_hub_download(repo_id=REPO_ID, filename=PIPELINE_FILENAME)
)

# Download and load the Keras classification model.
model_seq = load_model(
    hf_hub_download(repo_id=REPO_ID, filename=TF_FILENAME)
)
```
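
To confirm both artifacts loaded correctly, a quick inspection can help. This is a minimal sketch, assuming the pickle is a standard scikit-learn pipeline/transformer and the `.h5` file is a Keras model:

```python
# Inspect the loaded artifacts (assumes a scikit-learn pipeline and a Keras model).
print(type(model_pipeline))   # e.g. sklearn.pipeline.Pipeline
model_seq.summary()           # layer-by-layer overview of the neural network
```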

### Example of New Data
```python
new_data = {
    'PassengerId': 1191,
    'Pclass': 1, 
    'Name': 'Sherlock Holmes', 
    'Sex': 'male', 
    'Age': 30, 
    'SibSp': 0,
    'Parch': 0, 
    'Ticket': 'C.A.29395', 
    'Fare': 12, 
    'Cabin': 'F44', 
    'Embarked': 'S'
}
# Wrap the single record in a one-row DataFrame, as expected by the pipeline.
new_data = pd.DataFrame([new_data])
```

### Transform the Inference Set
```python
# Apply the same preprocessing used during training.
new_data_transform = model_pipeline.transform(new_data)
```

### Predict with the Neural Network
```python
# The model outputs a survival probability; threshold at 0.5 for a 0/1 prediction.
y_pred_inf_single = model_seq.predict(new_data_transform)
y_pred_inf_single = np.where(y_pred_inf_single >= 0.5, 1, 0)
print('Result:', y_pred_inf_single)
# [[0]]
```
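
The same two steps also work for batch inference. The sketch below uses a hypothetical helper (`predict_survival`, not part of this repository) to preprocess a DataFrame of several passengers and map the 0/1 outputs to readable labels, reusing the objects loaded above:

```python
def predict_survival(df: pd.DataFrame) -> pd.Series:
    """Hypothetical helper: preprocess a batch of passengers and predict survival."""
    X = model_pipeline.transform(df)          # same preprocessing as above
    proba = model_seq.predict(X).ravel()      # survival probabilities
    labels = np.where(proba >= 0.5, 'Survived', 'Not Survived')
    return pd.Series(labels, index=df.index)

# Example: a two-passenger batch built from the record above.
batch = pd.concat([new_data, new_data], ignore_index=True)
print(predict_survival(batch))
```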