Commit 26209a4

Define git attributes for renormalization. (dmlc#8921)
1 parent: a2cdba5

File tree

10 files changed: +1618 -1600 lines

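The summary numbers are self-consistent with a pure renormalization commit: every pre-existing file contributes equal additions and deletions (each line is removed with CRLF and re-added with LF), so the net line delta must equal the size of the one genuinely new file, the 18-line .gitattributes. A quick sanity check on the figures above:

```python
# Sanity check on the diff summary: in a renormalization-only commit,
# additions and deletions cancel for every pre-existing file, so the
# net delta equals the size of the newly added .gitattributes.
additions, deletions = 1618, 1600
new_gitattributes_lines = 18
net = additions - deletions
assert net == new_gitattributes_lines
print(net)
```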

.gitattributes

Lines changed: 18 additions & 0 deletions
@@ -0,0 +1,18 @@
+* text=auto
+
+*.c text eol=lf
+*.h text eol=lf
+*.cc text eol=lf
+*.cuh text eol=lf
+*.cu text eol=lf
+*.py text eol=lf
+*.txt text eol=lf
+*.R text eol=lf
+*.scala text eol=lf
+*.java text eol=lf
+
+*.sh text eol=lf
+
+*.rst text eol=lf
+*.md text eol=lf
+*.csv text eol=lf
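A minimal sketch (not part of the commit itself) of how these attributes take effect: `* text=auto` lets git normalize text files, the per-extension `eol=lf` rules force LF in the working tree, and already-tracked files with CRLF endings are rewritten with `git add --renormalize` (git >= 2.16). The repository layout and file names below are illustrative only:

```shell
set -e
tmp=$(mktemp -d)
git init -q "$tmp/repo"
cd "$tmp/repo"
git config user.email demo@example.com
git config user.name demo
# Simulate a source file that was committed with CRLF endings.
printf 'int main() {}\r\n' > demo.c
git add demo.c
git commit -qm 'add file with CRLF endings'
# Introduce attributes like the ones in this commit, then renormalize.
printf '* text=auto\n*.c text eol=lf\n' > .gitattributes
git add --renormalize .
git commit -qm 'Define git attributes for renormalization.'
# The blob stored in the repository now contains LF endings only.
git cat-file -p :demo.c | od -An -c
```

Renormalizing once in a dedicated commit, as done here, keeps the noisy equal-add/equal-delete churn out of later, substantive diffs.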
Lines changed: 30 additions & 30 deletions

All 30 lines are removed and re-added unchanged (end-of-line renormalization only), so the file's content is shown once:

XGBoost4J Code Examples
=======================

## Java API
* [Basic walkthrough of wrappers](src/main/java/ml/dmlc/xgboost4j/java/example/BasicWalkThrough.java)
* [Customize loss function, and evaluation metric](src/main/java/ml/dmlc/xgboost4j/java/example/CustomObjective.java)
* [Boosting from existing prediction](src/main/java/ml/dmlc/xgboost4j/java/example/BoostFromPrediction.java)
* [Predicting using first n trees](src/main/java/ml/dmlc/xgboost4j/java/example/PredictFirstNtree.java)
* [Generalized Linear Model](src/main/java/ml/dmlc/xgboost4j/java/example/GeneralizedLinearModel.java)
* [Cross validation](src/main/java/ml/dmlc/xgboost4j/java/example/CrossValidation.java)
* [Predicting leaf indices](src/main/java/ml/dmlc/xgboost4j/java/example/PredictLeafIndices.java)
* [External Memory](src/main/java/ml/dmlc/xgboost4j/java/example/ExternalMemory.java)
* [Early Stopping](src/main/java/ml/dmlc/xgboost4j/java/example/EarlyStopping.java)

## Scala API

* [Basic walkthrough of wrappers](src/main/scala/ml/dmlc/xgboost4j/scala/example/BasicWalkThrough.scala)
* [Customize loss function, and evaluation metric](src/main/scala/ml/dmlc/xgboost4j/scala/example/CustomObjective.scala)
* [Boosting from existing prediction](src/main/scala/ml/dmlc/xgboost4j/scala/example/BoostFromPrediction.scala)
* [Predicting using first n trees](src/main/scala/ml/dmlc/xgboost4j/scala/example/PredictFirstNTree.scala)
* [Generalized Linear Model](src/main/scala/ml/dmlc/xgboost4j/scala/example/GeneralizedLinearModel.scala)
* [Cross validation](src/main/scala/ml/dmlc/xgboost4j/scala/example/CrossValidation.scala)
* [Predicting leaf indices](src/main/scala/ml/dmlc/xgboost4j/scala/example/PredictLeafIndices.scala)
* [External Memory](src/main/scala/ml/dmlc/xgboost4j/scala/example/ExternalMemory.scala)

## Spark API
* [Distributed Training with Spark](src/main/scala/ml/dmlc/xgboost4j/scala/example/spark/SparkMLlibPipeline.scala)

## Flink API
* [Distributed Training with Flink](src/main/scala/ml/dmlc/xgboost4j/scala/example/flink/DistTrainWithFlink.scala)
Lines changed: 66 additions & 66 deletions

All 66 lines are removed and re-added unchanged (end-of-line renormalization only); the file's content:

0,10.0229017899,7.30178495562,0.118115020017,1
0,9.93639621859,9.93102159291,0.0435030004396,1
0,10.1301737265,0.00411765220572,2.4165878053,1
1,9.87828587087,0.608588414992,0.111262590883,1
0,10.1373430048,0.47764012225,0.991553052194,1
0,10.0523814718,4.72152505167,0.672978832666,1
0,10.0449715742,8.40373928536,0.384457573667,1
1,996.398498791,941.976309154,0.230269231292,2
0,1005.11269468,900.093680877,0.265031528873,2
0,997.160349441,891.331101688,2.19362017313,2
0,993.754139031,44.8000165317,1.03868009875,2
1,994.831299184,241.959208453,0.667631827024,2
0,995.948333283,7.94326917112,0.750490877118,3
0,989.733981273,7.52077625436,0.0126335967282,3
0,1003.54086516,6.48177510564,1.19441696788,3
0,996.56177804,9.71959812613,1.33082465111,3
0,1005.61382467,0.234339369309,1.17987797356,3
1,980.215758708,6.85554542926,2.63965085259,3
1,987.776408872,2.23354609991,0.841885278028,3
0,1006.54260396,8.12142049834,2.26639471174,3
0,1009.87927639,6.40028519044,0.775155669615,3
0,9.95006244393,928.76896718,234.948458244,4
1,10.0749152258,255.294574476,62.9728604166,4
1,10.1916541988,312.682867085,92.299413677,4
0,9.95646724484,742.263188416,53.3310473654,4
0,9.86211293222,996.237023866,2.00760301168,4
1,9.91801019468,303.971783709,50.3147230679,4
0,996.983996934,9.52188222766,1.33588120981,5
0,995.704388126,9.49260524915,0.908498516541,5
0,987.86480767,0.0870786716821,0.108859297837,5
0,1000.99561307,2.85272694575,0.171134518956,5
0,1011.05508066,7.55336771768,1.04950084825,5
1,985.52199365,0.763305780608,1.7402424375,5
0,10.0430321467,813.185427181,4.97728254185,6
0,10.0812334228,258.297288417,0.127477670549,6
0,9.84210504292,887.205815261,0.991689193955,6
1,9.94625332613,0.298622762132,0.147881353231,6
0,9.97800659954,727.619819757,0.0718361141866,6
1,9.8037938472,957.385549617,0.0618862028941,6
0,10.0880634741,185.024638577,1.7028095095,6
0,9.98630799154,109.10631473,0.681117359751,6
0,9.91671416638,166.248076588,122.538291094,7
0,10.1206910464,88.1539468531,141.189859069,7
1,10.1767160518,1.02960996847,172.02256237,7
0,9.93025147233,391.196641942,58.040338247,7
0,9.84850936037,474.63346537,17.5627875397,7
1,9.8162731343,61.9199554213,30.6740972851,7
0,10.0403482984,987.50416929,73.0472906209,7
1,997.019228359,133.294717663,0.0572254083186,8
0,973.303999107,1.79080888849,0.100478717048,8
0,1008.28808825,342.282350685,0.409806485495,8
0,1014.55621524,0.680510407082,0.929530602495,8
1,1012.74370325,823.105266455,0.0894693730585,8
0,1003.63554038,727.334432075,0.58206275756,8
0,10.1560432436,740.35938307,11.6823378533,9
0,9.83949099701,512.828227154,138.206666681,9
1,10.1837395682,179.287126088,185.479062365,9
1,9.9761881495,12.1093388336,9.1264604171,9
1,9.77402180766,318.561317743,80.6005221355,9
0,1011.15705381,0.215825852155,1.34429667906,10
0,1005.60353229,727.202346126,1.47146041005,10
1,1013.93702961,58.7312725205,0.421041560754,10
0,1004.86813074,757.693204258,0.566055205344,10
0,999.996324692,813.12386828,0.864428279513,10
0,996.55255931,918.760056995,0.43365051974,10
1,1004.1394132,464.371823646,0.312492288321,10
