Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertModel: ['cls.predictions.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.LayerNorm.bias', 'cls.seq_relationship.bias', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.dense.weight', 'cls.predictions.transform.LayerNorm.weight']
- This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
Data read successfully
Started running:
val_data size =  20
train_data size =  1780
Validation embeddings calculated
Train embeddings calculated
/home/kiranpurohit/miniconda3/envs/arm/lib/python3.9/site-packages/sklearn/cluster/_kmeans.py:1412: FutureWarning: The default value of `n_init` will change from 10 to 'auto' in 1.4. Set the value of `n_init` explicitly to suppress the warning
  super()._check_params_vs_input(X, default_n_init=10)
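The FutureWarning above is scikit-learn announcing that KMeans' default `n_init` changes from 10 to `'auto'` in 1.4; passing `n_init` explicitly silences it. A minimal sketch under assumed parameters (the script's actual cluster count and input embeddings are not shown in the log):

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))  # stand-in for the train embeddings

# Setting n_init explicitly avoids the FutureWarning seen in the log.
km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X)
labels = km.labels_
```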

predicting: 100%|██████████| 10/10 [10:43<00:00, 64.35s/it]

predicting: 100%|██████████| 5/5 [05:37<00:00, 67.41s/it]

********* LLM LOSS ON U FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.15, 0.1, 0.2, 0.15, 0.15, 0.2, 0.15, 0.2, 0.15, 0.1]]
AVG_LLM_loss_on_VAL_data  [0.15499999999999997]
MIN_LLM_loss_on_VAL_data  [0.1]
MAX_LLM_loss_on_VAL_data  [0.2]
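The AVG/MIN/MAX summary lines can be reproduced directly from the per-subset loss list printed above; a minimal sketch using the values copied from the log:

```python
# Per-subset validation losses, copied from the log's LLM_loss_on_val line.
llm_loss_on_val = [0.15, 0.1, 0.2, 0.15, 0.15, 0.2, 0.15, 0.2, 0.15, 0.1]

avg_loss = sum(llm_loss_on_val) / len(llm_loss_on_val)  # ~0.155
min_loss = min(llm_loss_on_val)
max_loss = max(llm_loss_on_val)
```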

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.2, 0.15, 0.15, 0.15, 0.15]]
AVG_LLM_loss_on_VAL_data  [0.16]
MIN_LLM_loss_on_VAL_data  [0.15]
MAX_LLM_loss_on_VAL_data  [0.2]

overlaps  [[0, 0, 0, 0, 0, 1, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0]]

AVG_overlap  [0.04444444444444444]
MIN_overlap  [0.0]
MAX_overlap  [0.1111111111111111]
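The overlap_for_subset values are consistent with taking the mean of each 0/1 row of `overlaps` (each row compares one subset against the 9 others, hence the 1/9 steps). A sketch reproducing the printed statistics from the logged matrix:

```python
# 0/1 overlap matrix copied from the log (10 subsets x 9 comparisons each).
overlaps = [
    [0, 0, 0, 0, 0, 1, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0],
    [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0],
]

# Fraction of the other subsets each subset overlaps with.
overlap_for_subset = [sum(row) / len(row) for row in overlaps]
avg_overlap = sum(overlap_for_subset) / len(overlap_for_subset)
```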

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [ 8.32667268e-17 -2.44249065e-14 -2.30926389e-14 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]
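Given `W_V_val` of shape (300, 1780) and `alpha` of shape (1780,), a plausible reading is that `alpha` is a least-squares fit of `W_V_val @ alpha` to the 0/1 LLM loss vector. The actual solver is not shown in the log, so this sketch uses toy dimensions and synthetic data in place of the real matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-ins for the log's (300, 1780) weight matrix and 0/1 losses.
W_V_val = rng.normal(size=(30, 50))                   # (n_val_rows, n_train)
llm_loss = rng.integers(0, 2, size=30).astype(float)  # 0/1 loss vector

# Minimum-norm least-squares solution, one candidate for how alpha arises.
alpha, *_ = np.linalg.lstsq(W_V_val, llm_loss, rcond=None)
approximation = W_V_val @ alpha
```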

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.29460647  0.26154552  0.15166271  0.06404772  0.27632569  0.26214027
  0.0662502   0.36358289 -0.11946059  0.0464381   0.01854094  0.04855094
  0.18748532  0.23275562  0.09336656  0.14861169 -0.07188396  0.26429701
  0.25695328  0.02928896 -0.02425904 -0.03669311  0.03753481 -0.04579261
  0.1755991   0.11111902  0.12441864  0.09863437  0.04408134 -0.00319092
  0.10785693  0.14378612  0.15423443  0.22149296  0.18858163  0.24732777
  0.22918505  0.03903969  0.11364125  0.03638176  0.02176673  0.23598889
  0.16948635  0.12248069  0.20584531  0.08887238  0.37736907  0.21741253
  0.09477465  0.3437111   0.04582831  0.16983141  0.15278947  0.13584087
  0.25545272  0.38161395  0.39906355  0.16437464  0.33252611 -0.01288413
  0.0829126   0.11487526  0.1539195   0.17328597  0.23189118  0.10922744
  0.20559451  0.16060984  0.16865488  0.09776865  0.15114297  0.07092137
  0.16946339  0.23196934  0.09621194  0.14616045  0.05782314  0.19887827
  0.16726362  0.19072881  0.06688485  0.16907676  0.26326138 -0.13421286
 -0.1438237   0.09592797  0.11473727  0.49659146 -0.21825259 -0.29328978
  0.24483943  0.31037272  0.15512905  0.23838402  0.0052727   0.94296408
  0.05530935  0.12731877  0.49425189  0.01364925 -0.13931926 -0.04365052
  0.04413684  0.15553945  0.59433572  0.34540291  0.16596669  0.05193822
  0.38569279 -0.00559306  0.14610908  0.34030997 -0.03494586  0.11750617
  0.09332366  0.77539562  0.40580988 -0.07045674  0.54327488  0.19745645
  0.19993214  0.13238042  0.09029695  0.15918017  0.04425801  0.21528888
  0.12638944  0.12742227  0.16109531  0.11095623  0.20162665  0.13362489
  0.1379822   0.19320376  0.21949194  0.14285242  0.2348726   0.0165588
  0.20327844  0.08712908  0.2599169   0.26482987  0.15237018 -0.07610492
  0.32128748  0.36857636  0.15807833  0.57481906  0.11623678  0.07355014
  0.31163284 -0.03141206  0.2228866   0.19222232 -0.38986728  0.29224754
  0.25772187  0.06850257  0.60738344  0.15623396  0.31872549  0.13011788
  0.38980602  0.39711702  0.10857213  0.15323948  0.02470441  0.37981967
 -0.10668708 -0.12461268  0.01400579  0.21250035  0.21723868  0.14090335
 -0.06274862  0.35744731  0.04840233 -0.00507772  0.19521858  0.11260778
 -0.0200434   0.3007929   0.30578132 -0.1079817   0.13011963 -0.09287441
  0.05613602  0.10553568  0.12183788  0.24498373  0.2314615  -0.1211509
 -0.09439778  0.1757196  -0.03659035  0.37529507  0.13831771 -0.13810038
  0.33895714 -0.01354848]

approx error on U on val data  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.23631374392237953, 0.24953571779089506, 0.27812678886179165, 0.231729063686265, 0.18605605800596603]]
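The per-subset errors printed above are real numbers in roughly the 0.17-0.29 range, consistent with averaging, per subset, the gap between the real-valued approximation and the 0/1 LLM loss. The script's exact error definition is not shown, so this is a hedged sketch with synthetic values:

```python
import numpy as np

rng = np.random.default_rng(1)
n_subsets, n_val = 10, 20  # matches 10 subsets x 20 validation points

# Synthetic stand-ins for the logged approximation and 0/1 loss vectors.
approximation = rng.normal(0.15, 0.15, size=n_subsets * n_val)
llm_loss = rng.integers(0, 2, size=n_subsets * n_val)

# One plausible definition: mean absolute gap, chunked per subset.
errors = [
    float(np.abs(approximation[i * n_val:(i + 1) * n_val]
                 - llm_loss[i * n_val:(i + 1) * n_val]).mean())
    for i in range(n_subsets)
]
```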

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.24388856  0.27213478  0.07234203  0.24093345  0.16012335  0.17980979
 -0.04572754  0.37154459  0.3566826   0.17590773  0.2067586   0.17899704
  0.01171382  0.32880251  0.08251693  0.19044212  0.22701157  0.30370319
  0.31201952  0.11744877  0.244597    0.2438112   0.12407034  0.24354973
  0.24162471  0.25969066  0.17583864  0.34431378  0.24559344  0.30331461
  0.03346893 -0.05167983 -0.07732801  0.07754612 -0.09303824  0.09500232
  0.04609315  0.08370172  0.45207961 -0.15110684  0.23591066  0.19186738
 -0.00271636 -0.00313564  0.25652739  0.26742253  0.38607681  0.40061012
  0.03983689  0.28734999 -0.10700802  0.18493973 -0.18389187 -0.09906707
  0.0593296   0.45873597  0.14979805  0.03701621  0.15913171  0.20638945
  0.12753548  0.12151821  0.10932383  0.15769412  0.20742612  0.13621344
  0.13921045  0.15817476  0.18078446  0.11218243  0.14158234  0.14887209
  0.12442868  0.12927777  0.13669158  0.14132383  0.16534766  0.13670896
  0.13519609  0.19552306 -0.04416396  0.22088944  0.10023838 -0.05190484
  0.06731157  0.33296414  0.42074982  0.48719535  0.20122601  0.20263447
  0.04127338 -0.10469716  0.02816082  0.25374262 -0.18238243  0.12457505
  0.23099551  0.03546139  0.4768922   0.07191378]

approx error on V on Val data  [[0.3005124651610089, 0.24023287340261276, 0.23399029277961225, 0.25178130044886216, 0.22510235601642412]]

predicting: 100%|██████████| 1/1 [01:09<00:00, 69.13s/it]

Make new V by taking top v highest loss subsets from L \ U
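The step above, "taking top v highest loss subsets from L \ U", presumably amounts to ranking the candidate subsets by their validation loss and keeping the v worst. The subset ids and loss values in this sketch are illustrative, not taken from the script:

```python
# Hypothetical per-subset losses over the candidate pool L \ U.
candidate_losses = {0: 0.15, 1: 0.25, 2: 0.15, 3: 0.15, 4: 0.25, 5: 0.10}
v = 2  # number of subsets to keep in the new V

# Rank candidates by loss, highest first, and keep the v worst.
new_V = sorted(candidate_losses, key=candidate_losses.get, reverse=True)[:v]
```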

predicting: 100%|██████████| 5/5 [05:20<00:00, 64.16s/it]

***********************************
S_worst_ind  5
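`S_worst_ind` is plausibly the index of the highest-loss subset currently in U; the log does not show the exact selection or tie-breaking rule, so the losses below are illustrative only:

```python
# Hypothetical per-subset losses on U; index 5 holds the unique maximum.
losses_on_U = [0.15, 0.1, 0.2, 0.15, 0.15, 0.25, 0.15, 0.2, 0.15, 0.1]

# Index of the worst (highest-loss) subset.
s_worst_ind = max(range(len(losses_on_U)), key=losses_on_U.__getitem__)
```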

********* LLM LOSS ON U FOR VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

AVG_LLM_loss_on_VAL_data  [0.15499999999999997, 0.15]

MIN_LLM_loss_on_VAL_data  [0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.2]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.2, 0.15, 0.15, 0.15, 0.15], [0.15, 0.25, 0.15, 0.15, 0.25]]

AVG_LLM_loss_on_VAL_data  [0.16, 0.19]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.2, 0.25]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.94606475e-01  2.61545522e-01  1.51662712e-01  6.40477201e-02
  2.76325687e-01  2.62140267e-01  6.62502007e-02  3.63582888e-01
 -1.19460594e-01  4.64381020e-02  1.85409382e-02  4.85509389e-02
  1.87485315e-01  2.32755619e-01  9.33665571e-02  1.48611690e-01
 -7.18839629e-02  2.64297012e-01  2.56953275e-01  2.92889556e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  2.17667257e-02  2.35988888e-01  1.69486346e-01  1.22480693e-01
  2.05845307e-01  8.88723787e-02  3.77369074e-01  2.17412530e-01
  9.47746482e-02  3.43711100e-01  4.58283116e-02  1.69831406e-01
  1.52789471e-01  1.35840869e-01  2.55452715e-01  3.81613955e-01
  3.99063546e-01  1.64374640e-01  3.32526109e-01 -1.28841342e-02
  8.29125998e-02  1.14875258e-01  1.53919505e-01  1.73285971e-01
  2.31891181e-01  1.09227436e-01  2.05594510e-01  1.60609841e-01
  1.68654882e-01  9.77686463e-02  1.51142967e-01  7.09213734e-02
  1.69463395e-01  2.31969338e-01  9.62119382e-02  1.46160454e-01
  5.78231352e-02  1.98878265e-01  1.67263619e-01  1.90728809e-01
  6.68848479e-02  1.69076763e-01  2.63261380e-01 -1.34212859e-01
 -1.43823703e-01  9.59279705e-02  1.14737274e-01  4.96591460e-01
 -2.18252586e-01 -2.93289775e-01  2.44839428e-01  3.10372718e-01
  1.55129046e-01  2.38384017e-01  5.27269979e-03  9.42964078e-01
  5.53093472e-02  1.27318774e-01  4.94251886e-01  1.36492500e-02
  1.99932140e-01  1.32380422e-01  9.02969476e-02  1.59180175e-01
  4.42580077e-02  2.15288882e-01  1.26389442e-01  1.27422268e-01
  1.61095309e-01  1.10956233e-01  2.01626655e-01  1.33624890e-01
  1.37982195e-01  1.93203762e-01  2.19491936e-01  1.42852423e-01
  2.34872603e-01  1.65588035e-02  2.03278438e-01  8.71290829e-02
  2.59916901e-01  2.64829866e-01  1.52370179e-01 -7.61049227e-02
  3.21287477e-01  3.68576357e-01  1.58078332e-01  5.74819056e-01
  1.16236779e-01  7.35501367e-02  3.11632841e-01 -3.14120642e-02
  2.22886601e-01  1.92222323e-01 -3.89867282e-01  2.92247538e-01
  2.57721868e-01  6.85025712e-02  6.07383441e-01  1.56233959e-01
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -2.00434036e-02  3.00792896e-01  3.05781320e-01 -1.07981702e-01
  1.30119634e-01 -9.28744066e-02  5.61360173e-02  1.05535682e-01
  1.21837878e-01  2.44983732e-01  2.31461499e-01 -1.21150900e-01
 -9.43977808e-02  1.75719601e-01 -3.65903482e-02  3.75295067e-01
  1.38317710e-01 -1.38100375e-01  3.38957135e-01 -1.35484756e-02
 -3.28070740e+00 -3.42991173e+00 -3.42420540e+00 -3.45896505e+00
 -3.29314399e+00 -3.23076975e+00 -3.15025804e+00 -3.13371592e+00
 -3.08615583e+00 -3.21389092e+00 -3.27734391e+00 -3.31705953e+00
 -3.39376599e+00 -3.30161237e+00 -3.55729176e+00 -3.01471206e+00
 -3.46873408e+00 -3.11900607e+00 -3.12434069e+00 -2.96463308e+00]

approx error on U for Validation Data after updating U  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.24953571779089506, 0.27812678886179165, 0.23172906368626495, 0.18605605800596603, 3.4120111785147245]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.23591066  0.19186738 -0.00271636 -0.00313564  0.25652739  0.26742253
  0.38607681  0.40061012  0.03983689  0.28734999 -0.10700802  0.18493973
 -0.18389187 -0.09906707  0.0593296   0.45873597  0.14979805  0.03701621
  0.15913171  0.20638945  0.24388856  0.27213478  0.07234203  0.24093345
  0.16012335  0.17980979 -0.04572754  0.37154459  0.3566826   0.17590773
  0.2067586   0.17899704  0.01171382  0.32880251  0.08251693  0.19044212
  0.22701157  0.30370319  0.31201952  0.11744877  1.13679373  1.16154372
  1.21952842  1.10089     1.14154842  1.08080851  1.22849285  1.15812401
  1.06711572  1.10928183  1.06043322  1.11318937  1.16785509  1.13999817
  1.12885917  1.02953844  1.20958443  1.06476325  1.15465443  1.02831273
  1.11152363  1.07202591  0.9793857   0.99959329  0.93456756  1.03885093
  0.97277164  1.04034898  0.86540629  1.05914044  0.89237209  1.0321424
  0.90443808  0.94266427  1.05598434  0.90180893  1.05534998  0.89774997
  0.9576291   0.77879302 -0.13931926 -0.04365052  0.04413684  0.15553945
  0.59433572  0.34540291  0.16596669  0.05193822  0.38569279 -0.00559306
  0.14610908  0.34030997 -0.03494586  0.11750617  0.09332366  0.77539562
  0.40580988 -0.07045674  0.54327488  0.19745645]

approx error on V for Validation Data after updating V  [[0.23399029277961225, 0.3176322143187879, 0.9750657754550233, 0.8386835253310766, 0.2517734533343107]]

overlaps  [[0, 0, 0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 1], [0, 0, 0, 0, 0, 0, 0, 0, 1]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668]
MIN_overlap  [0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111]

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [-2.10942375e-15  3.60822483e-15  3.55271368e-15 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.94639028e-01  2.61513208e-01  1.51664388e-01  6.40768062e-02
  2.76326904e-01  2.62147741e-01  6.62516853e-02  3.63621255e-01
 -1.19445038e-01  4.65115540e-02  1.85184953e-02  4.85897522e-02
  1.87433318e-01  2.32731686e-01  9.33624135e-02  1.48595674e-01
 -7.19051433e-02  2.64289738e-01  2.56880608e-01  2.92946313e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  2.17667257e-02  2.35988888e-01  1.69486346e-01  1.22480693e-01
  2.05845307e-01  8.88723787e-02  3.77369074e-01  2.17412530e-01
  9.47746482e-02  3.43711100e-01  4.58283116e-02  1.69831406e-01
  1.52789471e-01  1.35840869e-01  2.55452715e-01  3.81613955e-01
  3.99063546e-01  1.64374640e-01  3.32526109e-01 -1.28841342e-02
  8.29125998e-02  1.14875258e-01  1.53919505e-01  1.73285971e-01
  2.31891181e-01  1.09227436e-01  2.05594510e-01  1.60609841e-01
  1.68654882e-01  9.77686463e-02  1.51142967e-01  7.09213734e-02
  1.69463395e-01  2.31969338e-01  9.62119382e-02  1.46160454e-01
  5.78231352e-02  1.98878265e-01  1.67263619e-01  1.90728809e-01
  6.68848479e-02  1.69076763e-01  2.63261380e-01 -1.34212859e-01
 -1.43823703e-01  9.59279705e-02  1.14737274e-01  4.96591460e-01
 -2.18252586e-01 -2.93289775e-01  2.44839428e-01  3.10372718e-01
  1.55129046e-01  2.38384017e-01  5.27269979e-03  9.42964078e-01
  5.53093472e-02  1.27318774e-01  4.94251886e-01  1.36492500e-02
  2.18431896e-01  1.23118561e-01  7.39857366e-02  1.62271357e-01
  2.02300365e-02  2.18282623e-01  1.25732661e-01  1.35514912e-01
  1.56017717e-01  1.24537761e-01  2.04370955e-01  1.33785628e-01
  1.31901294e-01  2.14588152e-01  2.30977484e-01  1.40928622e-01
  2.47084483e-01  8.64868734e-04  2.07635408e-01  6.61524035e-02
  2.59916901e-01  2.64829866e-01  1.52370179e-01 -7.61049227e-02
  3.21287477e-01  3.68576357e-01  1.58078332e-01  5.74819056e-01
  1.16236779e-01  7.35501367e-02  3.11632841e-01 -3.14120642e-02
  2.22886601e-01  1.92222323e-01 -3.89867282e-01  2.92247538e-01
  2.57721868e-01  6.85025712e-02  6.07383441e-01  1.56233959e-01
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02]

approx error on U on val data  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.23631374392237953, 0.24953571779089506, 0.27812678886179165, 0.231729063686265, 0.18605605800596603], [0.2359801995957806, 0.17454631451517882, 0.28965629609284577, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626528, 0.18521048307998722, 0.2418033779932797]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
  3.41174882e-01  4.27342825e-01  9.60558839e-02  2.90891459e-01
  1.87647794e-01  1.43162203e-01 -1.93328239e-01  4.77217533e-01
  3.42479558e-01  3.58951543e-01  1.72823565e-01  1.84886145e-01
  8.99032858e-02  6.07586023e-01  3.88476546e-02  2.60105419e-01
  2.54445159e-01  4.90434536e-01  4.37961350e-01 -5.61842522e-02
 -2.06257653e-02  2.32841078e-01  1.00175377e-01  1.78154803e-02
  8.39878967e-02  1.02023138e-01  2.45930545e-01  2.67743003e-01
  1.98680866e-01  6.45206939e-03  1.86675166e-01  4.81147166e-02
  6.62610557e-02  1.20837788e-03  4.86321117e-03  6.08700303e-02
  4.05460305e-01  1.03996699e-01  5.06547748e-01  3.22387819e-01
  6.12385393e-02  2.32137165e-01  1.98688789e-01  1.84139459e-01
  1.16989540e-01  3.26926623e-01  3.91505268e-01  3.30080123e-01
  3.36053285e-01  6.67340547e-02 -2.46642291e-02 -1.50550206e-01
  1.57224091e-02 -1.18392278e-01  1.24946606e-02  3.60293611e-01
  2.44397853e-01 -2.51420642e-01  5.39348613e-01  6.03289277e-02
 -1.25130395e-01  1.37663914e-01  6.17736090e-02  2.26247796e-01
  6.65207092e-01  7.21089861e-01  2.78961289e-01  1.58502421e-01
  6.37836773e-01 -1.28755991e-02 -1.51177373e-02  4.51700161e-01
 -2.40283018e-02 -1.84183037e-02 -4.27463084e-02  8.18167145e-01
  3.09414697e-01 -4.63432849e-02  6.68470689e-01  2.19699952e-01]

approx error on V on Val data  [[0.3005124651610089, 0.24023287340261276, 0.23399029277961225, 0.25178130044886216, 0.22510235601642412], [0.23279130874330747, 0.3255196536083326, 0.21561693928679454, 0.2281330790423822, 0.22882604559648972]]

predicting: 100%|██████████| 1/1 [01:11<00:00, 71.72s/it]

Make new V by taking top v highest loss subsets from L \ U

predicting: 100%|██████████| 5/5 [05:15<00:00, 63.19s/it]

***********************************
S_worst_ind  2

********* LLM LOSS ON U FOR VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1]

AVG_LLM_loss_on_VAL_data  [0.15499999999999997, 0.15, 0.15499999999999997]

MIN_LLM_loss_on_VAL_data  [0.1, 0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.2, 0.25]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.2, 0.15, 0.15, 0.15, 0.15], [0.15, 0.25, 0.15, 0.15, 0.25], [0.15, 0.15, 0.25, 0.25, 0.15]]

AVG_LLM_loss_on_VAL_data  [0.16, 0.19, 0.19]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.2, 0.25, 0.25]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1]

approximation 
 [ 2.94639028e-01  2.61513208e-01  1.51664388e-01  6.40768062e-02
  2.76326904e-01  2.62147741e-01  6.62516853e-02  3.63621255e-01
 -1.19445038e-01  4.65115540e-02  1.85184953e-02  4.85897522e-02
  1.87433318e-01  2.32731686e-01  9.33624135e-02  1.48595674e-01
 -7.19051433e-02  2.64289738e-01  2.56880608e-01  2.92946313e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  8.29125998e-02  1.14875258e-01  1.53919505e-01  1.73285971e-01
  2.31891181e-01  1.09227436e-01  2.05594510e-01  1.60609841e-01
  1.68654882e-01  9.77686463e-02  1.51142967e-01  7.09213734e-02
  1.69463395e-01  2.31969338e-01  9.62119382e-02  1.46160454e-01
  5.78231352e-02  1.98878265e-01  1.67263619e-01  1.90728809e-01
  6.68848479e-02  1.69076763e-01  2.63261380e-01 -1.34212859e-01
 -1.43823703e-01  9.59279705e-02  1.14737274e-01  4.96591460e-01
 -2.18252586e-01 -2.93289775e-01  2.44839428e-01  3.10372718e-01
  1.55129046e-01  2.38384017e-01  5.27269979e-03  9.42964078e-01
  5.53093472e-02  1.27318774e-01  4.94251886e-01  1.36492500e-02
  2.18431896e-01  1.23118561e-01  7.39857366e-02  1.62271357e-01
  2.02300365e-02  2.18282623e-01  1.25732661e-01  1.35514912e-01
  1.56017717e-01  1.24537761e-01  2.04370955e-01  1.33785628e-01
  1.31901294e-01  2.14588152e-01  2.30977484e-01  1.40928622e-01
  2.47084483e-01  8.64868734e-04  2.07635408e-01  6.61524035e-02
  2.59916901e-01  2.64829866e-01  1.52370179e-01 -7.61049227e-02
  3.21287477e-01  3.68576357e-01  1.58078332e-01  5.74819056e-01
  1.16236779e-01  7.35501367e-02  3.11632841e-01 -3.14120642e-02
  2.22886601e-01  1.92222323e-01 -3.89867282e-01  2.92247538e-01
  2.57721868e-01  6.85025712e-02  6.07383441e-01  1.56233959e-01
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02
 -2.30607364e+00 -2.35904844e+00 -2.36922311e+00 -2.15663517e+00
 -2.11007008e+00 -2.09864429e+00 -2.17969707e+00 -2.21220909e+00
 -1.89687276e+00 -2.31641126e+00 -1.93286577e+00 -2.22948706e+00
 -2.40952906e+00 -2.29119473e+00 -2.22823479e+00 -2.09119346e+00
 -2.31069294e+00 -2.07687899e+00 -2.08071882e+00 -1.65494936e+00]

approx error on U for Validation Data after updating U  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.24953571779089506, 0.27812678886179165, 0.23172906368626495, 0.18605605800596603, 3.4120111785147245], [0.2359801995957806, 0.17454631451517882, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626522, 0.18521048307998722, 0.24180337799327972, 2.4155314944857715]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
  2.17667257e-02  2.35988888e-01  1.69486346e-01  1.22480693e-01
  2.05845307e-01  8.88723787e-02  3.77369074e-01  2.17412530e-01
  9.47746482e-02  3.43711100e-01  4.58283116e-02  1.69831406e-01
  1.52789471e-01  1.35840869e-01  2.55452715e-01  3.81613955e-01
  3.99063546e-01  1.64374640e-01  3.32526109e-01 -1.28841342e-02
  3.41174882e-01  4.27342825e-01  9.60558839e-02  2.90891459e-01
  1.87647794e-01  1.43162203e-01 -1.93328239e-01  4.77217533e-01
  3.42479558e-01  3.58951543e-01  1.72823565e-01  1.84886145e-01
  8.99032858e-02  6.07586023e-01  3.88476546e-02  2.60105419e-01
  2.54445159e-01  4.90434536e-01  4.37961350e-01 -5.61842522e-02
 -1.25130395e-01  1.37663914e-01  6.17736090e-02  2.26247796e-01
  6.65207092e-01  7.21089861e-01  2.78961289e-01  1.58502421e-01
  6.37836773e-01 -1.28755991e-02 -1.51177373e-02  4.51700161e-01
 -2.40283018e-02 -1.84183037e-02 -4.27463084e-02  8.18167145e-01
  3.09414697e-01 -4.63432849e-02  6.68470689e-01  2.19699952e-01
  3.12153100e+00  3.18949221e+00  3.34871284e+00  3.02294267e+00
  3.13458693e+00  2.96780074e+00  3.37332835e+00  3.18010199e+00
  2.93020160e+00  3.04598586e+00  2.91185206e+00  3.05671559e+00
  3.20682264e+00  3.13033010e+00  3.09974342e+00  2.82701786e+00
  3.32140757e+00  2.92374194e+00  3.17057486e+00  2.82365216e+00]

approx error on V for Validation Data after updating V  [[0.23399029277961225, 0.3176322143187879, 0.9750657754550233, 0.8386835253310766, 0.2517734533343107], [0.23279130874330747, 0.2532403830116759, 0.3255196536083326, 0.3009350317044362, 2.9393271195314403]]

overlaps  [[0, 0, 0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.04444444444444444]
MIN_overlap  [0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111]
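The overlap values above are all multiples of 1/9, which suggests each subset holds 9 example indices and overlap is counted as shared indices divided by 9. A minimal sketch under that assumption (the function name and subset contents are hypothetical, not taken from the actual code):

```python
def pairwise_overlap(new_subset, existing_subsets, k=9):
    """Fraction of indices the new subset shares with each existing subset."""
    new = set(new_subset)
    return [len(new & set(s)) / k for s in existing_subsets]

# Illustrative subsets of k=9 training-example indices.
subsets = [[0, 1, 2, 3, 4, 5, 6, 7, 8],
           [8, 9, 10, 11, 12, 13, 14, 15, 16],
           [0, 20, 21, 22, 23, 24, 25, 26, 27]]

ov = pairwise_overlap(subsets[2], subsets[:2])   # shares index 0 with subset 0 only
avg_ov, min_ov, max_ov = sum(ov) / len(ov), min(ov), max(ov)
```

Under this reading, AVG/MIN/MAX_overlap are just these statistics collected once per round.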

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [-1.07691633e-14 -2.27595720e-15  3.55271368e-15 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]
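The near-machine-epsilon entries of `alpha` (around 1e-14) are what a least-squares solve typically leaves where the true coefficient is zero. A hedged sketch of how a (1780,) `alpha` could be fit from the (300, 1780) matrix `W_V_val` and the length-300 0/1 loss vector — the random data here is purely illustrative, and the solver choice is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
W_V_val = rng.random((300, 1780))                      # stand-in for the real matrix
llm_loss = rng.integers(0, 2, size=300).astype(float)  # stand-in 0/1 losses

# Solve W_V_val @ alpha ≈ llm_loss in the least-squares sense; with more
# unknowns (1780) than equations (300), many coefficients come out ~0.
alpha, *_ = np.linalg.lstsq(W_V_val, llm_loss, rcond=None)
```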

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1]

approximation 
 [ 0.29460647  0.26154552  0.15166271  0.06404772  0.27632569  0.26214027
  0.0662502   0.36358289 -0.11946059  0.0464381   0.01854094  0.04855094
  0.18748532  0.23275562  0.09336656  0.14861169 -0.07188396  0.26429701
  0.25695328  0.02928896 -0.02425904 -0.03669311  0.03753481 -0.04579261
  0.1755991   0.11111902  0.12441864  0.09863437  0.04408134 -0.00319092
  0.10785693  0.14378612  0.15423443  0.22149296  0.18858163  0.24732777
  0.22918505  0.03903969  0.11364125  0.03638176  0.10516324  0.13075289
  0.16691474  0.15845844  0.21028699  0.1396813   0.19539142  0.18311723
  0.14920677  0.11135566  0.13856716  0.09071406  0.15852427  0.20193421
  0.10971759  0.1307273   0.07389024  0.17125954  0.1539964   0.19532083
  0.06688485  0.16907676  0.26326138 -0.13421286 -0.1438237   0.09592797
  0.11473727  0.49659146 -0.21825259 -0.29328978  0.24483943  0.31037272
  0.15512905  0.23838402  0.0052727   0.94296408  0.05530935  0.12731877
  0.49425189  0.01364925  0.19993214  0.13238042  0.09029695  0.15918017
  0.04425801  0.21528888  0.12638944  0.12742227  0.16109531  0.11095623
  0.20162665  0.13362489  0.1379822   0.19320376  0.21949194  0.14285242
  0.2348726   0.0165588   0.20327844  0.08712908  0.2599169   0.26482987
  0.15237018 -0.07610492  0.32128748  0.36857636  0.15807833  0.57481906
  0.11623678  0.07355014  0.31163284 -0.03141206  0.2228866   0.19222232
 -0.38986728  0.29224754  0.25772187  0.06850257  0.60738344  0.15623396
  0.31872549  0.13011788  0.38980602  0.39711702  0.10857213  0.15323948
  0.02470441  0.37981967 -0.10668708 -0.12461268  0.01400579  0.21250035
  0.21723868  0.14090335 -0.06274862  0.35744731  0.04840233 -0.00507772
  0.19521858  0.11260778 -0.00325429  0.23126801  0.28561181 -0.20036265
  0.14120207 -0.0422207   0.10848508  0.01291519  0.08525737  0.23445717
  0.21546837  0.03170333 -0.0172853   0.18555447  0.07627468  0.41538901
  0.20463698 -0.20461157  0.22940093 -0.06843056  0.17258133  0.1685129
 -0.11660599  0.00144947  0.32458452  0.03476273  0.22895491  0.37259952
  0.25066319 -0.00968936 -0.01847349  0.13660902  0.24002392  0.17061751
 -0.16412518  0.16563424  0.17897402  0.39472868  0.28179192  0.04473702
  0.2898678  -0.01229836  0.07151382 -0.06867019  0.60629556  0.44177721
  0.18724027  0.77823022  0.43587023  0.03802579  0.67662492 -0.07182097
  0.20703016  0.36218308  0.01568467  0.25143228  0.05023912  0.18077144
  0.06239029  0.3369741 ]

approx error on U on val data  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.23631374392237953, 0.24953571779089506, 0.27812678886179165, 0.231729063686265, 0.18605605800596603], [0.2359801995957806, 0.17454631451517882, 0.28965629609284577, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626528, 0.18521048307998722, 0.2418033779932797], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 0.27373550017628107]]
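The per-subset errors of roughly 0.17–0.29 are consistent with a mean absolute gap between the 0/1 LLM loss and the real-valued approximation printed above. A sketch under that assumption (all numbers here are made up for illustration):

```python
import numpy as np

llm_loss = np.array([0, 1, 0, 0, 1])               # 0/1 loss on 5 val points
approx = np.array([0.21, 0.83, 0.10, 0.05, 0.70])  # approximation scores

# One "approx error" entry per subset: mean |0/1 loss - approximation|.
approx_error = float(np.mean(np.abs(llm_loss - approx)))
```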

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
 -8.18964730e-03  2.23412999e-01  8.40408620e-02  6.66026354e-02
  2.01848981e-01  7.66348409e-02  3.52346536e-01  2.30140100e-01
 -3.49897168e-02  3.46138069e-01 -2.06833112e-01  1.60886773e-01
  9.90572821e-02 -1.22816796e-02  2.37940941e-01  3.36168545e-01
  3.69262779e-01  5.23195853e-02  3.83718621e-01 -8.17969926e-02
  3.41174882e-01  4.27342825e-01  9.60558839e-02  2.90891459e-01
  1.87647794e-01  1.43162203e-01 -1.93328239e-01  4.77217533e-01
  3.42479558e-01  3.58951543e-01  1.72823565e-01  1.84886145e-01
  8.99032858e-02  6.07586023e-01  3.88476546e-02  2.60105419e-01
  2.54445159e-01  4.90434536e-01  4.37961350e-01 -5.61842522e-02
  4.69172899e-02  1.12531135e-01  2.51906234e-01  2.27303786e-01
  4.42594320e-01  3.67878111e-01  2.41234922e-01  1.42938811e-01
  2.71570660e-01  1.43754386e-01 -4.78895460e-02  4.42962469e-01
  2.47447647e-01  2.18490960e-01  2.36476008e-01  6.63170507e-01
  4.18684120e-01  1.08931876e-01  4.49261462e-01  9.14574355e-02
 -4.41639607e-02  2.20889435e-01  1.00238376e-01 -5.19048440e-02
  6.73115685e-02  3.32964145e-01  4.20749825e-01  4.87195350e-01
  2.01226009e-01  2.02634468e-01  4.12733823e-02 -1.04697161e-01
  2.81608203e-02  2.53742618e-01 -1.82382433e-01  1.24575046e-01
  2.30995508e-01  3.54613855e-02  4.76892199e-01  7.19137772e-02]

approx error on V on Val data  [[0.3005124651610089, 0.24023287340261276, 0.23399029277961225, 0.25178130044886216, 0.22510235601642412], [0.23279130874330747, 0.3255196536083326, 0.21561693928679454, 0.2281330790423822, 0.22882604559648972], [0.23279130874329965, 0.23322780833832102, 0.32551965360833346, 0.32449813576447795, 0.22510235601641848]]

predicting:   0%|          | 0/1 [00:00<?, ?it/s]
predicting: 100%|██████████| 1/1 [01:08<00:00, 68.20s/it]

Make new V by taking top v highest loss subsets from L \ U
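The step announced above — take the top v highest-loss subsets from L \ U — can be sketched as a simple ranked selection; the helper name and loss values are hypothetical:

```python
def make_new_V(L, U, loss, v):
    """Rank subsets in L not already in U by loss, descending; keep top v."""
    candidates = [s for s in L if s not in U]
    return sorted(candidates, key=lambda s: loss[s], reverse=True)[:v]

loss = {0: 0.1, 1: 0.4, 2: 0.3, 3: 0.2, 4: 0.5}
V = make_new_V(L=[0, 1, 2, 3, 4], U=[0], loss=loss, v=2)
```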

predicting:   0%|          | 0/5 [00:00<?, ?it/s]
predicting:  20%|██        | 1/5 [01:06<04:27, 66.88s/it]
predicting:  40%|████      | 2/5 [02:12<03:19, 66.38s/it]
predicting:  60%|██████    | 3/5 [03:14<02:08, 64.40s/it]
predicting:  80%|████████  | 4/5 [04:33<01:10, 70.15s/it]
predicting: 100%|██████████| 5/5 [05:41<00:00, 69.30s/it]
predicting: 100%|██████████| 5/5 [05:41<00:00, 68.34s/it]

***********************************
S_worst_ind  9
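`S_worst_ind` looks like an argmax over per-subset validation losses — the index of the worst-performing subset, chosen for replacement. A sketch with illustrative losses:

```python
# Per-subset mean 0/1 loss on the validation data (illustrative values).
losses_on_V = [0.15, 0.20, 0.25, 0.25, 0.15, 0.10, 0.20, 0.15, 0.10, 0.30]

# Pick the worst-performing subset to swap out next round.
S_worst_ind = max(range(len(losses_on_V)), key=lambda i: losses_on_V[i])
```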

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

AVG_LLM_loss_on_VAL_data  [0.15499999999999997, 0.15, 0.15499999999999997, 0.14499999999999996]

MIN_LLM_loss_on_VAL_data  [0.1, 0.1, 0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.2, 0.25, 0.2]
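The AVG/MIN/MAX lines above read as statistics over the per-round mean 0/1 loss on the 20 validation examples (e.g. 4 mistakes out of 20 gives 0.2). A small sketch under that reading, with made-up rounds:

```python
# One 20-example validation run per round: 0/1 losses per round.
llm_loss_rounds = [[0] * 16 + [1] * 4,   # round 1: 4 mistakes -> mean 0.20
                   [0] * 17 + [1] * 3]   # round 2: 3 mistakes -> mean 0.15

means = [sum(r) / len(r) for r in llm_loss_rounds]
avg_loss, min_loss, max_loss = sum(means) / len(means), min(means), max(means)
```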

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.2, 0.15, 0.15, 0.15, 0.15], [0.15, 0.25, 0.15, 0.15, 0.25], [0.15, 0.15, 0.25, 0.25, 0.15], [0.25, 0.15, 0.2, 0.2, 0.2]]

AVG_LLM_loss_on_VAL_data  [0.16, 0.19, 0.19, 0.2]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.2, 0.25, 0.25, 0.25]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.29460647  0.26154552  0.15166271  0.06404772  0.27632569  0.26214027
  0.0662502   0.36358289 -0.11946059  0.0464381   0.01854094  0.04855094
  0.18748532  0.23275562  0.09336656  0.14861169 -0.07188396  0.26429701
  0.25695328  0.02928896 -0.02425904 -0.03669311  0.03753481 -0.04579261
  0.1755991   0.11111902  0.12441864  0.09863437  0.04408134 -0.00319092
  0.10785693  0.14378612  0.15423443  0.22149296  0.18858163  0.24732777
  0.22918505  0.03903969  0.11364125  0.03638176  0.10516324  0.13075289
  0.16691474  0.15845844  0.21028699  0.1396813   0.19539142  0.18311723
  0.14920677  0.11135566  0.13856716  0.09071406  0.15852427  0.20193421
  0.10971759  0.1307273   0.07389024  0.17125954  0.1539964   0.19532083
  0.06688485  0.16907676  0.26326138 -0.13421286 -0.1438237   0.09592797
  0.11473727  0.49659146 -0.21825259 -0.29328978  0.24483943  0.31037272
  0.15512905  0.23838402  0.0052727   0.94296408  0.05530935  0.12731877
  0.49425189  0.01364925  0.19993214  0.13238042  0.09029695  0.15918017
  0.04425801  0.21528888  0.12638944  0.12742227  0.16109531  0.11095623
  0.20162665  0.13362489  0.1379822   0.19320376  0.21949194  0.14285242
  0.2348726   0.0165588   0.20327844  0.08712908  0.2599169   0.26482987
  0.15237018 -0.07610492  0.32128748  0.36857636  0.15807833  0.57481906
  0.11623678  0.07355014  0.31163284 -0.03141206  0.2228866   0.19222232
 -0.38986728  0.29224754  0.25772187  0.06850257  0.60738344  0.15623396
  0.31872549  0.13011788  0.38980602  0.39711702  0.10857213  0.15323948
  0.02470441  0.37981967 -0.10668708 -0.12461268  0.01400579  0.21250035
  0.21723868  0.14090335 -0.06274862  0.35744731  0.04840233 -0.00507772
  0.19521858  0.11260778 -0.00325429  0.23126801  0.28561181 -0.20036265
  0.14120207 -0.0422207   0.10848508  0.01291519  0.08525737  0.23445717
  0.21546837  0.03170333 -0.0172853   0.18555447  0.07627468  0.41538901
  0.20463698 -0.20461157  0.22940093 -0.06843056  0.17258133  0.1685129
 -0.11660599  0.00144947  0.32458452  0.03476273  0.22895491  0.37259952
  0.25066319 -0.00968936 -0.01847349  0.13660902  0.24002392  0.17061751
 -0.16412518  0.16563424  0.17897402  0.39472868  0.28179192  0.04473702
 -1.07577988 -1.02413272 -1.01911671 -0.9880703  -1.02636203 -0.91689274
 -0.99576868 -1.05958247 -0.85865823 -1.0393448  -0.96775726 -1.00375859
 -0.98627499 -1.01808174 -1.08617059 -0.84593926 -1.03035409 -0.96641743
 -0.92786564 -0.94328147]

approx error on U for Validation Data after updating U  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.24953571779089506, 0.27812678886179165, 0.23172906368626495, 0.18605605800596603, 3.4120111785147245], [0.2359801995957806, 0.17454631451517882, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626522, 0.18521048307998722, 0.24180337799327972, 2.4155314944857715], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 1.1389804813710553]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.34117488  0.42734282  0.09605588  0.29089146  0.18764779  0.1431622
 -0.19332824  0.47721753  0.34247956  0.35895154  0.17282356  0.18488614
  0.08990329  0.60758602  0.03884765  0.26010542  0.25444516  0.49043454
  0.43796135 -0.05618425  1.13679373  1.16154372  1.21952842  1.10089
  1.14154842  1.08080851  1.22849285  1.15812401  1.06711572  1.10928183
  1.06043322  1.11318937  1.16785509  1.13999817  1.12885917  1.02953844
  1.20958443  1.06476325  1.15465443  1.02831273  0.04691729  0.11253113
  0.25190623  0.22730379  0.44259432  0.36787811  0.24123492  0.14293881
  0.27157066  0.14375439 -0.04788955  0.44296247  0.24744765  0.21849096
  0.23647601  0.66317051  0.41868412  0.10893188  0.44926146  0.09145744
  1.11152363  1.07202591  0.9793857   0.99959329  0.93456756  1.03885093
  0.97277164  1.04034898  0.86540629  1.05914044  0.89237209  1.0321424
  0.90443808  0.94266427  1.05598434  0.90180893  1.05534998  0.89774997
  0.9576291   0.77879302  4.59343723  4.6984694   4.47180179  4.86409533
  4.8002235   4.71415386  4.53748681  4.7999801   4.61874989  4.58398277
  5.03012435  4.35087015  4.63377668  4.62917138  4.54003636  4.17923561
  5.06427104  4.52591518  4.73591647  4.27348336]

approx error on V for Validation Data after updating V  [[0.23399029277961225, 0.3176322143187879, 0.9750657754550233, 0.8386835253310766, 0.2517734533343107], [0.23279130874330747, 0.2532403830116759, 0.3255196536083326, 0.3009350317044362, 2.9393271195314403], [0.32551965360833346, 0.975065775455022, 0.2888735743383605, 0.7886835253310979, 4.432259062249502]]

overlaps  [[0, 0, 0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444]
MIN_overlap  [0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111]

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [-7.74380560e-15 -2.45359288e-14  4.88498131e-15 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.29474752  0.26140551  0.15166997  0.06417374  0.27633096  0.26217265
  0.06625663  0.36374912 -0.1193932   0.04675635  0.0184437   0.04871911
  0.18726002  0.23265192  0.0933486   0.1485423  -0.07197573  0.26426549
  0.25663843  0.02931355 -0.02425904 -0.03669311  0.03753481 -0.04579261
  0.1755991   0.11111902  0.12441864  0.09863437  0.04408134 -0.00319092
  0.10785693  0.14378612  0.15423443  0.22149296  0.18858163  0.24732777
  0.22918505  0.03903969  0.11364125  0.03638176  0.11189787  0.1355586
  0.17084802  0.15397057  0.20374802  0.14889881  0.19230323  0.18992957
  0.14332038  0.11546806  0.13476083  0.09670474  0.15521331  0.19284344
  0.11380536  0.12605613  0.07875329  0.16290015  0.14998079  0.19671071
  0.06688485  0.16907676  0.26326138 -0.13421286 -0.1438237   0.09592797
  0.11473727  0.49659146 -0.21825259 -0.29328978  0.24483943  0.31037272
  0.15512905  0.23838402  0.0052727   0.94296408  0.05530935  0.12731877
  0.49425189  0.01364925  0.28008694  0.0922511   0.01962456  0.17257349
 -0.05984917  0.22826001  0.12354377  0.16248567  0.13909538  0.16980159
  0.21351702  0.13432133  0.11163518  0.28585696  0.26925594  0.13451707
  0.28778362 -0.05143908  0.2221561  -0.00375761  0.2599169   0.26482987
  0.15237018 -0.07610492  0.32128748  0.36857636  0.15807833  0.57481906
  0.11623678  0.07355014  0.31163284 -0.03141206  0.2228866   0.19222232
 -0.38986728  0.29224754  0.25772187  0.06850257  0.60738344  0.15623396
  0.31872549  0.13011788  0.38980602  0.39711702  0.10857213  0.15323948
  0.02470441  0.37981967 -0.10668708 -0.12461268  0.01400579  0.21250035
  0.21723868  0.14090335 -0.06274862  0.35744731  0.04840233 -0.00507772
  0.19521858  0.11260778 -0.00323819  0.23120138  0.28559248 -0.20045119
  0.14121269 -0.04217215  0.10853526  0.01282642  0.08522231  0.23444708
  0.21545305  0.03184983 -0.01721139  0.1855639   0.07638285  0.41542744
  0.20470054 -0.20467531  0.22929592 -0.06848316  0.10942937  0.15483544
 -0.09717226  0.05110297  0.28383136  0.07677064  0.19577427  0.34981491
  0.2475052  -0.0599725  -0.00494679  0.11951154  0.22240687  0.18912904
 -0.05900785  0.13325213  0.16286669  0.37910789  0.30192719  0.12156679
  0.16164991  0.29544086  0.13775563  0.13112962  0.17278665  0.18080577
  0.26224149  0.32787445  0.14453673  0.23374154  0.00760294 -0.08202858
  0.02050807  0.07181483 -0.11142615  0.05341139  0.28816602  0.1903859
  0.3577087   0.01370444]

approx error on U on val data  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.23631374392237953, 0.24953571779089506, 0.27812678886179165, 0.231729063686265, 0.18605605800596603], [0.2359801995957806, 0.17454631451517882, 0.28965629609284577, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626528, 0.18521048307998722, 0.2418033779932797], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 0.27373550017628107], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.278126788861791, 0.23172906368626486, 0.1852247909918996, 0.23749716158177522, 0.23833652926205703]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.34117488  0.42734282  0.09605588  0.29089146  0.18764779  0.1431622
 -0.19332824  0.47721753  0.34247956  0.35895154  0.17282356  0.18488614
  0.08990329  0.60758602  0.03884765  0.26010542  0.25444516  0.49043454
  0.43796135 -0.05618425 -0.02062577  0.23284108  0.10017538  0.01781548
  0.0839879   0.10202314  0.24593054  0.267743    0.19868087  0.00645207
  0.18667517  0.04811472  0.06626106  0.00120838  0.00486321  0.06087003
  0.4054603   0.1039967   0.50654775  0.32238782 -0.13931926 -0.04365052
  0.04413684  0.15553945  0.59433572  0.34540291  0.16596669  0.05193822
  0.38569279 -0.00559306  0.14610908  0.34030997 -0.03494586  0.11750617
  0.09332366  0.77539562  0.40580988 -0.07045674  0.54327488  0.19745645
  0.18702938  0.2791499   0.20069631  0.24092122  0.15402713  0.47439562
  0.45650754  0.46194482  0.44240588  0.0724774  -0.02837153 -0.0664892
  0.00728383 -0.13503514  0.08784259  0.4510652   0.29563227 -0.33721351
  0.60872674  0.06494805  0.41963103  0.35486665  0.34755394  0.12646061
  0.66822045  0.19372469  0.16839249  0.77327828  0.23837812  0.15024023
  0.06944573 -0.02876462  0.02659596  0.22670189 -0.15309126  0.29583235
  0.02389544  0.01921349  0.06565705 -0.02159279]

approx error on V on Val data  [[0.3005124651610089, 0.24023287340261276, 0.23399029277961225, 0.25178130044886216, 0.22510235601642412], [0.23279130874330747, 0.3255196536083326, 0.21561693928679454, 0.2281330790423822, 0.22882604559648972], [0.23279130874329965, 0.23322780833832102, 0.32551965360833346, 0.32449813576447795, 0.22510235601641848], [0.3255196536083328, 0.21561693928679473, 0.2363137439223792, 0.2529949237066617, 0.23827803939343753]]

predicting:   0%|          | 0/1 [00:00<?, ?it/s]
predicting: 100%|██████████| 1/1 [01:02<00:00, 62.43s/it]

Make new V by taking top v highest loss subsets from L \ U

predicting:   0%|          | 0/5 [00:00<?, ?it/s]
predicting:  20%|██        | 1/5 [01:09<04:39, 69.75s/it]
predicting:  40%|████      | 2/5 [02:19<03:29, 69.76s/it]
predicting:  60%|██████    | 3/5 [03:16<02:07, 63.88s/it]
predicting:  80%|████████  | 4/5 [04:28<01:07, 67.17s/it]
predicting: 100%|██████████| 5/5 [05:37<00:00, 67.89s/it]
predicting: 100%|██████████| 5/5 [05:37<00:00, 67.56s/it]

***********************************
S_worst_ind  5

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0]

AVG_LLM_loss_on_VAL_data  [0.15499999999999997, 0.15, 0.15499999999999997, 0.14499999999999996, 0.145]

MIN_LLM_loss_on_VAL_data  [0.1, 0.1, 0.1, 0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.2, 0.25, 0.2, 0.2]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.2, 0.15, 0.15, 0.15, 0.15], [0.15, 0.25, 0.15, 0.15, 0.25], [0.15, 0.15, 0.25, 0.25, 0.15], [0.25, 0.15, 0.2, 0.2, 0.2], [0.15, 0.2, 0.25, 0.25, 0.15]]

AVG_LLM_loss_on_VAL_data  [0.16, 0.19, 0.19, 0.2, 0.2]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.2, 0.25, 0.25, 0.25, 0.25]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0]

approximation 
 [ 2.94747521e-01  2.61405515e-01  1.51669975e-01  6.41737428e-02
  2.76330960e-01  2.62172649e-01  6.62566332e-02  3.63749125e-01
 -1.19393196e-01  4.67563512e-02  1.84436985e-02  4.87191072e-02
  1.87260024e-01  2.32651924e-01  9.33486039e-02  1.48542298e-01
 -7.19757320e-02  2.64265495e-01  2.56638428e-01  2.93135472e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  1.11897868e-01  1.35558598e-01  1.70848022e-01  1.53970567e-01
  2.03748025e-01  1.48898814e-01  1.92303234e-01  1.89929565e-01
  1.43320385e-01  1.15468064e-01  1.34760828e-01  9.67047448e-02
  1.55213314e-01  1.92843440e-01  1.13805363e-01  1.26056127e-01
  7.87532901e-02  1.62900147e-01  1.49980791e-01  1.96710707e-01
  6.68848479e-02  1.69076763e-01  2.63261380e-01 -1.34212859e-01
 -1.43823703e-01  9.59279705e-02  1.14737274e-01  4.96591460e-01
 -2.18252586e-01 -2.93289775e-01  2.44839428e-01  3.10372718e-01
  1.55129046e-01  2.38384017e-01  5.27269979e-03  9.42964078e-01
  5.53093472e-02  1.27318774e-01  4.94251886e-01  1.36492500e-02
  2.80086944e-01  9.22510997e-02  1.96245600e-02  1.72573491e-01
 -5.98491672e-02  2.28260012e-01  1.23543774e-01  1.62485667e-01
  1.39095376e-01  1.69801590e-01  2.13517020e-01  1.34321327e-01
  1.11635176e-01  2.85856962e-01  2.69255936e-01  1.34517072e-01
  2.87783618e-01 -5.14390810e-02  2.22156098e-01 -3.75760959e-03
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.23819461e-03  2.31201379e-01  2.85592481e-01 -2.00451192e-01
  1.41212692e-01 -4.21721476e-02  1.08535257e-01  1.28264176e-02
  8.52223089e-02  2.34447083e-01  2.15453045e-01  3.18498303e-02
 -1.72113900e-02  1.85563898e-01  7.63828484e-02  4.15427438e-01
  2.04700538e-01 -2.04675314e-01  2.29295924e-01 -6.84831643e-02
  1.09429365e-01  1.54835439e-01 -9.71722649e-02  5.11029654e-02
  2.83831358e-01  7.67706421e-02  1.95774270e-01  3.49814910e-01
  2.47505204e-01 -5.99725004e-02 -4.94678724e-03  1.19511540e-01
  2.22406866e-01  1.89129043e-01 -5.90078547e-02  1.33252132e-01
  1.62866686e-01  3.79107889e-01  3.01927195e-01  1.21566795e-01
  1.61649912e-01  2.95440858e-01  1.37755627e-01  1.31129624e-01
  1.72786645e-01  1.80805768e-01  2.62241486e-01  3.27874447e-01
  1.44536731e-01  2.33741542e-01  7.60294215e-03 -8.20285806e-02
  2.05080658e-02  7.18148304e-02 -1.11426149e-01  5.34113937e-02
  2.88166020e-01  1.90385902e-01  3.57708700e-01  1.37044433e-02
 -3.79920517e+00 -3.88635293e+00 -3.83877424e+00 -3.70109074e+00
 -3.63190090e+00 -3.59658092e+00 -3.64201433e+00 -3.74709848e+00
 -3.34402897e+00 -3.80840744e+00 -3.49183379e+00 -3.64957609e+00
 -3.92646255e+00 -3.79172563e+00 -3.69745210e+00 -3.44879456e+00
 -3.92709837e+00 -3.52297192e+00 -3.58202300e+00 -2.98106257e+00]

approx error on U for Validation Data after updating U  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.24953571779089506, 0.27812678886179165, 0.23172906368626495, 0.18605605800596603, 3.4120111785147245], [0.2359801995957806, 0.17454631451517882, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626522, 0.18521048307998722, 0.24180337799327972, 2.4155314944857715], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 1.1389804813710553], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.231729063686265, 0.1852247909918996, 0.23749716158177522, 0.2383365292620571, 3.850722734294883]]
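The per-subset "approx error" values above sit in a range consistent with a root-mean-squared gap between the real-valued approximation and the 0/1 LLM loss over each 20-example validation block. This is a speculative sketch: the RMSE metric, the `block_rmse` helper, and the block size of 20 are assumptions, not confirmed by the log.

```python
import numpy as np

def block_rmse(approx, loss01, block=20):
    """RMSE between a real-valued approximation and a 0/1 loss vector,
    computed per block of `block` validation examples (one block per
    subset). Metric is an assumption inferred from the log's magnitudes."""
    a = np.asarray(approx, dtype=float).reshape(-1, block)
    y = np.asarray(loss01, dtype=float).reshape(-1, block)
    return np.sqrt(((a - y) ** 2).mean(axis=1))

# Two toy blocks: one with a single miss (loss 1), one all correct.
errs = block_rmse([0.1] * 40, [0] * 19 + [1] + [0] * 20)
```

With mostly-zero losses and small approximations, the per-block values land near 0.1–0.25, the same order as the errors printed above.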

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.18702938  0.2791499   0.20069631  0.24092122  0.15402713  0.47439562
  0.45650754  0.46194482  0.44240588  0.0724774  -0.02837153 -0.0664892
  0.00728383 -0.13503514  0.08784259  0.4510652   0.29563227 -0.33721351
  0.60872674  0.06494805  0.41963103  0.35486665  0.34755394  0.12646061
  0.66822045  0.19372469  0.16839249  0.77327828  0.23837812  0.15024023
  0.06944573 -0.02876462  0.02659596  0.22670189 -0.15309126  0.29583235
  0.02389544  0.01921349  0.06565705 -0.02159279 -0.13931926 -0.04365052
  0.04413684  0.15553945  0.59433572  0.34540291  0.16596669  0.05193822
  0.38569279 -0.00559306  0.14610908  0.34030997 -0.03494586  0.11750617
  0.09332366  0.77539562  0.40580988 -0.07045674  0.54327488  0.19745645
  0.34117488  0.42734282  0.09605588  0.29089146  0.18764779  0.1431622
 -0.19332824  0.47721753  0.34247956  0.35895154  0.17282356  0.18488614
  0.08990329  0.60758602  0.03884765  0.26010542  0.25444516  0.49043454
  0.43796135 -0.05618425  2.91891798  2.99660645  3.1567718   2.83684892
  2.9412813   2.79511262  3.1857847   2.98053961  2.7684814   2.85023505
  2.72958409  2.86766711  3.02106703  2.93858399  2.89517342  2.66769316
  3.12735008  2.74172631  2.9958201   2.64599397]

approx error on V for Validation Data after updating V  [[0.23399029277961225, 0.3176322143187879, 0.9750657754550233, 0.8386835253310766, 0.2517734533343107], [0.23279130874330747, 0.2532403830116759, 0.3255196536083326, 0.3009350317044362, 2.9393271195314403], [0.32551965360833346, 0.975065775455022, 0.2888735743383605, 0.7886835253310979, 4.432259062249502], [0.25043448591242407, 0.23827803939343753, 0.25177345333431034, 0.3255196536083328, 2.7530619541388135]]

overlaps  [[0, 0, 0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1, 0, 0], [0, 0, 0, 0, 0, 0, 1, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.04444444444444444]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111]
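The overlap statistics above are all multiples of 1/9, which is consistent with counting shared example indices between two 9-element subsets and dividing by the subset size. A minimal sketch under that reading (the `pairwise_overlap` helper and list-of-index-lists representation are assumptions):

```python
import numpy as np

def pairwise_overlap(subsets):
    """Fraction of shared indices between each subset and every other.

    `subsets` is a list of equal-length index lists; the overlap of a
    pair is |intersection| / subset_size, so 9-element subsets yield
    multiples of 1/9, matching the log values."""
    k = len(subsets[0])
    overlaps = []
    for i, a in enumerate(subsets):
        row = [len(set(a) & set(b)) / k
               for j, b in enumerate(subsets) if i != j]
        overlaps.append(row)
    return overlaps

subs = [[0, 1, 2], [2, 3, 4], [5, 6, 7]]
avg_overlap = [sum(row) / len(row) for row in pairwise_overlap(subs)]
```

The AVG/MIN/MAX lines would then be the mean, min, and max of each such row.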

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [ 8.77076189e-15 -6.88338275e-15  1.33226763e-15 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]
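`alpha` carries one coefficient per training example (shape `(1780,)`) and its printed entries are near machine epsilon, which is consistent with a minimum-norm least-squares solve of `W_V_val @ alpha ≈ loss`. A hedged sketch with toy shapes (the lstsq formulation itself is an assumption; variable names follow the log):

```python
import numpy as np

# Toy stand-ins for the (300, 1780) matrix `W_V_val` and the 300-long
# 0/1 loss vector from the log above.
rng = np.random.default_rng(0)
W_V_val = rng.normal(size=(6, 10))           # rows: predictions, cols: train examples
llm_loss = rng.integers(0, 2, size=6).astype(float)

# Minimum-norm least-squares fit: with far more columns than rows the
# system is underdetermined, the residual is driven to ~0, and many
# coefficients stay tiny -- matching the near-zero `alpha` printed above.
alpha, *_ = np.linalg.lstsq(W_V_val, llm_loss, rcond=None)
```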

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0]

approximation 
 [ 2.94639028e-01  2.61513208e-01  1.51664388e-01  6.40768062e-02
  2.76326904e-01  2.62147741e-01  6.62516853e-02  3.63621255e-01
 -1.19445038e-01  4.65115540e-02  1.85184953e-02  4.85897522e-02
  1.87433318e-01  2.32731686e-01  9.33624135e-02  1.48595674e-01
 -7.19051433e-02  2.64289738e-01  2.56880608e-01  2.92946313e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  1.11897868e-01  1.35558598e-01  1.70848022e-01  1.53970567e-01
  2.03748025e-01  1.48898814e-01  1.92303234e-01  1.89929565e-01
  1.43320385e-01  1.15468064e-01  1.34760828e-01  9.67047448e-02
  1.55213314e-01  1.92843440e-01  1.13805363e-01  1.26056127e-01
  7.87532901e-02  1.62900147e-01  1.49980791e-01  1.96710707e-01
  6.68848479e-02  1.69076763e-01  2.63261380e-01 -1.34212859e-01
 -1.43823703e-01  9.59279705e-02  1.14737274e-01  4.96591460e-01
 -2.18252586e-01 -2.93289775e-01  2.44839428e-01  3.10372718e-01
  1.55129046e-01  2.38384017e-01  5.27269979e-03  9.42964078e-01
  5.53093472e-02  1.27318774e-01  4.94251886e-01  1.36492500e-02
  2.18431896e-01  1.23118561e-01  7.39857366e-02  1.62271357e-01
  2.02300365e-02  2.18282623e-01  1.25732661e-01  1.35514912e-01
  1.56017717e-01  1.24537761e-01  2.04370955e-01  1.33785628e-01
  1.31901294e-01  2.14588152e-01  2.30977484e-01  1.40928622e-01
  2.47084483e-01  8.64868734e-04  2.07635408e-01  6.61524035e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.23819461e-03  2.31201379e-01  2.85592481e-01 -2.00451192e-01
  1.41212692e-01 -4.21721476e-02  1.08535257e-01  1.28264176e-02
  8.52223089e-02  2.34447083e-01  2.15453045e-01  3.18498303e-02
 -1.72113900e-02  1.85563898e-01  7.63828484e-02  4.15427438e-01
  2.04700538e-01 -2.04675314e-01  2.29295924e-01 -6.84831643e-02
  1.09429365e-01  1.54835439e-01 -9.71722649e-02  5.11029654e-02
  2.83831358e-01  7.67706421e-02  1.95774270e-01  3.49814910e-01
  2.47505204e-01 -5.99725004e-02 -4.94678724e-03  1.19511540e-01
  2.22406866e-01  1.89129043e-01 -5.90078547e-02  1.33252132e-01
  1.62866686e-01  3.79107889e-01  3.01927195e-01  1.21566795e-01
  1.28782397e-01  2.93970598e-01  1.48404110e-01  1.37539864e-01
  1.71190730e-01  2.06882995e-01  2.71069494e-01  2.89404929e-01
  1.73958511e-01  2.27653863e-01  8.91941700e-03 -8.22813263e-02
  3.22251156e-02  6.14419831e-02 -1.27343473e-01  7.63446129e-02
  2.99336844e-01  1.79201504e-01  3.76757819e-01 -1.88562165e-02
  2.68220018e-01  2.43523275e-01  3.35641571e-01 -1.49349034e-01
  4.00252257e-01  3.50512152e-01  2.72954182e-01  3.99695455e-01
  2.77836932e-01  3.33037008e-01  2.39042228e-01  2.28795837e-02
  1.28932539e-01  4.68700592e-01 -2.48726931e-03  2.83401311e-01
 -1.53553182e-01  2.61378083e-01 -1.07795938e-02 -1.05264835e-01]

approx error on U on val data  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.23631374392237953, 0.24953571779089506, 0.27812678886179165, 0.231729063686265, 0.18605605800596603], [0.2359801995957806, 0.17454631451517882, 0.28965629609284577, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626528, 0.18521048307998722, 0.2418033779932797], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 0.27373550017628107], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.278126788861791, 0.23172906368626486, 0.1852247909918996, 0.23749716158177522, 0.23833652926205703], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 0.28016709348472546]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.06123854  0.23213717  0.19868879  0.18413946  0.11698954  0.32692662
  0.39150527  0.33008012  0.33605328  0.06673405 -0.02466423 -0.15055021
  0.01572241 -0.11839228  0.01249466  0.36029361  0.24439785 -0.25142064
  0.53934861  0.06032893  0.32393699  0.25715955  0.24116577  0.17096456
  0.58705593  0.20150143  0.11264117  0.71891726  0.42415958  0.05271725
  0.34579719 -0.17641263  0.03348623  0.27939745 -0.19938884  0.2567224
  0.14372511  0.02932912  0.09160933  0.15599326 -0.09009675  0.17567712
  0.20257329  0.22966054  0.53830568  0.74737899  0.33175624  0.15952263
  0.58536943 -0.0328586  -0.22411142  0.43395371  0.23990884  0.06614192
 -0.09210979  0.83844261  0.23570536  0.06775845  0.5431578   0.10476545
  0.34117488  0.42734282  0.09605588  0.29089146  0.18764779  0.1431622
 -0.19332824  0.47721753  0.34247956  0.35895154  0.17282356  0.18488614
  0.08990329  0.60758602  0.03884765  0.26010542  0.25444516  0.49043454
  0.43796135 -0.05618425 -0.01983035  0.21646632  0.07964779 -0.05090739
  0.07004989  0.30801812  0.40471465  0.52196656  0.17710063  0.21834012
  0.05363921 -0.11217425  0.0158257   0.26562098 -0.16155699  0.09761627
  0.22100022  0.05281949  0.45042835  0.10844107]

approx error on V on Val data  [[0.3005124651610089, 0.24023287340261276, 0.23399029277961225, 0.25178130044886216, 0.22510235601642412], [0.23279130874330747, 0.3255196536083326, 0.21561693928679454, 0.2281330790423822, 0.22882604559648972], [0.23279130874329965, 0.23322780833832102, 0.32551965360833346, 0.32449813576447795, 0.22510235601641848], [0.3255196536083328, 0.21561693928679473, 0.2363137439223792, 0.2529949237066617, 0.23827803939343753], [0.22813307904238087, 0.2746735597137788, 0.26428196002047094, 0.3255196536083333, 0.22330710034162826]]

predicting: 100%|██████████| 1/1 [00:52<00:00, 52.12s/it]

Make new V by taking top v highest loss subsets from L \ U
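The step "take top v highest loss subsets from L \ U" suggests ranking the candidate subsets by their validation loss and keeping the v worst. A minimal sketch under that reading (the `pick_new_V` helper and its inputs are hypothetical):

```python
import numpy as np

def pick_new_V(candidate_losses, v):
    """Indices of the v candidate subsets with the highest loss.

    `candidate_losses[i]` is the (e.g. average validation) loss of the
    i-th subset in L \\ U; higher loss marks a harder subset.
    """
    order = np.argsort(candidate_losses)[::-1]  # descending by loss
    return order[:v].tolist()

print(pick_new_V([0.15, 0.25, 0.10, 0.20], 2))  # -> [1, 3]
```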

predicting: 100%|██████████| 5/5 [05:24<00:00, 64.89s/it]

***********************************
S_worst_ind  9

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

AVG_LLM_loss_on_VAL_data  [0.15499999999999997, 0.15, 0.15499999999999997, 0.14499999999999996, 0.145, 0.13999999999999999]

MIN_LLM_loss_on_VAL_data  [0.1, 0.1, 0.1, 0.1, 0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.2, 0.25, 0.2, 0.2, 0.15]
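The AVG/MIN/MAX summaries above are multiples of 0.05, consistent with per-subset means over the flat 0/1 vector `LLM_loss_on_val` (length 200 = 10 subsets × 20 validation examples), then aggregated across subsets. A hypothetical reconstruction:

```python
import numpy as np

# Flat 0/1 loss vector: 10 subsets x 20 validation examples, as in
# `LLM_loss_on_val` above (layout is an assumption).
n_subsets, n_val = 10, 20
flat = np.zeros(n_subsets * n_val, dtype=int)
flat[[7, 15, 18]] = 1                       # subset 0 misses 3/20 -> 0.15

# Each subset's loss is its fraction of 1s; the log's AVG/MIN/MAX lines
# would then summarize these per-subset values.
per_subset = flat.reshape(n_subsets, n_val).mean(axis=1)
avg, lo, hi = per_subset.mean(), per_subset.min(), per_subset.max()
```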

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.2, 0.15, 0.15, 0.15, 0.15], [0.15, 0.25, 0.15, 0.15, 0.25], [0.15, 0.15, 0.25, 0.25, 0.15], [0.25, 0.15, 0.2, 0.2, 0.2], [0.15, 0.2, 0.25, 0.25, 0.15], [0.2, 0.15, 0.25, 0.2, 0.15]]

AVG_LLM_loss_on_VAL_data  [0.16, 0.19, 0.19, 0.2, 0.2, 0.19]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.2, 0.25, 0.25, 0.25, 0.25, 0.25]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.94639028e-01  2.61513208e-01  1.51664388e-01  6.40768062e-02
  2.76326904e-01  2.62147741e-01  6.62516853e-02  3.63621255e-01
 -1.19445038e-01  4.65115540e-02  1.85184953e-02  4.85897522e-02
  1.87433318e-01  2.32731686e-01  9.33624135e-02  1.48595674e-01
 -7.19051433e-02  2.64289738e-01  2.56880608e-01  2.92946313e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  1.11897868e-01  1.35558598e-01  1.70848022e-01  1.53970567e-01
  2.03748025e-01  1.48898814e-01  1.92303234e-01  1.89929565e-01
  1.43320385e-01  1.15468064e-01  1.34760828e-01  9.67047448e-02
  1.55213314e-01  1.92843440e-01  1.13805363e-01  1.26056127e-01
  7.87532901e-02  1.62900147e-01  1.49980791e-01  1.96710707e-01
  6.68848479e-02  1.69076763e-01  2.63261380e-01 -1.34212859e-01
 -1.43823703e-01  9.59279705e-02  1.14737274e-01  4.96591460e-01
 -2.18252586e-01 -2.93289775e-01  2.44839428e-01  3.10372718e-01
  1.55129046e-01  2.38384017e-01  5.27269979e-03  9.42964078e-01
  5.53093472e-02  1.27318774e-01  4.94251886e-01  1.36492500e-02
  2.18431896e-01  1.23118561e-01  7.39857366e-02  1.62271357e-01
  2.02300365e-02  2.18282623e-01  1.25732661e-01  1.35514912e-01
  1.56017717e-01  1.24537761e-01  2.04370955e-01  1.33785628e-01
  1.31901294e-01  2.14588152e-01  2.30977484e-01  1.40928622e-01
  2.47084483e-01  8.64868734e-04  2.07635408e-01  6.61524035e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.23819461e-03  2.31201379e-01  2.85592481e-01 -2.00451192e-01
  1.41212692e-01 -4.21721476e-02  1.08535257e-01  1.28264176e-02
  8.52223089e-02  2.34447083e-01  2.15453045e-01  3.18498303e-02
 -1.72113900e-02  1.85563898e-01  7.63828484e-02  4.15427438e-01
  2.04700538e-01 -2.04675314e-01  2.29295924e-01 -6.84831643e-02
  1.09429365e-01  1.54835439e-01 -9.71722649e-02  5.11029654e-02
  2.83831358e-01  7.67706421e-02  1.95774270e-01  3.49814910e-01
  2.47505204e-01 -5.99725004e-02 -4.94678724e-03  1.19511540e-01
  2.22406866e-01  1.89129043e-01 -5.90078547e-02  1.33252132e-01
  1.62866686e-01  3.79107889e-01  3.01927195e-01  1.21566795e-01
  1.28782397e-01  2.93970598e-01  1.48404110e-01  1.37539864e-01
  1.71190730e-01  2.06882995e-01  2.71069494e-01  2.89404929e-01
  1.73958511e-01  2.27653863e-01  8.91941700e-03 -8.22813263e-02
  3.22251156e-02  6.14419831e-02 -1.27343473e-01  7.63446129e-02
  2.99336844e-01  1.79201504e-01  3.76757819e-01 -1.88562165e-02
 -9.70106350e-01 -1.02660879e+00 -1.01925023e+00 -1.06773902e+00
 -9.79585399e-01 -9.77255774e-01 -9.37548946e-01 -9.47579124e-01
 -9.35110455e-01 -9.45213659e-01 -9.80219285e-01 -9.71899621e-01
 -1.01763075e+00 -9.84381378e-01 -1.06837175e+00 -8.72337225e-01
 -1.01071594e+00 -9.53718561e-01 -9.43720757e-01 -9.21649374e-01]

approx error on U for Validation Data after updating U  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.24953571779089506, 0.27812678886179165, 0.23172906368626495, 0.18605605800596603, 3.4120111785147245], [0.2359801995957806, 0.17454631451517882, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626522, 0.18521048307998722, 0.24180337799327972, 2.4155314944857715], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 1.1389804813710553], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.231729063686265, 0.1852247909918996, 0.23749716158177522, 0.2383365292620571, 3.850722734294883], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 1.1265321190341384]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.26822002  0.24352327  0.33564157 -0.14934903  0.40025226  0.35051215
  0.27295418  0.39969546  0.27783693  0.33303701  0.23904223  0.02287958
  0.12893254  0.46870059 -0.00248727  0.28340131 -0.15355318  0.26137808
 -0.01077959 -0.10526484 -0.09009675  0.17567712  0.20257329  0.22966054
  0.53830568  0.74737899  0.33175624  0.15952263  0.58536943 -0.0328586
 -0.22411142  0.43395371  0.23990884  0.06614192 -0.09210979  0.83844261
  0.23570536  0.06775845  0.5431578   0.10476545  0.34117488  0.42734282
  0.09605588  0.29089146  0.18764779  0.1431622  -0.19332824  0.47721753
  0.34247956  0.35895154  0.17282356  0.18488614  0.08990329  0.60758602
  0.03884765  0.26010542  0.25444516  0.49043454  0.43796135 -0.05618425
  0.89127113  0.91067567  0.95613694  0.86312182  0.89499891  0.8473775
  0.96316525  0.90799454  0.83664205  0.8697012   0.83140282  0.87276479
  0.91562392  0.89378348  0.88505026  0.80718064  0.94834063  0.83479766
  0.90527431  0.80621966  0.32393699  0.25715955  0.24116577  0.17096456
  0.58705593  0.20150143  0.11264117  0.71891726  0.42415958  0.05271725
  0.34579719 -0.17641263  0.03348623  0.27939745 -0.19938884  0.2567224
  0.14372511  0.02932912  0.09160933  0.15599326]

approx error on V for Validation Data after updating V  [[0.23399029277961225, 0.3176322143187879, 0.9750657754550233, 0.8386835253310766, 0.2517734533343107], [0.23279130874330747, 0.2532403830116759, 0.3255196536083326, 0.3009350317044362, 2.9393271195314403], [0.32551965360833346, 0.975065775455022, 0.2888735743383605, 0.7886835253310979, 4.432259062249502], [0.25043448591242407, 0.23827803939343753, 0.25177345333431034, 0.3255196536083328, 2.7530619541388135], [0.2801670934847254, 0.29285042676224704, 0.3255196536083333, 0.7305313185785416, 0.28337915320427376]]

overlaps  [[0, 0, 0, 1, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1, 0, 0], [0, 0, 0, 0, 0, 0, 1, 0, 1], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 1, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222]

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [-5.71764858e-15 -9.60342916e-15 -4.44089210e-15 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.94606475e-01  2.61545522e-01  1.51662712e-01  6.40477201e-02
  2.76325687e-01  2.62140267e-01  6.62502007e-02  3.63582888e-01
 -1.19460594e-01  4.64381020e-02  1.85409382e-02  4.85509389e-02
  1.87485315e-01  2.32755619e-01  9.33665571e-02  1.48611690e-01
 -7.18839629e-02  2.64297012e-01  2.56953275e-01  2.92889556e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  1.11897868e-01  1.35558598e-01  1.70848022e-01  1.53970567e-01
  2.03748025e-01  1.48898814e-01  1.92303234e-01  1.89929565e-01
  1.43320385e-01  1.15468064e-01  1.34760828e-01  9.67047448e-02
  1.55213314e-01  1.92843440e-01  1.13805363e-01  1.26056127e-01
  7.87532901e-02  1.62900147e-01  1.49980791e-01  1.96710707e-01
  6.68848479e-02  1.69076763e-01  2.63261380e-01 -1.34212859e-01
 -1.43823703e-01  9.59279705e-02  1.14737274e-01  4.96591460e-01
 -2.18252586e-01 -2.93289775e-01  2.44839428e-01  3.10372718e-01
  1.55129046e-01  2.38384017e-01  5.27269979e-03  9.42964078e-01
  5.53093472e-02  1.27318774e-01  4.94251886e-01  1.36492500e-02
  1.99932140e-01  1.32380422e-01  9.02969476e-02  1.59180175e-01
  4.42580077e-02  2.15288882e-01  1.26389442e-01  1.27422268e-01
  1.61095309e-01  1.10956233e-01  2.01626655e-01  1.33624890e-01
  1.37982195e-01  1.93203762e-01  2.19491936e-01  1.42852423e-01
  2.34872603e-01  1.65588035e-02  2.03278438e-01  8.71290829e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02
  1.61649912e-01  2.95440858e-01  1.37755627e-01  1.31129624e-01
  1.72786645e-01  1.80805768e-01  2.62241486e-01  3.27874447e-01
  1.44536731e-01  2.33741542e-01  7.60294215e-03 -8.20285806e-02
  2.05080658e-02  7.18148304e-02 -1.11426149e-01  5.34113937e-02
  2.88166020e-01  1.90385902e-01  3.57708700e-01  1.37044433e-02
  2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01]

approx error on U on val data  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.23631374392237953, 0.24953571779089506, 0.27812678886179165, 0.231729063686265, 0.18605605800596603], [0.2359801995957806, 0.17454631451517882, 0.28965629609284577, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626528, 0.18521048307998722, 0.2418033779932797], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 0.27373550017628107], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.278126788861791, 0.23172906368626486, 0.1852247909918996, 0.23749716158177522, 0.23833652926205703], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 0.28016709348472546], [0.2359749362072568, 0.17454631451518027, 0.25208694637014994, 0.18579675075769767, 0.24953571779089426, 0.2317290636862655, 0.18521048307998772, 0.2418033779932817, 0.2383365292620568, 0.23279130874330228]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.30178257  0.2077346   0.32414433 -0.17171179  0.40158304  0.34495745
  0.2558376   0.47291217  0.25357882  0.28543644  0.2349977   0.01982751
  0.19669406  0.47885777 -0.0151618   0.31202035 -0.14873603  0.24792968
 -0.03238508 -0.08735458 -0.05653106  0.02158093  0.18286815  0.12250079
  0.29829939  0.29459577  0.17806963  0.04637235  0.22480426 -0.0127479
 -0.13475832  0.25039659  0.24759729  0.16160044  0.02867303  0.57580809
  0.22257379  0.06884512  0.2738885   0.01642497  0.34117488  0.42734282
  0.09605588  0.29089146  0.18764779  0.1431622  -0.19332824  0.47721753
  0.34247956  0.35895154  0.17282356  0.18488614  0.08990329  0.60758602
  0.03884765  0.26010542  0.25444516  0.49043454  0.43796135 -0.05618425
 -0.00630182  0.23497437  0.13828772  0.1341697   0.39923705 -0.04598166
  0.20446805  0.27883156  0.24350498  0.08163332  0.36885807  0.19675601
  0.14916785 -0.14253214  0.26702518  0.13308412  0.38134245  0.03937861
  0.50723299  0.41034508  0.1752889   0.21718581  0.21784     0.1338422
  0.07648416  0.21518768  0.25716559  0.42559352  0.24307837  0.07970533
  0.30988444 -0.35425808  0.11857624  0.27773432 -0.23988571  0.24540093
  0.1682742   0.08685204  0.2586056   0.09822901]

approx error on V on Val data  [[0.3005124651610089, 0.24023287340261276, 0.23399029277961225, 0.25178130044886216, 0.22510235601642412], [0.23279130874330747, 0.3255196536083326, 0.21561693928679454, 0.2281330790423822, 0.22882604559648972], [0.23279130874329965, 0.23322780833832102, 0.32551965360833346, 0.32449813576447795, 0.22510235601641848], [0.3255196536083328, 0.21561693928679473, 0.2363137439223792, 0.2529949237066617, 0.23827803939343753], [0.22813307904238087, 0.2746735597137788, 0.26428196002047094, 0.3255196536083333, 0.22330710034162826], [0.2731448354888367, 0.2313399246190735, 0.32551965360833324, 0.28631706390421147, 0.26699360191285193]]

predicting:   0%|          | 0/1 [00:00<?, ?it/s]
predicting: 100%|██████████| 1/1 [01:08<00:00, 68.58s/it]

Make new V by taking top v highest loss subsets from L \ U
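Annotation: the step above rebuilds V from the highest-loss subsets not already in U. A minimal sketch of that selection (the names `subset_losses`, `U_indices`, and `v` are assumptions for illustration):

```python
import numpy as np

def make_new_V(subset_losses, U_indices, v):
    """Pick the v subsets of L with the highest loss, excluding those
    already in U (a sketch of 'top v highest loss subsets from L \\ U')."""
    losses = np.asarray(subset_losses, dtype=float)
    excluded = set(U_indices)
    candidates = [i for i in range(len(losses)) if i not in excluded]
    # sort remaining candidates by loss, descending, keep the top v
    candidates.sort(key=lambda i: losses[i], reverse=True)
    return candidates[:v]
```
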

predicting:   0%|          | 0/5 [00:00<?, ?it/s]
predicting:  20%|██        | 1/5 [01:13<04:54, 73.71s/it]
predicting:  40%|████      | 2/5 [02:20<03:29, 69.90s/it]
predicting:  60%|██████    | 3/5 [03:23<02:12, 66.35s/it]
predicting:  80%|████████  | 4/5 [04:41<01:11, 71.16s/it]
predicting: 100%|██████████| 5/5 [05:47<00:00, 69.23s/it]

***********************************
S_worst_ind  3
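Annotation: `S_worst_ind` marks the subset to be replaced. One natural rule, assumed here rather than read from the code, is the argmax of the per-subset validation loss:

```python
import numpy as np

def worst_subset_index(per_subset_losses):
    """Index of the subset with the highest validation loss
    (one plausible way 'S_worst_ind' could be chosen)."""
    return int(np.argmax(np.asarray(per_subset_losses, dtype=float)))
```
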

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

AVG_LLM_loss_on_VAL_data  [0.15499999999999997, 0.15, 0.15499999999999997, 0.14499999999999996, 0.145, 0.13999999999999999, 0.13999999999999999]

MIN_LLM_loss_on_VAL_data  [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.2, 0.25, 0.2, 0.2, 0.15, 0.15]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.2, 0.15, 0.15, 0.15, 0.15], [0.15, 0.25, 0.15, 0.15, 0.25], [0.15, 0.15, 0.25, 0.25, 0.15], [0.25, 0.15, 0.2, 0.2, 0.2], [0.15, 0.2, 0.25, 0.25, 0.15], [0.2, 0.15, 0.25, 0.2, 0.15], [0.2, 0.15, 0.2, 0.2, 0.2]]

AVG_LLM_loss_on_VAL_data  [0.16, 0.19, 0.19, 0.2, 0.2, 0.19, 0.19]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.2, 0.25, 0.25, 0.25, 0.25, 0.25, 0.2]
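Annotation: the AVG_/MIN_/MAX_ lines are straightforward per-row statistics over the per-subset loss matrix printed above (e.g. the first row [0.2, 0.15, 0.15, 0.15, 0.15] gives AVG 0.16, MIN 0.15, MAX 0.2). A sketch of that reduction, with the input layout assumed to be one row per iteration:

```python
def loss_stats(loss_rows):
    """Per-iteration AVG/MIN/MAX over per-subset losses, matching the
    AVG_/MIN_/MAX_LLM_loss_on_VAL_data lines (row layout assumed)."""
    avg = [sum(row) / len(row) for row in loss_rows]
    mn = [min(row) for row in loss_rows]
    mx = [max(row) for row in loss_rows]
    return avg, mn, mx
```
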

************* Approximation error of Validation Data on U after updating U *************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.94606475e-01  2.61545522e-01  1.51662712e-01  6.40477201e-02
  2.76325687e-01  2.62140267e-01  6.62502007e-02  3.63582888e-01
 -1.19460594e-01  4.64381020e-02  1.85409382e-02  4.85509389e-02
  1.87485315e-01  2.32755619e-01  9.33665571e-02  1.48611690e-01
 -7.18839629e-02  2.64297012e-01  2.56953275e-01  2.92889556e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  1.11897868e-01  1.35558598e-01  1.70848022e-01  1.53970567e-01
  2.03748025e-01  1.48898814e-01  1.92303234e-01  1.89929565e-01
  1.43320385e-01  1.15468064e-01  1.34760828e-01  9.67047448e-02
  1.55213314e-01  1.92843440e-01  1.13805363e-01  1.26056127e-01
  7.87532901e-02  1.62900147e-01  1.49980791e-01  1.96710707e-01
  1.99932140e-01  1.32380422e-01  9.02969476e-02  1.59180175e-01
  4.42580077e-02  2.15288882e-01  1.26389442e-01  1.27422268e-01
  1.61095309e-01  1.10956233e-01  2.01626655e-01  1.33624890e-01
  1.37982195e-01  1.93203762e-01  2.19491936e-01  1.42852423e-01
  2.34872603e-01  1.65588035e-02  2.03278438e-01  8.71290829e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02
  1.61649912e-01  2.95440858e-01  1.37755627e-01  1.31129624e-01
  1.72786645e-01  1.80805768e-01  2.62241486e-01  3.27874447e-01
  1.44536731e-01  2.33741542e-01  7.60294215e-03 -8.20285806e-02
  2.05080658e-02  7.18148304e-02 -1.11426149e-01  5.34113937e-02
  2.88166020e-01  1.90385902e-01  3.57708700e-01  1.37044433e-02
  2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
 -3.60704628e-01 -3.57136467e-01 -3.41428082e-01 -3.14154430e-01
 -3.11386718e-01 -3.52869585e-01 -3.10009219e-01 -3.35426265e-01
 -2.67868424e-01 -3.29149116e-01 -2.92927942e-01 -3.26378194e-01
 -3.15073133e-01 -3.01249750e-01 -3.33390132e-01 -2.88188696e-01
 -3.37796061e-01 -2.83215774e-01 -3.14976469e-01 -2.80225837e-01]

approx error on U for Validation Data after updating U  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.24953571779089506, 0.27812678886179165, 0.23172906368626495, 0.18605605800596603, 3.4120111785147245], [0.2359801995957806, 0.17454631451517882, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626522, 0.18521048307998722, 0.24180337799327972, 2.4155314944857715], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 1.1389804813710553], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.231729063686265, 0.1852247909918996, 0.23749716158177522, 0.2383365292620571, 3.850722734294883], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 1.1265321190341384], [0.2359749362072568, 0.17454631451518027, 0.25208694637014994, 0.24953571779089426, 0.23172906368626553, 0.18521048307998772, 0.24180337799328164, 0.2383365292620568, 0.2327913087433023, 0.46767774609366325]]

************* Approximation error of Validation Data on V after updating V *************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.30178257  0.2077346   0.32414433 -0.17171179  0.40158304  0.34495745
  0.2558376   0.47291217  0.25357882  0.28543644  0.2349977   0.01982751
  0.19669406  0.47885777 -0.0151618   0.31202035 -0.14873603  0.24792968
 -0.03238508 -0.08735458 -0.00630182  0.23497437  0.13828772  0.1341697
  0.39923705 -0.04598166  0.20446805  0.27883156  0.24350498  0.08163332
  0.36885807  0.19675601  0.14916785 -0.14253214  0.26702518  0.13308412
  0.38134245  0.03937861  0.50723299  0.41034508  0.34117488  0.42734282
  0.09605588  0.29089146  0.18764779  0.1431622  -0.19332824  0.47721753
  0.34247956  0.35895154  0.17282356  0.18488614  0.08990329  0.60758602
  0.03884765  0.26010542  0.25444516  0.49043454  0.43796135 -0.05618425
  1.11152363  1.07202591  0.9793857   0.99959329  0.93456756  1.03885093
  0.97277164  1.04034898  0.86540629  1.05914044  0.89237209  1.0321424
  0.90443808  0.94266427  1.05598434  0.90180893  1.05534998  0.89774997
  0.9576291   0.77879302  1.80767819  1.86117301  1.96465711  1.76070578
  1.8253937   1.73859955  1.98490708  1.84844901  1.72535333  1.76588871
  1.6929883   1.77950107  1.8794641   1.82421178  1.79168982  1.66129766
  1.94495584  1.70089782  1.86712108  1.64079663]

approx error on V for Validation Data after updating V  [[0.23399029277961225, 0.3176322143187879, 0.9750657754550233, 0.8386835253310766, 0.2517734533343107], [0.23279130874330747, 0.2532403830116759, 0.3255196536083326, 0.3009350317044362, 2.9393271195314403], [0.32551965360833346, 0.975065775455022, 0.2888735743383605, 0.7886835253310979, 4.432259062249502], [0.25043448591242407, 0.23827803939343753, 0.25177345333431034, 0.3255196536083328, 2.7530619541388135], [0.2801670934847254, 0.29285042676224704, 0.3255196536083333, 0.7305313185785416, 0.28337915320427376], [0.3204360521311772, 0.2762407690224758, 0.3232414069233676, 0.7907449554635042, 1.603286478079267]]

overlaps  [[0, 0, 1, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 1], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 1, 0, 0, 0], [0, 0, 0, 0, 0, 1, 0, 1, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 1, 0, 0], [0, 0, 1, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.1111111111111111]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668, 0.08888888888888888]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222, 0.2222222222222222]
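Annotation: the overlap values are multiples of 1/9, consistent with subsets of 9 exemplars compared by the fraction they share. A sketch under that assumption (the normalisation by subset size is inferred from the printed values, not from the code):

```python
def pairwise_overlap(subset_a, subset_b):
    """Fraction of exemplars shared by two subsets, normalised by subset
    size -- consistent with the 1/9 multiples above when |subset| == 9."""
    shared = len(set(subset_a) & set(subset_b))
    return shared / len(subset_a)
```
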

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [-1.99562589e-14  1.32671651e-14  1.59872116e-14 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]
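Annotation: the printed shapes ((300, 1780) for `W_V_val`, length-300 `LLM_loss_on_U_V`, length-1780 `alpha`) are consistent with `alpha` being a least-squares solution of `W_V_val @ alpha ≈ LLM_loss_on_U_V`. A sketch under that assumption, with random stand-ins for the actual matrices:

```python
import numpy as np

# Assumption: alpha solves W_V_val @ alpha ~= LLM_loss_on_U_V in the
# least-squares sense, matching the printed shapes.
rng = np.random.default_rng(0)
W = rng.random((300, 1780))       # stand-in for W_V_val
loss = rng.integers(0, 2, 300)    # stand-in for the 0/1 LLM loss vector
alpha, *_ = np.linalg.lstsq(W, loss, rcond=None)
print(alpha.shape)
```

Since the system is underdetermined (more columns than rows), `lstsq` returns the minimum-norm solution, which would explain the many near-zero entries in the printed `alpha`.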

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.94591731e-01  2.61560157e-01  1.51661953e-01  6.40345465e-02
  2.76325135e-01  2.62136882e-01  6.62495283e-02  3.63565510e-01
 -1.19467639e-01  4.64048342e-02  1.85511030e-02  4.85333596e-02
  1.87508866e-01  2.32766458e-01  9.33684338e-02  1.48618944e-01
 -7.18743700e-02  2.64300307e-01  2.56986187e-01  2.92863849e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  1.05163238e-01  1.30752893e-01  1.66914738e-01  1.58458437e-01
  2.10286993e-01  1.39681302e-01  1.95391418e-01  1.83117226e-01
  1.49206771e-01  1.11355663e-01  1.38567163e-01  9.07140648e-02
  1.58524273e-01  2.01934211e-01  1.09717589e-01  1.30727300e-01
  7.38902386e-02  1.71259543e-01  1.53996398e-01  1.95320833e-01
  1.91553268e-01  1.36575285e-01  9.76845880e-02  1.57780123e-01
  5.51407073e-02  2.13932963e-01  1.26686910e-01  1.23756964e-01
  1.63395042e-01  1.04804915e-01  2.00383712e-01  1.33552089e-01
  1.40736344e-01  1.83518388e-01  2.14289925e-01  1.43723748e-01
  2.29341623e-01  2.36668683e-02  2.01305088e-01  9.66297977e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02
  1.36065995e-01  2.94296415e-01  1.46044355e-01  1.36119324e-01
  1.71544392e-01  2.01104157e-01  2.69113165e-01  2.97929958e-01
  1.67438503e-01  2.29002921e-01  8.62767988e-03 -8.22253166e-02
  2.96285613e-02  6.37406556e-02 -1.23816117e-01  7.12625019e-02
  2.96861336e-01  1.81680020e-01  3.72536443e-01 -1.16406185e-02
  2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
 -8.18964730e-03  2.23412999e-01  8.40408620e-02  6.66026354e-02
  2.01848981e-01  7.66348409e-02  3.52346536e-01  2.30140100e-01
 -3.49897168e-02  3.46138069e-01 -2.06833112e-01  1.60886773e-01
  9.90572821e-02 -1.22816796e-02  2.37940941e-01  3.36168545e-01
  3.69262779e-01  5.23195853e-02  3.83718621e-01 -8.17969926e-02]

approx error on U on val data  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.23631374392237953, 0.24953571779089506, 0.27812678886179165, 0.231729063686265, 0.18605605800596603], [0.2359801995957806, 0.17454631451517882, 0.28965629609284577, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626528, 0.18521048307998722, 0.2418033779932797], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 0.27373550017628107], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.278126788861791, 0.23172906368626486, 0.1852247909918996, 0.23749716158177522, 0.23833652926205703], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 0.28016709348472546], [0.2359749362072568, 0.17454631451518027, 0.25208694637014994, 0.18579675075769767, 0.24953571779089426, 0.2317290636862655, 0.18521048307998772, 0.2418033779932817, 0.2383365292620568, 0.23279130874330228], [0.2359725523240858, 0.17454631451517838, 0.2519649221737047, 0.2500443374202882, 0.2317290636862642, 0.18521048307998828, 0.2418033779932811, 0.24036103152164529, 0.23279130874330162, 0.2332278083383207]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.259662    0.04567583  0.08765717  0.05515772  0.28001329  0.30178664
  0.1463392   0.47548672  0.28816229  0.00769198  0.35052272 -0.02494032
  0.28798261  0.34241329 -0.09929804  0.30570091  0.03374878  0.31141017
  0.13154975  0.26461557  0.0227302   0.24707728  0.06306188  0.07726536
  0.12256577  0.09544146  0.17253713  0.2634071   0.14161013  0.09335936
  0.19201714  0.08434915  0.04640256  0.00762412  0.05869234  0.05408016
  0.35237209  0.13895577  0.40930624  0.27634713  0.2178318   0.45997265
  0.15138922  0.1469165   0.09380322  0.15916252 -0.0822101   0.2211085
  0.15744771  0.40596218  0.07920417  0.1415419   0.10291984  0.50191308
  0.09309766  0.29915005  0.25635701  0.26260415  0.37732573 -0.15893388
  0.01014737  0.37786044  0.37147432  0.2467703   0.14906098  0.32893415
  0.46417228  0.33897887  0.27707923  0.26593893 -0.08454528 -0.20853256
  0.02606396 -0.08297027  0.05086834  0.40982516  0.35507794 -0.15330854
  0.69482716  0.08981467 -0.04280563  0.25072471  0.16099535 -0.0275294
  0.06962058  0.29862325  0.57880482  0.66781206  0.19325178  0.30244826
  0.01198914 -0.13507077  0.07981793  0.36042357 -0.18134056  0.13804117
  0.31059073  0.0961869   0.56149882  0.24745475]

approx error on V on Val data  [[0.3005124651610089, 0.24023287340261276, 0.23399029277961225, 0.25178130044886216, 0.22510235601642412], [0.23279130874330747, 0.3255196536083326, 0.21561693928679454, 0.2281330790423822, 0.22882604559648972], [0.23279130874329965, 0.23322780833832102, 0.32551965360833346, 0.32449813576447795, 0.22510235601641848], [0.3255196536083328, 0.21561693928679473, 0.2363137439223792, 0.2529949237066617, 0.23827803939343753], [0.22813307904238087, 0.2746735597137788, 0.26428196002047094, 0.3255196536083333, 0.22330710034162826], [0.2731448354888367, 0.2313399246190735, 0.32551965360833324, 0.28631706390421147, 0.26699360191285193], [0.2857164443567357, 0.22328076862823512, 0.29122338458898833, 0.26780198579212855, 0.26297394789383066]]

predicting:   0%|          | 0/1 [00:00<?, ?it/s]
predicting: 100%|██████████| 1/1 [01:04<00:00, 64.86s/it]

Make new V by taking top v highest loss subsets from L \ U

predicting:   0%|          | 0/5 [00:00<?, ?it/s]
predicting:  20%|██        | 1/5 [01:14<04:56, 74.22s/it]
predicting:  40%|████      | 2/5 [02:36<03:56, 78.83s/it]
predicting:  60%|██████    | 3/5 [03:30<02:15, 67.63s/it]
predicting:  80%|████████  | 4/5 [04:35<01:06, 66.67s/it]
predicting: 100%|██████████| 5/5 [05:46<00:00, 68.18s/it]

***********************************
S_worst_ind  2

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

AVG_LLM_loss_on_VAL_data  [0.15499999999999997, 0.15, 0.15499999999999997, 0.14499999999999996, 0.145, 0.13999999999999999, 0.13999999999999999, 0.13999999999999999]

MIN_LLM_loss_on_VAL_data  [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.2, 0.25, 0.2, 0.2, 0.15, 0.15, 0.15]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.2, 0.15, 0.15, 0.15, 0.15], [0.15, 0.25, 0.15, 0.15, 0.25], [0.15, 0.15, 0.25, 0.25, 0.15], [0.25, 0.15, 0.2, 0.2, 0.2], [0.15, 0.2, 0.25, 0.25, 0.15], [0.2, 0.15, 0.25, 0.2, 0.15], [0.2, 0.15, 0.2, 0.2, 0.2], [0.25, 0.15, 0.2, 0.25, 0.15]]

AVG_LLM_loss_on_VAL_data  [0.16, 0.19, 0.19, 0.2, 0.2, 0.19, 0.19, 0.2]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15]

MAX_LLM_loss_on_VAL_data  [0.2, 0.25, 0.25, 0.25, 0.25, 0.25, 0.2, 0.25]

************* Approximation error of Validation Data on U after updating U *************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.94591731e-01  2.61560157e-01  1.51661953e-01  6.40345465e-02
  2.76325135e-01  2.62136882e-01  6.62495283e-02  3.63565510e-01
 -1.19467639e-01  4.64048342e-02  1.85511030e-02  4.85333596e-02
  1.87508866e-01  2.32766458e-01  9.33684338e-02  1.48618944e-01
 -7.18743700e-02  2.64300307e-01  2.56986187e-01  2.92863849e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  1.91553268e-01  1.36575285e-01  9.76845880e-02  1.57780123e-01
  5.51407073e-02  2.13932963e-01  1.26686910e-01  1.23756964e-01
  1.63395042e-01  1.04804915e-01  2.00383712e-01  1.33552089e-01
  1.40736344e-01  1.83518388e-01  2.14289925e-01  1.43723748e-01
  2.29341623e-01  2.36668683e-02  2.01305088e-01  9.66297977e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02
  1.36065995e-01  2.94296415e-01  1.46044355e-01  1.36119324e-01
  1.71544392e-01  2.01104157e-01  2.69113165e-01  2.97929958e-01
  1.67438503e-01  2.29002921e-01  8.62767988e-03 -8.22253166e-02
  2.96285613e-02  6.37406556e-02 -1.23816117e-01  7.12625019e-02
  2.96861336e-01  1.81680020e-01  3.72536443e-01 -1.16406185e-02
  2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
 -8.18964730e-03  2.23412999e-01  8.40408620e-02  6.66026354e-02
  2.01848981e-01  7.66348409e-02  3.52346536e-01  2.30140100e-01
 -3.49897168e-02  3.46138069e-01 -2.06833112e-01  1.60886773e-01
  9.90572821e-02 -1.22816796e-02  2.37940941e-01  3.36168545e-01
  3.69262779e-01  5.23195853e-02  3.83718621e-01 -8.17969926e-02
 -1.48772323e+00 -1.28213333e+00 -1.26140671e+00 -1.48657273e+00
 -1.32888452e+00 -1.28734655e+00 -1.25055180e+00 -1.38996915e+00
 -1.26522962e+00 -1.26927594e+00 -1.26238953e+00 -1.34570906e+00
 -1.33462401e+00 -1.22408999e+00 -1.32220737e+00 -1.12254509e+00
 -1.29642515e+00 -1.43927379e+00 -1.15575327e+00 -1.30811479e+00]

approx error on U for Validation Data after updating U  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.24953571779089506, 0.27812678886179165, 0.23172906368626495, 0.18605605800596603, 3.4120111785147245], [0.2359801995957806, 0.17454631451517882, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626522, 0.18521048307998722, 0.24180337799327972, 2.4155314944857715], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 1.1389804813710553], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.231729063686265, 0.1852247909918996, 0.23749716158177522, 0.2383365292620571, 3.850722734294883], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 1.1265321190341384], [0.2359749362072568, 0.17454631451518027, 0.25208694637014994, 0.24953571779089426, 0.23172906368626553, 0.18521048307998772, 0.24180337799328164, 0.2383365292620568, 0.2327913087433023, 0.46767774609366325], [0.2359725523240858, 0.17454631451517838, 0.2500443374202882, 0.23172906368626425, 0.18521048307998828, 0.2418033779932811, 0.24036103152164529, 0.2327913087433017, 0.2332278083383207, 1.456011281346867]]

************* Approximation error of Validation Data on V after updating V *************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.2178318   0.45997265  0.15138922  0.1469165   0.09380322  0.15916252
 -0.0822101   0.2211085   0.15744771  0.40596218  0.07920417  0.1415419
  0.10291984  0.50191308  0.09309766  0.29915005  0.25635701  0.26260415
  0.37732573 -0.15893388  0.01014737  0.37786044  0.37147432  0.2467703
  0.14906098  0.32893415  0.46417228  0.33897887  0.27707923  0.26593893
 -0.08454528 -0.20853256  0.02606396 -0.08297027  0.05086834  0.40982516
  0.35507794 -0.15330854  0.69482716  0.08981467  1.10853871  1.13400391
  1.13889491  1.03670305  1.01431902  1.0088266   1.04778899  1.06341765
  0.91183423  1.11350805  0.92913621  1.07172324  1.15827014  1.10138636
  1.07112127  1.00524496  1.11075922  0.99836394  1.00020976  0.79554069
 -0.04280563  0.25072471  0.16099535 -0.0275294   0.06962058  0.29862325
  0.57880482  0.66781206  0.19325178  0.30244826  0.01198914 -0.13507077
  0.07981793  0.36042357 -0.18134056  0.13804117  0.31059073  0.0961869
  0.56149882  0.24745475  2.07811848  2.12563612  2.02308935  2.200567
  2.17167072  2.13273193  2.05280593  2.17156061  2.08957019  2.07384118
  2.27568025  1.96837862  2.09636846  2.09428498  2.05395937  1.89072938
  2.29112856  2.04757081  2.14257756  1.93336804]

approx error on V for Validation Data after updating V  [[0.23399029277961225, 0.3176322143187879, 0.9750657754550233, 0.8386835253310766, 0.2517734533343107], [0.23279130874330747, 0.2532403830116759, 0.3255196536083326, 0.3009350317044362, 2.9393271195314403], [0.32551965360833346, 0.975065775455022, 0.2888735743383605, 0.7886835253310979, 4.432259062249502], [0.25043448591242407, 0.23827803939343753, 0.25177345333431034, 0.3255196536083328, 2.7530619541388135], [0.2801670934847254, 0.29285042676224704, 0.3255196536083333, 0.7305313185785416, 0.28337915320427376], [0.3204360521311772, 0.2762407690224758, 0.3232414069233676, 0.7907449554635042, 1.603286478079267], [0.3191125347781929, 0.2549494177169502, 0.840979544853828, 0.29687441265243936, 1.9456818784130252]]

overlaps  [[0, 1, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 1, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 1, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.0, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668, 0.08888888888888888, 0.06666666666666667]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222, 0.2222222222222222, 0.2222222222222222]
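The overlap values above are all multiples of 1/9, which suggests each candidate subset holds 9 exemplars and overlap is measured as the shared fraction between two subsets. A minimal sketch of that reading, with hypothetical names (`pairwise_overlap`, index-list subsets) that are assumptions, not the actual code:

```python
# Sketch: pairwise overlap between candidate subsets, assuming each subset is
# a list of k exemplar indices and overlap(A, B) = |A & B| / k.
def pairwise_overlap(subsets):
    k = len(subsets[0])
    stats = []
    for i, a in enumerate(subsets):
        # overlap of subset i with every other subset
        overlaps = [len(set(a) & set(b)) / k
                    for j, b in enumerate(subsets) if j != i]
        stats.append((sum(overlaps) / len(overlaps), min(overlaps), max(overlaps)))
    return stats  # (avg, min, max) overlap per subset

print(pairwise_overlap([[0, 1, 2], [2, 3, 4], [5, 6, 7]]))
```

Under this reading, AVG_overlap/MIN_overlap/MAX_overlap would be the three components collected per subset.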

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [-2.12330153e-14  9.65894031e-15  1.64313008e-14 ...  0.00000000e+00
  0.00000000e+00  0.00000000e+00]
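The alpha entries near machine epsilon (±1e-14) are consistent with a least-squares solve of `W_V_val @ alpha ≈ loss`. A hedged sketch of that interpretation on toy shapes; the matrix, loss vector, and fitting method here are assumptions:

```python
import numpy as np

# Sketch: fit per-training-example weights alpha so that W @ alpha reproduces
# the 0/1 LLM loss vector (shapes are toy stand-ins for (300, 1780) / (300,)).
rng = np.random.default_rng(0)
W = rng.random((6, 4))
loss = np.array([0.0, 1.0, 0.0, 1.0, 0.0, 0.0])

alpha, *_ = np.linalg.lstsq(W, loss, rcond=None)
approximation = W @ alpha  # analogue of the "approximation" vectors logged above
```

Near-zero alpha entries, as in the log, would then correspond to training examples that contribute nothing to the fit.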

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.94637710e-01  2.61514517e-01  1.51664320e-01  6.40756279e-02
  2.76326854e-01  2.62147438e-01  6.62516252e-02  3.63619701e-01
 -1.19445669e-01  4.65085785e-02  1.85194044e-02  4.85881799e-02
  1.87435424e-01  2.32732655e-01  9.33625813e-02  1.48596323e-01
 -7.19042853e-02  2.64290033e-01  2.56883552e-01  2.92944014e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  2.17682486e-01  1.23493751e-01  7.46464901e-02  1.62146135e-01
  2.12033893e-02  2.18161349e-01  1.25759267e-01  1.35187086e-01
  1.56223406e-01  1.23987585e-01  2.04259786e-01  1.33779117e-01
  1.32147626e-01  2.13721888e-01  2.30512214e-01  1.41006553e-01
  2.46589790e-01  1.50061680e-03  2.07458911e-01  6.70021511e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02
  1.09867238e-01  2.93124468e-01  1.54532281e-01  1.41228938e-01
  1.70272285e-01  2.21890362e-01  2.76149986e-01  2.67265835e-01
  1.90890658e-01  2.24150421e-01  9.67704440e-03 -8.24267807e-02
  3.89682433e-02  5.54724399e-02 -1.36503846e-01  8.95426131e-02
  3.05765620e-01  1.72764916e-01  3.87720530e-01 -3.75947804e-02
  2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
 -1.58904613e-02  2.19146072e-01  7.73234483e-02  7.03851745e-02
  2.04481197e-01  6.72933538e-02  3.58986724e-01  2.28395156e-01
 -3.05193582e-02  3.49541240e-01 -2.03378476e-01  1.57137180e-01
  9.85550806e-02 -8.14314649e-03  2.32877980e-01  3.37743024e-01
  3.70408064e-01  6.47419830e-02  3.87016141e-01 -9.02473108e-02
  2.36687348e-01  2.62498923e-01  1.40992562e-01  1.18117236e-01
  4.30786422e-02  4.42910218e-01  2.03406707e-01  4.33966579e-01
  1.13336995e-01  1.22960665e-01  1.35070333e-01  2.20201307e-02
 -2.78353973e-01  1.53385951e-01 -7.66627575e-02  4.22836193e-01
  7.10941409e-02 -2.52636088e-01  3.75802785e-01  2.57285650e-01]

approx error on U on val data  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.23631374392237953, 0.24953571779089506, 0.27812678886179165, 0.231729063686265, 0.18605605800596603], [0.2359801995957806, 0.17454631451517882, 0.28965629609284577, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626528, 0.18521048307998722, 0.2418033779932797], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 0.27373550017628107], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.278126788861791, 0.23172906368626486, 0.1852247909918996, 0.23749716158177522, 0.23833652926205703], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 0.28016709348472546], [0.2359749362072568, 0.17454631451518027, 0.25208694637014994, 0.18579675075769767, 0.24953571779089426, 0.2317290636862655, 0.18521048307998772, 0.2418033779932817, 0.2383365292620568, 0.23279130874330228], [0.2359725523240858, 0.17454631451517838, 0.2519649221737047, 0.2500443374202882, 0.2317290636862642, 0.18521048307998828, 0.2418033779932811, 0.24036103152164529, 0.23279130874330162, 0.2332278083383207], [0.23597998638034925, 0.17454631451517122, 0.24845822490153918, 0.2317290636862653, 0.18521048307998697, 0.2418033779932828, 0.24383756647630053, 0.23279130874330223, 0.23329509638775076, 0.23489463822875897]]
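The "approx error" rows above hover around 0.17-0.29 for a 0/1 loss vector whose mean is roughly 0.15, which is consistent with a mean absolute residual between the true LLM loss and its linear approximation, computed per subset. A minimal sketch under that assumption (the name and the exact error definition are guesses):

```python
import numpy as np

# Sketch: per-subset approximation error as the mean absolute gap between the
# 0/1 LLM loss and its real-valued approximation.  The definition is assumed.
def approx_error(llm_loss, approximation):
    llm_loss = np.asarray(llm_loss, dtype=float)
    approximation = np.asarray(approximation, dtype=float)
    return float(np.mean(np.abs(llm_loss - approximation)))

print(approx_error([0, 1, 0, 0], [0.1, 0.7, 0.2, 0.0]))
```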

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.34117488  0.42734282  0.09605588  0.29089146  0.18764779  0.1431622
 -0.19332824  0.47721753  0.34247956  0.35895154  0.17282356  0.18488614
  0.08990329  0.60758602  0.03884765  0.26010542  0.25444516  0.49043454
  0.43796135 -0.05618425  0.03166351  0.26414555  0.22135024  0.14621076
  0.10409631  0.32171246  0.41090232  0.30849574  0.30852786  0.0984897
 -0.01209993 -0.16683398  0.01057994 -0.09408899  0.01311367  0.37326327
  0.271206   -0.27649742  0.57318925  0.01248954  0.06331129  0.10572165
  0.12380029  0.18779745  0.38705589  0.26724899  0.15328324  0.13543949
  0.21011588  0.17447044  0.06731854  0.38668137  0.02978893  0.10233048
  0.26537684  0.42515781  0.37513867  0.03199323  0.41736248  0.12824468
 -0.02347617  0.28643874  0.29559056  0.08355111  0.10907579  0.2938082
  0.64441615  0.60000506  0.19214023  0.38635186 -0.0206543  -0.00954859
  0.19342657  0.39072292 -0.06916515  0.23288555  0.38520336  0.13157467
  0.58036772  0.29612978  0.2581109   0.30175009  0.30991764  0.09532458
  0.14673104  0.208457    0.30541767  0.47264231  0.08228682  0.16411027
  0.07070567 -0.22647056  0.1126128   0.23212696 -0.19981573  0.27925011
  0.06456311  0.07809708  0.23614422 -0.05546952]

approx error on V on Val data  [[0.3005124651610089, 0.24023287340261276, 0.23399029277961225, 0.25178130044886216, 0.22510235601642412], [0.23279130874330747, 0.3255196536083326, 0.21561693928679454, 0.2281330790423822, 0.22882604559648972], [0.23279130874329965, 0.23322780833832102, 0.32551965360833346, 0.32449813576447795, 0.22510235601641848], [0.3255196536083328, 0.21561693928679473, 0.2363137439223792, 0.2529949237066617, 0.23827803939343753], [0.22813307904238087, 0.2746735597137788, 0.26428196002047094, 0.3255196536083333, 0.22330710034162826], [0.2731448354888367, 0.2313399246190735, 0.32551965360833324, 0.28631706390421147, 0.26699360191285193], [0.2857164443567357, 0.22328076862823512, 0.29122338458898833, 0.26780198579212855, 0.26297394789383066], [0.3255196536083329, 0.22545299506733402, 0.2866388595737954, 0.3012694433198483, 0.24619654058787127]]

predicting: 100%|██████████| 1/1 [01:08<00:00, 68.61s/it]

Make new V by taking top v highest loss subsets from L \ U
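The step logged above ("Make new V by taking top v highest loss subsets from L \ U") can be sketched as a simple ranked selection. The function and argument names below are hypothetical; only the selection rule is taken from the log message:

```python
# Sketch: rebuild V from the candidate pool L, excluding subsets already in U,
# by keeping the v candidates with the highest validation loss.
def make_new_V(pool_losses, U_indices, v):
    excluded = set(U_indices)
    candidates = [(i, loss) for i, loss in enumerate(pool_losses)
                  if i not in excluded]
    candidates.sort(key=lambda t: t[1], reverse=True)  # highest loss first
    return [i for i, _ in candidates[:v]]

print(make_new_V([0.1, 0.9, 0.3, 0.7, 0.5], U_indices=[1], v=2))  # -> [3, 4]
```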

predicting: 100%|██████████| 5/5 [05:18<00:00, 63.69s/it]

***********************************
S_worst_ind  9
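`S_worst_ind` plausibly marks the subset in U with the highest validation loss, to be swapped out against a candidate from V. A one-line sketch of that reading (the argmax criterion is an assumption):

```python
# Sketch: index of the worst-performing subset, assuming "worst" means the
# highest per-subset average validation loss.
def worst_subset_index(avg_losses):
    return max(range(len(avg_losses)), key=lambda i: avg_losses[i])

print(worst_subset_index([0.15, 0.14, 0.30, 0.2]))  # -> 2
```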

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1]

AVG_LLM_loss_on_VAL_data  [0.15499999999999997, 0.15, 0.15499999999999997, 0.14499999999999996, 0.145, 0.13999999999999999, 0.13999999999999999, 0.13999999999999999, 0.155]

MIN_LLM_loss_on_VAL_data  [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.2, 0.25, 0.2, 0.2, 0.15, 0.15, 0.15, 0.3]
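The AVG/MIN/MAX triples above look like row-wise reductions over a loss matrix with one row per subset in U (and, in the block that follows, one column per subset in V) — note MAX 0.2 means at worst 4 of the 20 validation examples were misclassified. A minimal sketch under that assumed layout:

```python
# Sketch: per-row average/min/max over a loss matrix whose rows correspond to
# subsets and whose entries are validation loss rates.  Layout is assumed.
def rowwise_stats(loss_matrix):
    avg = [sum(row) / len(row) for row in loss_matrix]
    mn = [min(row) for row in loss_matrix]
    mx = [max(row) for row in loss_matrix]
    return avg, mn, mx

avg, mn, mx = rowwise_stats([[0.2, 0.1], [0.3, 0.1]])
```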

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.2, 0.15, 0.15, 0.15, 0.15], [0.15, 0.25, 0.15, 0.15, 0.25], [0.15, 0.15, 0.25, 0.25, 0.15], [0.25, 0.15, 0.2, 0.2, 0.2], [0.15, 0.2, 0.25, 0.25, 0.15], [0.2, 0.15, 0.25, 0.2, 0.15], [0.2, 0.15, 0.2, 0.2, 0.2], [0.25, 0.15, 0.2, 0.25, 0.15], [0.1, 0.3, 0.2, 0.2, 0.2]]

AVG_LLM_loss_on_VAL_data  [0.16, 0.19, 0.19, 0.2, 0.2, 0.19, 0.19, 0.2, 0.2]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.25, 0.25, 0.25, 0.25, 0.25, 0.2, 0.25, 0.3]

*************Approximation error of Validation Data on U after updating U************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1]

approximation 
 [ 2.94637710e-01  2.61514517e-01  1.51664320e-01  6.40756279e-02
  2.76326854e-01  2.62147438e-01  6.62516252e-02  3.63619701e-01
 -1.19445669e-01  4.65085785e-02  1.85194044e-02  4.85881799e-02
  1.87435424e-01  2.32732655e-01  9.33625813e-02  1.48596323e-01
 -7.19042853e-02  2.64290033e-01  2.56883552e-01  2.92944014e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  2.17682486e-01  1.23493751e-01  7.46464901e-02  1.62146135e-01
  2.12033893e-02  2.18161349e-01  1.25759267e-01  1.35187086e-01
  1.56223406e-01  1.23987585e-01  2.04259786e-01  1.33779117e-01
  1.32147626e-01  2.13721888e-01  2.30512214e-01  1.41006553e-01
  2.46589790e-01  1.50061680e-03  2.07458911e-01  6.70021511e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02
  1.09867238e-01  2.93124468e-01  1.54532281e-01  1.41228938e-01
  1.70272285e-01  2.21890362e-01  2.76149986e-01  2.67265835e-01
  1.90890658e-01  2.24150421e-01  9.67704440e-03 -8.24267807e-02
  3.89682433e-02  5.54724399e-02 -1.36503846e-01  8.95426131e-02
  3.05765620e-01  1.72764916e-01  3.87720530e-01 -3.75947804e-02
  2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
 -1.58904613e-02  2.19146072e-01  7.73234483e-02  7.03851745e-02
  2.04481197e-01  6.72933538e-02  3.58986724e-01  2.28395156e-01
 -3.05193582e-02  3.49541240e-01 -2.03378476e-01  1.57137180e-01
  9.85550806e-02 -8.14314649e-03  2.32877980e-01  3.37743024e-01
  3.70408064e-01  6.47419830e-02  3.87016141e-01 -9.02473108e-02
 -1.94093151e+00 -1.98546062e+00 -1.96480617e+00 -1.88240076e+00
 -1.84662599e+00 -1.82951823e+00 -1.85773032e+00 -1.90849081e+00
 -1.69596136e+00 -1.94607716e+00 -1.76644734e+00 -1.86582070e+00
 -2.00839593e+00 -1.93614345e+00 -1.88744460e+00 -1.76170998e+00
 -1.99944124e+00 -1.79405601e+00 -1.82123087e+00 -1.50850896e+00]

approx error on U for Validation Data after updating U  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.24953571779089506, 0.27812678886179165, 0.23172906368626495, 0.18605605800596603, 3.4120111785147245], [0.2359801995957806, 0.17454631451517882, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626522, 0.18521048307998722, 0.24180337799327972, 2.4155314944857715], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 1.1389804813710553], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.231729063686265, 0.1852247909918996, 0.23749716158177522, 0.2383365292620571, 3.850722734294883], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 1.1265321190341384], [0.2359749362072568, 0.17454631451518027, 0.25208694637014994, 0.24953571779089426, 0.23172906368626553, 0.18521048307998772, 0.24180337799328164, 0.2383365292620568, 0.2327913087433023, 0.46767774609366325], [0.2359725523240858, 0.17454631451517838, 0.2500443374202882, 0.23172906368626425, 0.18521048307998828, 0.2418033779932811, 0.24036103152164529, 0.2327913087433017, 0.2332278083383207, 1.456011281346867], [0.23597998638034925, 0.17454631451517122, 0.24845822490153918, 0.2317290636862653, 0.18521048307998697, 0.2418033779932828, 0.24383756647630053, 0.23279130874330223, 0.23329509638775076, 2.1603600993890644]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.23668735  0.26249892  0.14099256  0.11811724  0.04307864  0.44291022
  0.20340671  0.43396658  0.113337    0.12296067  0.13507033  0.02202013
 -0.27835397  0.15338595 -0.07666276  0.42283619  0.07109414 -0.25263609
  0.37580279  0.25728565  0.06331129  0.10572165  0.12380029  0.18779745
  0.38705589  0.26724899  0.15328324  0.13543949  0.21011588  0.17447044
  0.06731854  0.38668137  0.02978893  0.10233048  0.26537684  0.42515781
  0.37513867  0.03199323  0.41736248  0.12824468  0.34117488  0.42734282
  0.09605588  0.29089146  0.18764779  0.1431622  -0.19332824  0.47721753
  0.34247956  0.35895154  0.17282356  0.18488614  0.08990329  0.60758602
  0.03884765  0.26010542  0.25444516  0.49043454  0.43796135 -0.05618425
  2.36172018  2.41313898  2.53360377  2.2871292   2.37159818  2.24540936
  2.55222763  2.40603443  2.21696221  2.30456346  2.20307912  2.31268147
  2.42625102  2.3683775   2.34523592  2.13889439  2.51294487  2.21207489
  2.39882629  2.13634793 -0.02347617  0.28643874  0.29559056  0.08355111
  0.10907579  0.2938082   0.64441615  0.60000506  0.19214023  0.38635186
 -0.0206543  -0.00954859  0.19342657  0.39072292 -0.06916515  0.23288555
  0.38520336  0.13157467  0.58036772  0.29612978]

approx error on V for Validation Data after updating V  [[0.23399029277961225, 0.3176322143187879, 0.9750657754550233, 0.8386835253310766, 0.2517734533343107], [0.23279130874330747, 0.2532403830116759, 0.3255196536083326, 0.3009350317044362, 2.9393271195314403], [0.32551965360833346, 0.975065775455022, 0.2888735743383605, 0.7886835253310979, 4.432259062249502], [0.25043448591242407, 0.23827803939343753, 0.25177345333431034, 0.3255196536083328, 2.7530619541388135], [0.2801670934847254, 0.29285042676224704, 0.3255196536083333, 0.7305313185785416, 0.28337915320427376], [0.3204360521311772, 0.2762407690224758, 0.3232414069233676, 0.7907449554635042, 1.603286478079267], [0.3191125347781929, 0.2549494177169502, 0.840979544853828, 0.29687441265243936, 1.9456818784130252], [0.222474916776272, 0.3212083717908041, 0.32324140692336656, 2.1373550407327784, 0.2903417350929703]]

overlaps  [[0, 1, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 1, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 1, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.0, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668, 0.08888888888888888, 0.06666666666666667, 0.06666666666666667]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222, 0.2222222222222222, 0.2222222222222222, 0.2222222222222222]

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [5.10702591e-15 1.21569421e-14 4.44089210e-16 ... 0.00000000e+00
 0.00000000e+00 0.00000000e+00]

*************Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1]

approximation 
 [ 2.94606475e-01  2.61545522e-01  1.51662712e-01  6.40477201e-02
  2.76325687e-01  2.62140267e-01  6.62502007e-02  3.63582888e-01
 -1.19460594e-01  4.64381020e-02  1.85409382e-02  4.85509389e-02
  1.87485315e-01  2.32755619e-01  9.33665571e-02  1.48611690e-01
 -7.18839629e-02  2.64297012e-01  2.56953275e-01  2.92889556e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  1.99932140e-01  1.32380422e-01  9.02969476e-02  1.59180175e-01
  4.42580077e-02  2.15288882e-01  1.26389442e-01  1.27422268e-01
  1.61095309e-01  1.10956233e-01  2.01626655e-01  1.33624890e-01
  1.37982195e-01  1.93203762e-01  2.19491936e-01  1.42852423e-01
  2.34872603e-01  1.65588035e-02  2.03278438e-01  8.71290829e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02
  8.68406395e-02  2.92094422e-01  1.61992484e-01  1.45719877e-01
  1.69154205e-01  2.40159763e-01  2.82334785e-01  2.40314539e-01
  2.11503214e-01  2.19885463e-01  1.05993513e-02 -8.26038514e-02
  4.71770719e-02  4.82053440e-02 -1.47655337e-01  1.05609359e-01
  3.13591769e-01  1.64929257e-01  4.01066121e-01 -6.04063991e-02
  2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
 -1.58904613e-02  2.19146072e-01  7.73234483e-02  7.03851745e-02
  2.04481197e-01  6.72933538e-02  3.58986724e-01  2.28395156e-01
 -3.05193582e-02  3.49541240e-01 -2.03378476e-01  1.57137180e-01
  9.85550806e-02 -8.14314649e-03  2.32877980e-01  3.37743024e-01
  3.70408064e-01  6.47419830e-02  3.87016141e-01 -9.02473108e-02
  2.95027956e-01  4.74760115e-02  5.95384013e-02  1.44670914e-01
  5.32978118e-01  4.67318392e-01  2.18415174e-01  6.83687770e-01
  5.10346117e-01  3.65324210e-02  7.08931180e-01 -2.36594950e-02
  2.56780249e-01  3.72968477e-01  5.40114726e-03  3.02899266e-01
  1.67266316e-01  3.41799903e-01  2.38860644e-01  4.95118497e-01]

approx error on U on val data  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.23631374392237953, 0.24953571779089506, 0.27812678886179165, 0.231729063686265, 0.18605605800596603], [0.2359801995957806, 0.17454631451517882, 0.28965629609284577, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626528, 0.18521048307998722, 0.2418033779932797], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 0.27373550017628107], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.278126788861791, 0.23172906368626486, 0.1852247909918996, 0.23749716158177522, 0.23833652926205703], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 0.28016709348472546], [0.2359749362072568, 0.17454631451518027, 0.25208694637014994, 0.18579675075769767, 0.24953571779089426, 0.2317290636862655, 0.18521048307998772, 0.2418033779932817, 0.2383365292620568, 0.23279130874330228], [0.2359725523240858, 0.17454631451517838, 0.2519649221737047, 0.2500443374202882, 0.2317290636862642, 0.18521048307998828, 0.2418033779932811, 0.24036103152164529, 0.23279130874330162, 0.2332278083383207], [0.23597998638034925, 0.17454631451517122, 0.24845822490153918, 0.2317290636862653, 0.18521048307998697, 0.2418033779932828, 0.24383756647630053, 0.23279130874330223, 0.23329509638775076, 0.23489463822875897], [0.2359749362072569, 0.17454631451517943, 0.24953571779089279, 0.23172906368626509, 0.1852104830799876, 0.2418033779932828, 0.24689316079916757, 0.2327913087433004, 0.23329509638775003, 0.3328325452468892]]

*************Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 0.23415855  0.15419986  0.04664423  0.13454517  0.00985721  0.3181714
  0.12976751  0.3734738   0.04165943  0.06085903  0.07687661  0.01917279
 -0.25157241  0.07343584 -0.06594938  0.27778827 -0.01380503 -0.13784254
  0.21730053  0.25767989  0.05557521  0.28784423  0.24731422  0.29747335
  0.53350028  0.73941466  0.34589451  0.24934195  0.53199793  0.13962667
 -0.17612144  0.55715439  0.21669607  0.06921653  0.10819931  0.70274103
  0.33392581  0.1150315   0.59424107  0.13184625  0.2178318   0.45997265
  0.15138922  0.1469165   0.09380322  0.15916252 -0.0822101   0.2211085
  0.15744771  0.40596218  0.07920417  0.1415419   0.10291984  0.50191308
  0.09309766  0.29915005  0.25635701  0.26260415  0.37732573 -0.15893388
 -0.0127742   0.23284913  0.14382819  0.12529475  0.39347797 -0.04499912
  0.21542455  0.27947885  0.25202476  0.06865941  0.3680606   0.19134677
  0.15213241 -0.14348991  0.25898932  0.13409774  0.3892677   0.03415976
  0.52174964  0.41721824 -0.03729875  0.22694797  0.25702875  0.06512692
  0.08580685  0.25066532  0.53072375  0.44152177  0.19465498  0.26993586
 -0.0155665  -0.00519988  0.17494359  0.29656209 -0.08146758  0.21263455
  0.31551386  0.09647894  0.48640092  0.21122139]

approx error on V on Val data  [[0.3005124651610089, 0.24023287340261276, 0.23399029277961225, 0.25178130044886216, 0.22510235601642412], [0.23279130874330747, 0.3255196536083326, 0.21561693928679454, 0.2281330790423822, 0.22882604559648972], [0.23279130874329965, 0.23322780833832102, 0.32551965360833346, 0.32449813576447795, 0.22510235601641848], [0.3255196536083328, 0.21561693928679473, 0.2363137439223792, 0.2529949237066617, 0.23827803939343753], [0.22813307904238087, 0.2746735597137788, 0.26428196002047094, 0.3255196536083333, 0.22330710034162826], [0.2731448354888367, 0.2313399246190735, 0.32551965360833324, 0.28631706390421147, 0.26699360191285193], [0.2857164443567357, 0.22328076862823512, 0.29122338458898833, 0.26780198579212855, 0.26297394789383066], [0.3255196536083329, 0.22545299506733402, 0.2866388595737954, 0.3012694433198483, 0.24619654058787127], [0.17961176674021845, 0.3257712538291143, 0.29122338458898794, 0.286085730460546, 0.2730264128375919]]

predicting: 100%|██████████| 1/1 [01:00<00:00, 60.18s/it]

Make new V by taking top v highest loss subsets from L \ U
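The "Make new V" step above rebuilds the exploration set from the candidate pool L, excluding subsets already in U. A minimal sketch of that selection, assuming per-subset validation losses are available (the names `candidate_losses`, `U_indices`, and `v` are illustrative, not taken from the actual code):

```python
import numpy as np

def rebuild_V(candidate_losses, U_indices, v):
    """Pick the v subsets with the highest validation loss, drawn from
    the candidate pool L excluding the current selection U."""
    losses = np.asarray(candidate_losses, dtype=float)
    mask = np.ones(len(losses), dtype=bool)
    mask[list(U_indices)] = False          # exclude subsets already in U
    eligible = np.flatnonzero(mask)
    # sort eligible subsets by loss, descending, and keep the top v
    order = eligible[np.argsort(losses[eligible])[::-1]]
    return order[:v].tolist()

# toy example: 8 candidate subsets, U holds indices {1, 4}, keep top 3
print(rebuild_V([0.2, 0.9, 0.5, 0.7, 0.95, 0.1, 0.6, 0.3], {1, 4}, 3))
```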

predicting: 100%|██████████| 5/5 [05:25<00:00, 65.06s/it]

***********************************
S_worst_ind  9
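`S_worst_ind 9` names the subset of U to be swapped out. The exact criterion is not shown in the log; a plausible sketch, assuming the worst subset is the one whose column carries the largest mean approximation error across validation batches (consistent with index 9 holding the largest errors in the matrices below):

```python
import numpy as np

def worst_subset_index(approx_err):
    """Index of the subset column with the largest mean approximation
    error. This is a guess at how S_worst_ind is chosen; the real
    criterion in the code may differ."""
    err = np.asarray(approx_err, dtype=float)
    return int(err.mean(axis=0).argmax())

# toy 2-batch x 3-subset error matrix; the last column dominates
errors = [[0.20, 0.30, 3.4],
          [0.25, 0.28, 2.4]]
print(worst_subset_index(errors))
```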

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

AVG_LLM_loss_on_VAL_data  [0.15499999999999997, 0.15, 0.15499999999999997, 0.14499999999999996, 0.145, 0.13999999999999999, 0.13999999999999999, 0.13999999999999999, 0.155, 0.13999999999999999]

MIN_LLM_loss_on_VAL_data  [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.2, 0.25, 0.2, 0.2, 0.15, 0.15, 0.15, 0.3, 0.15]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.2, 0.15, 0.15, 0.15, 0.15], [0.15, 0.25, 0.15, 0.15, 0.25], [0.15, 0.15, 0.25, 0.25, 0.15], [0.25, 0.15, 0.2, 0.2, 0.2], [0.15, 0.2, 0.25, 0.25, 0.15], [0.2, 0.15, 0.25, 0.2, 0.15], [0.2, 0.15, 0.2, 0.2, 0.2], [0.25, 0.15, 0.2, 0.25, 0.15], [0.1, 0.3, 0.2, 0.2, 0.2], [0.2, 0.25, 0.2, 0.15, 0.1]]

AVG_LLM_loss_on_VAL_data  [0.16, 0.19, 0.19, 0.2, 0.2, 0.19, 0.19, 0.2, 0.2, 0.18]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.25, 0.25, 0.25, 0.25, 0.25, 0.2, 0.25, 0.3, 0.25]
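The AVG/MIN/MAX rows above are plain row-wise statistics of the printed loss matrix. For example, the first two rows of `LLM_loss_on_val` for V reproduce the first two entries of each summary:

```python
import numpy as np

# first two rows of the logged LLM_loss_on_val matrix for V
loss_on_V = np.array([[0.20, 0.15, 0.15, 0.15, 0.15],
                      [0.15, 0.25, 0.15, 0.15, 0.25]])
avg = loss_on_V.mean(axis=1)   # per-row average: ~0.16 and ~0.19
mn = loss_on_V.min(axis=1)     # per-row minimum: 0.15 and 0.15
mx = loss_on_V.max(axis=1)     # per-row maximum: 0.20 and 0.25
print(avg, mn, mx)
```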

************* Approximation error of Validation Data on U after updating U ************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.94606475e-01  2.61545522e-01  1.51662712e-01  6.40477201e-02
  2.76325687e-01  2.62140267e-01  6.62502007e-02  3.63582888e-01
 -1.19460594e-01  4.64381020e-02  1.85409382e-02  4.85509389e-02
  1.87485315e-01  2.32755619e-01  9.33665571e-02  1.48611690e-01
 -7.18839629e-02  2.64297012e-01  2.56953275e-01  2.92889556e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  1.99932140e-01  1.32380422e-01  9.02969476e-02  1.59180175e-01
  4.42580077e-02  2.15288882e-01  1.26389442e-01  1.27422268e-01
  1.61095309e-01  1.10956233e-01  2.01626655e-01  1.33624890e-01
  1.37982195e-01  1.93203762e-01  2.19491936e-01  1.42852423e-01
  2.34872603e-01  1.65588035e-02  2.03278438e-01  8.71290829e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02
  8.68406395e-02  2.92094422e-01  1.61992484e-01  1.45719877e-01
  1.69154205e-01  2.40159763e-01  2.82334785e-01  2.40314539e-01
  2.11503214e-01  2.19885463e-01  1.05993513e-02 -8.26038514e-02
  4.71770719e-02  4.82053440e-02 -1.47655337e-01  1.05609359e-01
  3.13591769e-01  1.64929257e-01  4.01066121e-01 -6.04063991e-02
  2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
 -1.58904613e-02  2.19146072e-01  7.73234483e-02  7.03851745e-02
  2.04481197e-01  6.72933538e-02  3.58986724e-01  2.28395156e-01
 -3.05193582e-02  3.49541240e-01 -2.03378476e-01  1.57137180e-01
  9.85550806e-02 -8.14314649e-03  2.32877980e-01  3.37743024e-01
  3.70408064e-01  6.47419830e-02  3.87016141e-01 -9.02473108e-02
 -6.01749371e-01 -5.95796748e-01 -5.69591066e-01 -5.24091503e-01
 -5.19474237e-01 -5.88678477e-01 -5.17176210e-01 -5.59578470e-01
 -4.46874377e-01 -5.49106550e-01 -4.88680186e-01 -5.44483929e-01
 -5.25624139e-01 -5.02563131e-01 -5.56181669e-01 -4.80773887e-01
 -5.63531908e-01 -4.72477758e-01 -5.25462879e-01 -4.67489763e-01]

approx error on U for Validation Data after updating U  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.24953571779089506, 0.27812678886179165, 0.23172906368626495, 0.18605605800596603, 3.4120111785147245], [0.2359801995957806, 0.17454631451517882, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626522, 0.18521048307998722, 0.24180337799327972, 2.4155314944857715], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 1.1389804813710553], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.231729063686265, 0.1852247909918996, 0.23749716158177522, 0.2383365292620571, 3.850722734294883], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 1.1265321190341384], [0.2359749362072568, 0.17454631451518027, 0.25208694637014994, 0.24953571779089426, 0.23172906368626553, 0.18521048307998772, 0.24180337799328164, 0.2383365292620568, 0.2327913087433023, 0.46767774609366325], [0.2359725523240858, 0.17454631451517838, 0.2500443374202882, 0.23172906368626425, 0.18521048307998828, 0.2418033779932811, 0.24036103152164529, 0.2327913087433017, 0.2332278083383207, 1.456011281346867], [0.23597998638034925, 0.17454631451517122, 0.24845822490153918, 0.2317290636862653, 0.18521048307998697, 0.2418033779932828, 0.24383756647630053, 0.23279130874330223, 0.23329509638775076, 2.1603600993890644], [0.2359749362072569, 0.17454631451517943, 0.24953571779089279, 0.23172906368626509, 0.1852104830799876, 0.2418033779932828, 0.24689316079916757, 0.2327913087433004, 0.23329509638775003, 0.679969312911836]]

************* Approximation error of Validation Data on V after updating V ************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0]

approximation 
 [-0.0127742   0.23284913  0.14382819  0.12529475  0.39347797 -0.04499912
  0.21542455  0.27947885  0.25202476  0.06865941  0.3680606   0.19134677
  0.15213241 -0.14348991  0.25898932  0.13409774  0.3892677   0.03415976
  0.52174964  0.41721824  0.29502796  0.04747601  0.0595384   0.14467091
  0.53297812  0.46731839  0.21841517  0.68368777  0.51034612  0.03653242
  0.70893118 -0.02365949  0.25678025  0.37296848  0.00540115  0.30289927
  0.16726632  0.3417999   0.23886064  0.4951185   0.05557521  0.28784423
  0.24731422  0.29747335  0.53350028  0.73941466  0.34589451  0.24934195
  0.53199793  0.13962667 -0.17612144  0.55715439  0.21669607  0.06921653
  0.10819931  0.70274103  0.33392581  0.1150315   0.59424107  0.13184625
  0.74291273  0.7543537   0.66684889  0.63126744  0.60531188  0.71988705
  0.66292434  0.69595846  0.55192229  0.74465389  0.57959178  0.69871813
  0.57376034  0.63937337  0.72838304  0.62367765  0.7341367   0.54114331
  0.67126988  0.45468343  3.87852792  3.96721318  3.77582346  4.10706157
  4.05313056  3.98045656  3.83128546  4.05292505  3.89990099  3.87054492
  4.24725031  3.67371327  3.91258905  3.90870051  3.8334382   3.52879144
  4.27608251  3.8215148   3.9988321   3.60837074]

approx error on V for Validation Data after updating V  [[0.23399029277961225, 0.3176322143187879, 0.9750657754550233, 0.8386835253310766, 0.2517734533343107], [0.23279130874330747, 0.2532403830116759, 0.3255196536083326, 0.3009350317044362, 2.9393271195314403], [0.32551965360833346, 0.975065775455022, 0.2888735743383605, 0.7886835253310979, 4.432259062249502], [0.25043448591242407, 0.23827803939343753, 0.25177345333431034, 0.3255196536083328, 2.7530619541388135], [0.2801670934847254, 0.29285042676224704, 0.3255196536083333, 0.7305313185785416, 0.28337915320427376], [0.3204360521311772, 0.2762407690224758, 0.3232414069233676, 0.7907449554635042, 1.603286478079267], [0.3191125347781929, 0.2549494177169502, 0.840979544853828, 0.29687441265243936, 1.9456818784130252], [0.222474916776272, 0.3212083717908041, 0.32324140692336656, 2.1373550407327784, 0.2903417350929703], [0.2837117035618166, 0.3067186096218296, 0.31367538672844225, 0.601948316266584, 3.811307629761837]]

overlaps  [[0, 1, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 1, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 1, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 1], [0, 0, 0, 0, 0, 0, 0, 0, 1]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668, 0.08888888888888888, 0.06666666666666667, 0.06666666666666667, 0.08888888888888888]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222, 0.2222222222222222, 0.2222222222222222, 0.2222222222222222, 0.2222222222222222]
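The overlap values above are all multiples of 1/9, consistent with exemplar subsets scored by the fraction of shared items for a subset size of nine. A hedged sketch of such a pairwise overlap (function and variable names are illustrative):

```python
def pairwise_overlap(subsets):
    """Fraction of shared exemplars between each subset and every other
    subset, normalized by subset size k (values come out as multiples
    of 1/k, matching the 1/9 steps in the log for k = 9)."""
    k = len(subsets[0])
    return [[len(set(a) & set(b)) / k for b in subsets if b is not a]
            for a in subsets]

# toy example with subsets of size 3: overlaps are multiples of 1/3
subs = [[1, 2, 3], [3, 4, 5], [6, 7, 8]]
print(pairwise_overlap(subs))
```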

 LLM_loss_on_U_V_len 300

 LLM_loss_on_U_V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0]

 W_V_val_shape  (300, 1780)

 W_V_val  [[0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 ...
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]
 [0. 0. 0. ... 0. 0. 0.]]

alpha shape  (1780,)

alpha  [2.27040609e-14 1.94289029e-15 8.43769499e-15 ... 0.00000000e+00
 0.00000000e+00 0.00000000e+00]

************* Approximation error of Validation Data on U ************

LLM Loss  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.94639028e-01  2.61513208e-01  1.51664388e-01  6.40768062e-02
  2.76326904e-01  2.62147741e-01  6.62516853e-02  3.63621255e-01
 -1.19445038e-01  4.65115540e-02  1.85184953e-02  4.85897522e-02
  1.87433318e-01  2.32731686e-01  9.33624135e-02  1.48595674e-01
 -7.19051433e-02  2.64289738e-01  2.56880608e-01  2.92946313e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  2.18431896e-01  1.23118561e-01  7.39857366e-02  1.62271357e-01
  2.02300365e-02  2.18282623e-01  1.25732661e-01  1.35514912e-01
  1.56017717e-01  1.24537761e-01  2.04370955e-01  1.33785628e-01
  1.31901294e-01  2.14588152e-01  2.30977484e-01  1.40928622e-01
  2.47084483e-01  8.64868734e-04  2.07635408e-01  6.61524035e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02
  1.61649912e-01  2.95440858e-01  1.37755627e-01  1.31129624e-01
  1.72786645e-01  1.80805768e-01  2.62241486e-01  3.27874447e-01
  1.44536731e-01  2.33741542e-01  7.60294215e-03 -8.20285806e-02
  2.05080658e-02  7.18148304e-02 -1.11426149e-01  5.34113937e-02
  2.88166020e-01  1.90385902e-01  3.57708700e-01  1.37044433e-02
  2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
 -8.18964730e-03  2.23412999e-01  8.40408620e-02  6.66026354e-02
  2.01848981e-01  7.66348409e-02  3.52346536e-01  2.30140100e-01
 -3.49897168e-02  3.46138069e-01 -2.06833112e-01  1.60886773e-01
  9.90572821e-02 -1.22816796e-02  2.37940941e-01  3.36168545e-01
  3.69262779e-01  5.23195853e-02  3.83718621e-01 -8.17969926e-02
  1.05163238e-01  1.30752893e-01  1.66914738e-01  1.58458437e-01
  2.10286993e-01  1.39681302e-01  1.95391418e-01  1.83117226e-01
  1.49206771e-01  1.11355663e-01  1.38567163e-01  9.07140648e-02
  1.58524273e-01  2.01934211e-01  1.09717589e-01  1.30727300e-01
  7.38902386e-02  1.71259543e-01  1.53996398e-01  1.95320833e-01]

approx error on U on val data  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.23631374392237953, 0.24953571779089506, 0.27812678886179165, 0.231729063686265, 0.18605605800596603], [0.2359801995957806, 0.17454631451517882, 0.28965629609284577, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626528, 0.18521048307998722, 0.2418033779932797], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 0.27373550017628107], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.278126788861791, 0.23172906368626486, 0.1852247909918996, 0.23749716158177522, 0.23833652926205703], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 0.28016709348472546], [0.2359749362072568, 0.17454631451518027, 0.25208694637014994, 0.18579675075769767, 0.24953571779089426, 0.2317290636862655, 0.18521048307998772, 0.2418033779932817, 0.2383365292620568, 0.23279130874330228], [0.2359725523240858, 0.17454631451517838, 0.2519649221737047, 0.2500443374202882, 0.2317290636862642, 0.18521048307998828, 0.2418033779932811, 0.24036103152164529, 0.23279130874330162, 0.2332278083383207], [0.23597998638034925, 0.17454631451517122, 0.24845822490153918, 0.2317290636862653, 0.18521048307998697, 0.2418033779932828, 0.24383756647630053, 0.23279130874330223, 0.23329509638775076, 0.23489463822875897], [0.2359749362072569, 0.17454631451517943, 0.24953571779089279, 0.23172906368626509, 0.1852104830799876, 0.2418033779932828, 0.24689316079916757, 0.2327913087433004, 0.23329509638775003, 0.3328325452468892], [0.2359801995957807, 0.1745463145151777, 0.24841273376533715, 0.23172906368626456, 0.1852104830799876, 0.24180337799328347, 0.23833652926205778, 0.2327913087433006, 0.2332278083383202, 0.2519649221737019]]

************* Approximation error of Validation Data on V ************

LLM Loss on V  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0]

approximation 
 [ 6.11976202e-03  1.94294965e-01 -1.83081481e-02  2.05438272e-02
  1.71945158e-01  1.24826039e-01  3.95425572e-01  5.28633404e-01
  3.36925489e-01 -1.59963961e-01  3.04150945e-01  1.86947686e-01
  3.31987057e-02 -1.00719394e-01  4.04673408e-03 -8.92355636e-04
  3.98989212e-01  1.55283131e-01  6.29807553e-01  6.73928251e-01
  2.59165607e-01  1.84475921e-01  2.06905789e-01  6.46562329e-02
  3.80164905e-01  3.78958351e-01  2.47963681e-01  4.22877763e-01
  3.76293171e-01  2.02050384e-01  4.00843094e-01  2.28426220e-02
  1.86277313e-01  4.08720368e-01 -1.29369414e-02  2.95791380e-01
  1.98996753e-02  3.52408903e-01  1.61483522e-01  1.87396600e-01
 -1.02149837e-01 -3.31989209e-03  1.93520169e-01  1.59160243e-01
  4.59697939e-01  3.73294716e-01  2.21980208e-01  5.30206248e-02
  3.30026857e-01 -2.67942976e-02 -7.56255993e-02  3.21481631e-01
  2.45081830e-01  2.07221484e-01  4.09508044e-02  7.96907154e-01
  3.27606961e-01  5.06010418e-02  4.10322468e-01  7.55151320e-02
  6.12385393e-02  2.32137165e-01  1.98688789e-01  1.84139459e-01
  1.16989540e-01  3.26926623e-01  3.91505268e-01  3.30080123e-01
  3.36053285e-01  6.67340547e-02 -2.46642291e-02 -1.50550206e-01
  1.57224091e-02 -1.18392278e-01  1.24946606e-02  3.60293611e-01
  2.44397853e-01 -2.51420642e-01  5.39348613e-01  6.03289277e-02
  1.33494746e-01  1.13550975e-01  1.05053010e-01  9.31156172e-02
  2.25478961e-01  1.22069924e-01  7.57616684e-02  4.31888081e-01
  3.06135471e-01 -3.64938413e-02  3.24430846e-01 -3.24638933e-01
  9.62493481e-03  2.02230237e-01 -2.57978722e-01  1.49662923e-01
  1.15418423e-01  4.83352025e-04  9.40258723e-02  1.28000580e-01]

approx error on V on Val data  [[0.3005124651610089, 0.24023287340261276, 0.23399029277961225, 0.25178130044886216, 0.22510235601642412], [0.23279130874330747, 0.3255196536083326, 0.21561693928679454, 0.2281330790423822, 0.22882604559648972], [0.23279130874329965, 0.23322780833832102, 0.32551965360833346, 0.32449813576447795, 0.22510235601641848], [0.3255196536083328, 0.21561693928679473, 0.2363137439223792, 0.2529949237066617, 0.23827803939343753], [0.22813307904238087, 0.2746735597137788, 0.26428196002047094, 0.3255196536083333, 0.22330710034162826], [0.2731448354888367, 0.2313399246190735, 0.32551965360833324, 0.28631706390421147, 0.26699360191285193], [0.2857164443567357, 0.22328076862823512, 0.29122338458898833, 0.26780198579212855, 0.26297394789383066], [0.3255196536083329, 0.22545299506733402, 0.2866388595737954, 0.3012694433198483, 0.24619654058787127], [0.17961176674021845, 0.3257712538291143, 0.29122338458898794, 0.286085730460546, 0.2730264128375919], [0.23901059381506645, 0.31911050955377307, 0.2517191259355331, 0.22813307904238028, 0.20432175555437765]]

predicting: 100%|██████████| 1/1 [00:57<00:00, 57.37s/it]

Make new V by taking top v highest loss subsets from L \ U

predicting: 100%|██████████| 5/5 [05:19<00:00, 63.83s/it]

***********************************
S_worst_ind  9

********* LLM LOSS ON U ON VALIDATION DATA *********

LLM_loss_on_val  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0]

AVG_LLM_loss_on_VAL_data  [0.15499999999999997, 0.15, 0.15499999999999997, 0.14499999999999996, 0.145, 0.13999999999999999, 0.13999999999999999, 0.13999999999999999, 0.155, 0.13999999999999999, 0.15]

MIN_LLM_loss_on_VAL_data  [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.2, 0.25, 0.2, 0.2, 0.15, 0.15, 0.15, 0.3, 0.15, 0.25]

********* LLM LOSS ON V FOR VALIDATION DATA *********

LLM_loss_on_val  [[0.2, 0.15, 0.15, 0.15, 0.15], [0.15, 0.25, 0.15, 0.15, 0.25], [0.15, 0.15, 0.25, 0.25, 0.15], [0.25, 0.15, 0.2, 0.2, 0.2], [0.15, 0.2, 0.25, 0.25, 0.15], [0.2, 0.15, 0.25, 0.2, 0.15], [0.2, 0.15, 0.2, 0.2, 0.2], [0.25, 0.15, 0.2, 0.25, 0.15], [0.1, 0.3, 0.2, 0.2, 0.2], [0.2, 0.25, 0.2, 0.15, 0.1], [0.1, 0.25, 0.15, 0.3, 0.25]]

AVG_LLM_loss_on_VAL_data  [0.16, 0.19, 0.19, 0.2, 0.2, 0.19, 0.19, 0.2, 0.2, 0.18, 0.21000000000000002]

MIN_LLM_loss_on_VAL_data  [0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.15, 0.1, 0.1, 0.1]

MAX_LLM_loss_on_VAL_data  [0.2, 0.25, 0.25, 0.25, 0.25, 0.25, 0.2, 0.25, 0.3, 0.25, 0.3]

************* Approximation error of Validation Data on U after updating U ************

Updated LLM Loss on U for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0]

approximation 
 [ 2.94639028e-01  2.61513208e-01  1.51664388e-01  6.40768062e-02
  2.76326904e-01  2.62147741e-01  6.62516853e-02  3.63621255e-01
 -1.19445038e-01  4.65115540e-02  1.85184953e-02  4.85897522e-02
  1.87433318e-01  2.32731686e-01  9.33624135e-02  1.48595674e-01
 -7.19051433e-02  2.64289738e-01  2.56880608e-01  2.92946313e-02
 -2.42590376e-02 -3.66931119e-02  3.75348144e-02 -4.57926123e-02
  1.75599097e-01  1.11119024e-01  1.24418644e-01  9.86343736e-02
  4.40813436e-02 -3.19091786e-03  1.07856934e-01  1.43786123e-01
  1.54234435e-01  2.21492957e-01  1.88581635e-01  2.47327771e-01
  2.29185049e-01  3.90396852e-02  1.13641251e-01  3.63817636e-02
  2.18431896e-01  1.23118561e-01  7.39857366e-02  1.62271357e-01
  2.02300365e-02  2.18282623e-01  1.25732661e-01  1.35514912e-01
  1.56017717e-01  1.24537761e-01  2.04370955e-01  1.33785628e-01
  1.31901294e-01  2.14588152e-01  2.30977484e-01  1.40928622e-01
  2.47084483e-01  8.64868734e-04  2.07635408e-01  6.61524035e-02
  3.18725490e-01  1.30117878e-01  3.89806022e-01  3.97117023e-01
  1.08572133e-01  1.53239477e-01  2.47044143e-02  3.79819665e-01
 -1.06687082e-01 -1.24612682e-01  1.40057905e-02  2.12500353e-01
  2.17238679e-01  1.40903347e-01 -6.27486213e-02  3.57447307e-01
  4.84023349e-02 -5.07771717e-03  1.95218580e-01  1.12607781e-01
 -3.25428574e-03  2.31268014e-01  2.85611812e-01 -2.00362652e-01
  1.41202070e-01 -4.22206954e-02  1.08485084e-01  1.29151875e-02
  8.52573686e-02  2.34457172e-01  2.15468373e-01  3.17033308e-02
 -1.72852966e-02  1.85554472e-01  7.62746756e-02  4.15389011e-01
  2.04636976e-01 -2.04611568e-01  2.29400926e-01 -6.84305639e-02
  1.72581330e-01  1.68512897e-01 -1.16605987e-01  1.44946921e-03
  3.24584521e-01  3.47627329e-02  2.28954908e-01  3.72599516e-01
  2.50663192e-01 -9.68936044e-03 -1.84734914e-02  1.36609025e-01
  2.40023918e-01  1.70617510e-01 -1.64125181e-01  1.65634242e-01
  1.78974018e-01  3.94728679e-01  2.81791918e-01  4.47370184e-02
  1.61649912e-01  2.95440858e-01  1.37755627e-01  1.31129624e-01
  1.72786645e-01  1.80805768e-01  2.62241486e-01  3.27874447e-01
  1.44536731e-01  2.33741542e-01  7.60294215e-03 -8.20285806e-02
  2.05080658e-02  7.18148304e-02 -1.11426149e-01  5.34113937e-02
  2.88166020e-01  1.90385902e-01  3.57708700e-01  1.37044433e-02
  2.12844989e-01  1.83192736e-01  1.35277795e-04  4.01236574e-02
  2.33508582e-01  2.41178540e-01  3.32741016e-01  3.45372927e-01
  7.10685492e-02  2.53728403e-01 -5.61827411e-02  1.80510149e-01
 -1.40396660e-01 -8.70491489e-02  1.04542216e-01  4.36931861e-01
  1.60661015e-01  6.22413400e-02  1.46327575e-01  2.24353519e-01
 -8.18964730e-03  2.23412999e-01  8.40408620e-02  6.66026354e-02
  2.01848981e-01  7.66348409e-02  3.52346536e-01  2.30140100e-01
 -3.49897168e-02  3.46138069e-01 -2.06833112e-01  1.60886773e-01
  9.90572821e-02 -1.22816796e-02  2.37940941e-01  3.36168545e-01
  3.69262779e-01  5.23195853e-02  3.83718621e-01 -8.17969926e-02
 -7.77402406e-01 -6.69972422e-01 -6.59141831e-01 -7.76801219e-01
 -6.94402025e-01 -6.72696566e-01 -6.53469651e-01 -7.26321495e-01
 -6.61139474e-01 -6.63253860e-01 -6.59655397e-01 -7.03193604e-01
 -6.97401167e-01 -6.39642164e-01 -6.90912915e-01 -5.86580377e-01
 -6.77440543e-01 -7.52085394e-01 -6.03933149e-01 -6.83548907e-01]

approx error on U for Validation Data after updating U  [[0.23597493620725704, 0.1745463145151919, 0.2896562960928456, 0.2515617647293288, 0.18579675075769647, 0.24953571779089506, 0.27812678886179165, 0.23172906368626495, 0.18605605800596603, 3.4120111785147245], [0.2359801995957806, 0.17454631451517882, 0.25156176472932923, 0.18579675075769692, 0.2484127337653371, 0.278126788861791, 0.23172906368626522, 0.18521048307998722, 0.24180337799327972, 2.4155314944857715], [0.235974936207257, 0.17454631451518524, 0.25196492217370514, 0.18579675075769822, 0.2495357177908925, 0.2781267888617943, 0.23172906368626495, 0.18521048307998517, 0.24180337799328488, 1.1389804813710553], [0.23599774115009317, 0.1745463145151795, 0.2520869463701499, 0.18579675075769758, 0.2561746953448757, 0.231729063686265, 0.1852247909918996, 0.23749716158177522, 0.2383365292620571, 3.850722734294883], [0.23598019959578034, 0.17454631451517924, 0.25208694637014933, 0.1857967507577004, 0.24841273376533665, 0.23172906368626375, 0.18522479099189537, 0.23749716158177395, 0.24132755390619084, 1.1265321190341384], [0.2359749362072568, 0.17454631451518027, 0.25208694637014994, 0.24953571779089426, 0.23172906368626553, 0.18521048307998772, 0.24180337799328164, 0.2383365292620568, 0.2327913087433023, 0.46767774609366325], [0.2359725523240858, 0.17454631451517838, 0.2500443374202882, 0.23172906368626425, 0.18521048307998828, 0.2418033779932811, 0.24036103152164529, 0.2327913087433017, 0.2332278083383207, 1.456011281346867], [0.23597998638034925, 0.17454631451517122, 0.24845822490153918, 0.2317290636862653, 0.18521048307998697, 0.2418033779932828, 0.24383756647630053, 0.23279130874330223, 0.23329509638775076, 2.1603600993890644], [0.2359749362072569, 0.17454631451517943, 0.24953571779089279, 0.23172906368626509, 0.1852104830799876, 0.2418033779932828, 0.24689316079916757, 0.2327913087433004, 0.23329509638775003, 0.679969312911836], [0.2359801995957807, 0.1745463145151777, 0.24841273376533715, 0.23172906368626456, 0.1852104830799876, 0.24180337799328347, 0.23833652926205778, 0.2327913087433006, 0.2332278083383202, 0.9324497283937646]]

*************Approximation error of Validation Data on V after updating V************

Updated LLM Loss on V for Validation Data  [0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0]
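The binary vector above is consistent with a simple 0/1 loss per validation question: 1 where the generated answer disagrees with the ground truth, 0 where it matches. A minimal sketch under that assumption (the `answers` and `gts` lists here are hypothetical, not taken from the run):

```python
# Hypothetical 0/1 LLM loss: 1 where the predicted answer
# differs from the ground-truth label, 0 where it matches.
def zero_one_loss(answers, gts):
    return [int(a.strip().lower() != g.strip().lower())
            for a, g in zip(answers, gts)]

print(zero_one_loss(["No", "No", "Yes"], ["No", "Yes", "Yes"]))  # [0, 1, 0]
```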

approximation 
 [ 1.05163238e-01  1.30752893e-01  1.66914738e-01  1.58458437e-01
  2.10286993e-01  1.39681302e-01  1.95391418e-01  1.83117226e-01
  1.49206771e-01  1.11355663e-01  1.38567163e-01  9.07140648e-02
  1.58524273e-01  2.01934211e-01  1.09717589e-01  1.30727300e-01
  7.38902386e-02  1.71259543e-01  1.53996398e-01  1.95320833e-01
 -1.02149837e-01 -3.31989209e-03  1.93520169e-01  1.59160243e-01
  4.59697939e-01  3.73294716e-01  2.21980208e-01  5.30206248e-02
  3.30026857e-01 -2.67942976e-02 -7.56255993e-02  3.21481631e-01
  2.45081830e-01  2.07221484e-01  4.09508044e-02  7.96907154e-01
  3.27606961e-01  5.06010418e-02  4.10322468e-01  7.55151320e-02
  6.11976202e-03  1.94294965e-01 -1.83081481e-02  2.05438272e-02
  1.71945158e-01  1.24826039e-01  3.95425572e-01  5.28633404e-01
  3.36925489e-01 -1.59963961e-01  3.04150945e-01  1.86947686e-01
  3.31987057e-02 -1.00719394e-01  4.04673408e-03 -8.92355636e-04
  3.98989212e-01  1.55283131e-01  6.29807553e-01  6.73928251e-01
  2.59165607e-01  1.84475921e-01  2.06905789e-01  6.46562329e-02
  3.80164905e-01  3.78958351e-01  2.47963681e-01  4.22877763e-01
  3.76293171e-01  2.02050384e-01  4.00843094e-01  2.28426220e-02
  1.86277313e-01  4.08720368e-01 -1.29369414e-02  2.95791380e-01
  1.98996753e-02  3.52408903e-01  1.61483522e-01  1.87396600e-01
  4.24512499e+00  4.35168732e+00  4.57949881e+00  4.12116991e+00
  4.27303523e+00  4.05600612e+00  4.61896980e+00  4.33163097e+00
  4.01340061e+00  4.14434611e+00  3.96670735e+00  4.16633678e+00
  4.38351087e+00  4.26852937e+00  4.21212381e+00  3.86877394e+00
  4.53847623e+00  3.98390109e+00  4.34286378e+00  3.84564481e+00]

approx error on V for Validation Data after updating V  [[0.23399029277961225, 0.3176322143187879, 0.9750657754550233, 0.8386835253310766, 0.2517734533343107], [0.23279130874330747, 0.2532403830116759, 0.3255196536083326, 0.3009350317044362, 2.9393271195314403], [0.32551965360833346, 0.975065775455022, 0.2888735743383605, 0.7886835253310979, 4.432259062249502], [0.25043448591242407, 0.23827803939343753, 0.25177345333431034, 0.3255196536083328, 2.7530619541388135], [0.2801670934847254, 0.29285042676224704, 0.3255196536083333, 0.7305313185785416, 0.28337915320427376], [0.3204360521311772, 0.2762407690224758, 0.3232414069233676, 0.7907449554635042, 1.603286478079267], [0.3191125347781929, 0.2549494177169502, 0.840979544853828, 0.29687441265243936, 1.9456818784130252], [0.222474916776272, 0.3212083717908041, 0.32324140692336656, 2.1373550407327784, 0.2903417350929703], [0.2837117035618166, 0.3067186096218296, 0.31367538672844225, 0.601948316266584, 3.811307629761837], [0.21736456196465426, 0.2643896543498108, 0.25640341890529456, 0.35296215732558667, 3.965586895301901]]

overlaps  [[0, 1, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [1, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 0, 1, 0, 1, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 1, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0]]
len overlaps  10

********* PAIRWISE OVERLAP *********

overlap_for_subset  [[0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.0, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.0, 0.0], [0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111], [0.1111111111111111, 0.0, 0.1111111111111111, 0.0, 0.1111111111111111, 0.2222222222222222, 0.0, 0.1111111111111111, 0.0, 0.0]]

AVG_overlap  [0.04444444444444444, 0.06666666666666668, 0.04444444444444444, 0.04444444444444444, 0.04444444444444444, 0.06666666666666668, 0.08888888888888888, 0.06666666666666667, 0.06666666666666667, 0.08888888888888888, 0.06666666666666667]
MIN_overlap  [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
MAX_overlap  [0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.1111111111111111, 0.2222222222222222, 0.2222222222222222, 0.2222222222222222, 0.2222222222222222, 0.2222222222222222, 0.2222222222222222]
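Every value in the overlap tables above is a multiple of 1/9, consistent with exemplar subsets of size 9 scored by the fraction of exemplars they share; AVG/MIN/MAX are then per-subset statistics over that row. A sketch under that assumption (the toy subsets below are hypothetical, size 3 instead of 9):

```python
import numpy as np

def pairwise_overlap(subsets):
    """For each subset, the fraction of exemplars it shares with every other subset."""
    k = len(subsets[0])  # subset size (9 in this run, 3 in the toy example)
    rows = []
    for i, a in enumerate(subsets):
        rows.append([len(set(a) & set(b)) / k
                     for j, b in enumerate(subsets) if j != i])
    return rows

subsets = [[0, 1, 2], [2, 3, 4], [4, 5, 0]]  # toy exemplar-index subsets
overlap = pairwise_overlap(subsets)
avg = [float(np.mean(r)) for r in overlap]
mn = [min(r) for r in overlap]
mx = [max(r) for r in overlap]
print(overlap, avg, mn, mx)
```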
while loop completed!



_____________Take the exemplar subset with the minimum validation loss and use it as the final exemplar set

predicting: 100%|██████████| 10/10 [11:05<00:00, 66.53s/it]


avg_err  [0.15, 0.05, 0.05, 0.15, 0.1, 0.2, 0.15, 0.15, 0.15, 0.25]


min ind  1
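`min ind  1` is the argmin of the per-subset average validation errors reported in `avg_err`; note the tie between indices 1 and 2 (both 0.05) is broken toward the first occurrence. A sketch of that selection step:

```python
import numpy as np

# Per-subset average validation error, as reported above.
avg_err = [0.15, 0.05, 0.05, 0.15, 0.1, 0.2, 0.15, 0.15, 0.15, 0.25]

# Pick the candidate exemplar subset with the lowest validation error.
# np.argmin returns the first index on ties, which is why index 1 wins
# over the equal value at index 2.
min_ind = int(np.argmin(avg_err))
print(min_ind)  # 1
```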

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  No
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  Yes
GT:  No

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  No
GT:  No

Answer:  No
GT:  No

Answer:  Yes
GT:  Yes

Answer:  Yes
GT:  No
EM: 0.7693877551020408
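The reported EM of 0.7693877551020408 equals 377/490, i.e., the fraction of the 490 test questions whose generated answer exactly matches the ground truth in the Answer/GT pairs above. A minimal sketch of that metric (the `answers`/`gts` lists passed in would be the run's predictions and labels; the two-element example is hypothetical):

```python
def exact_match(answers, gts):
    """Fraction of predictions that exactly match the ground truth."""
    assert len(answers) == len(gts)
    hits = sum(a.strip() == g.strip() for a, g in zip(answers, gts))
    return hits / len(answers)

print(exact_match(["Yes", "No"], ["Yes", "Yes"]))  # 0.5
print(377 / 490)  # 0.7693877551020408, the EM reported above
```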
                                              question answers
0    Does Rusev have to worry about human overpopul...      No
1      Was Eve involved in an incestuous relationship?      No
2      Does The Hague border multiple bodies of water?     Yes
3    Could casualties from deadliest war rival Fran...     Yes
4    Is letter C crucial to spelling the two most c...      No
..                                                 ...     ...
485        Is Dustin Hoffman one of the B'nei Yisrael?     Yes
486           Can you avoid internet trolls on reddit?      No
487     Did Moon Jae-in earn the Abitur as a teenager?      No
488  Did Tokyo Tower designers appreciate Stephen S...     Yes
489  Does Iphone have more iterations than Samsung ...     Yes

[490 rows x 2 columns]
