Keras initialize large embeddings layer with pretrained embeddings











I am trying to re-train a word2vec model in Keras 2 with the TensorFlow backend, using pretrained embeddings and a custom corpus.



This is how I initialize the embeddings layer with pretrained embeddings:



embedding = Embedding(vocab_size, embedding_dim,
                      input_length=1, name='embedding',
                      embeddings_initializer=lambda x: pretrained_embeddings)


where pretrained_embeddings is a big matrix of size vocab_size x embedding_dim.



This works as long as pretrained_embeddings is not too big.



Unfortunately, in my case it is too big: vocab_size=2270872 and embedding_dim=300.



Upon initializing the Embeddings layer I get the error:



Cannot create a tensor proto whose content is larger than 2GB.



The error comes from the function add_weight() in
/opt/r/anaconda3/lib/python3.6/site-packages/keras/engine/base_layer.py, more specifically the following line:



weight = K.variable(initializer(shape),
                    dtype=dtype,
                    name=name,
                    constraint=constraint)


initializer is the lambda function from above, which returns the big matrix. shape is (2270872, 300) as already mentioned.
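For context, a quick back-of-the-envelope check (assuming float32 weights, i.e. 4 bytes per value) shows why a matrix of this shape trips TensorFlow's 2 GB protobuf limit once it is baked into the graph as a constant:

import numpy as np

vocab_size, embedding_dim = 2270872, 300
size_bytes = vocab_size * embedding_dim * np.dtype('float32').itemsize
print(size_bytes / 1024**3)  # ~2.54 GiB, comfortably above the 2 GB limit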



Is it possible to solve this issue without having to drop down to low-level TensorFlow programming? If I switch to Theano as the backend the code runs fine, but I'd like to use TensorFlow for its better long-term prospects.



The only similar Stack Overflow question I found was this one, which proposes placeholder variables, but I am not sure how to apply them at the Keras level.



Thanks a lot



Edit:
I am more than willing to work around this issue at the level of the TensorFlow backend. It's just that I don't know how to combine TensorFlow and Keras code in the same application in this case. Most examples are either one or the other, not both.



For example, what use are TensorFlow placeholder variables when the initialization of the Embedding layer in Keras will inevitably invoke the add_weight() function, which causes the issue?
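For reference, the placeholder trick from the linked question looks roughly like this in raw TensorFlow 1.x. This is only a sketch of the low-level workaround; wiring it into Keras's add_weight() path is exactly the part I don't know how to do:

import tensorflow as tf

# Feed the big matrix at initialization time instead of serializing it
# into the graph definition as a constant.
embedding_ph = tf.placeholder(tf.float32, shape=(vocab_size, embedding_dim))
embedding_var = tf.Variable(embedding_ph)

with tf.Session() as sess:
    sess.run(embedding_var.initializer,
             feed_dict={embedding_ph: pretrained_embeddings})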



Solution:



As hinted at in @blue-phoenox's comment, I rewrote the code like this:



embedding = Embedding(vocab_size, embedding_dim,
                      input_length=1,
                      name='embedding')
embedding.build(input_shape=(1,))  # the input_shape here has no effect in the build function
embedding.set_weights([pretrained_embeddings])


That did it. Thanks again, @blue-phoenox.










tensorflow keras

asked Nov 21 at 17:23 · edited Nov 22 at 15:37 · Pavlin Mavrodiev

  • Probably this helps: stackoverflow.com/questions/35394103/…
    – Digital-Thinking
    Nov 21 at 21:39










  • Actually, this is the link I also referred to at the end of the question. Unfortunately I do not know how to make use of it in my case.
    – Pavlin Mavrodiev
    Nov 22 at 8:34










  • OK, sorry, I missed that link.
    – Digital-Thinking
    Nov 22 at 8:53






  • What about just setting the weights instead of initializing? stackoverflow.com/questions/51819213/…
    – blue-phoenox
    Nov 22 at 12:49






  • @blue-phoenox Thanks, that did it. Could you post your reply as a separate answer so that I can accept it as the best answer?
    – Pavlin Mavrodiev
    Nov 22 at 15:35















1 Answer

Accepted answer (score 2) · answered Nov 22 at 16:11 by blue-phoenox, edited Nov 23 at 19:23

Instead of using the embeddings_initializer argument of the Embedding layer, you can load pre-trained weights for your embedding layer using the weights argument. This way you should be able to hand over pre-trained embeddings larger than 2 GB.



Here is a short example:



from keras.layers import Embedding

embedding_layer = Embedding(vocab_size,
                            EMBEDDING_DIM,
                            weights=[embedding_matrix],
                            input_length=MAX_SEQUENCE_LENGTH,
                            trainable=False)


Where embedding_matrix is just a regular numpy matrix containing your weights.
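For completeness, here is a minimal sketch of how such an embedding_matrix is commonly built from a text file of pretrained vectors (the file name and the word_index dict are placeholders, not part of the original answer):

import numpy as np

# Assumptions (not from the original answer): 'pretrained_vectors.txt' holds one
# word followed by its vector per line, and word_index maps words to ids < vocab_size.
embeddings_index = {}
with open('pretrained_vectors.txt', encoding='utf-8') as f:
    for line in f:
        parts = line.rstrip().split(' ')
        embeddings_index[parts[0]] = np.asarray(parts[1:], dtype='float32')

embedding_matrix = np.zeros((vocab_size, EMBEDDING_DIM), dtype='float32')
for word, i in word_index.items():
    vector = embeddings_index.get(word)
    if vector is not None:
        embedding_matrix[i] = vector   # words missing from the file keep a zero row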



For further examples you can also take a look here:
https://blog.keras.io/using-pre-trained-word-embeddings-in-a-keras-model.html





Edit:



As @PavlinMavrodiev (see the end of the question) correctly pointed out, the weights argument is deprecated. He used the layer method set_weights instead:





  • layer.set_weights(weights): sets the weights of the layer from a list
    of Numpy arrays (with the same shapes as the output of get_weights).




To retrieve the trained weights, get_weights can be used:





  • layer.get_weights(): returns the weights of the layer as a list of
    Numpy arrays.




Both are methods of the Keras Layer base class and can be used with all Keras layers, including the Embedding layer.
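Putting this together with the approach from the end of the question, a minimal sketch (same variable names as in the question) looks like this:

from keras.layers import Embedding

embedding = Embedding(vocab_size, embedding_dim, input_length=1, name='embedding')
embedding.build(input_shape=(1,))               # create the weight variable first
embedding.set_weights([pretrained_embeddings])  # then push the pretrained matrix in
assert embedding.get_weights()[0].shape == (vocab_size, embedding_dim)

Because set_weights assigns the values at run time instead of serializing them into the graph definition as a constant, it should avoid the 2 GB protobuf limit that the initializer-based approach runs into.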







  • I am accepting this as the solution to the question, although my implementation is slightly different; I've put the latter as an edit in the original question. The reason is that the weights argument to the Embedding layer appears to be outdated: although it works at the moment, it is not mentioned in the latest Keras 2 documentation. I believe I've implemented a more future-proof version.
    – Pavlin Mavrodiev
    Nov 23 at 17:42












  • @PavlinMavrodiev thanks for sharing and the reply!
    – blue-phoenox
    Nov 23 at 17:59










